CN116501210A - Display method, electronic equipment and storage medium - Google Patents

Display method, electronic equipment and storage medium

Info

Publication number
CN116501210A
CN116501210A (application CN202310382648.9A)
Authority
CN
China
Prior art keywords
dynamic effect
interface
electronic device
application
playing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310382648.9A
Other languages
Chinese (zh)
Inventor
金东洙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310382648.9A priority Critical patent/CN116501210A/en
Publication of CN116501210A publication Critical patent/CN116501210A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/363Graphics controllers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Abstract

The present application provides a display method, an electronic device, and a storage medium, relates to the field of terminal technologies, and can solve the problem of stuttering that occurs when an electronic device switches screen refresh rates. The method is applied to an electronic device that supports a first refresh rate and a second refresh rate, and comprises: the electronic device displays a first interface on the display screen at the first refresh rate; the electronic device receives a first operation from a user; in response to the first operation, the electronic device starts playing a dynamic effect; when the electronic device detects that the dynamic effect has finished playing, it switches to displaying a second interface on the display screen at the second refresh rate. The dynamic effect indicates the picture displayed while the electronic device switches from the first interface to the second interface.

Description

Display method, electronic equipment and storage medium
The present application is a divisional application; the application number of the original application is 202210023829.8, the filing date of the original application is January 10, 2022, and the entire content of the original application is incorporated herein by reference.
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a display method, an electronic device, and a storage medium.
Background
Currently, the display screen of an electronic device is often a low-temperature polycrystalline oxide (LTPO) display screen. An LTPO display screen adds a layer of oxide to the substrate of an organic light-emitting diode (OLED) display screen, which reduces the energy required to excite the pixels and can therefore reduce the power consumption of the electronic device during display.
Among other advantages, LTPO display screens can support a variety of screen refresh rates, for example from 144 hertz (Hz) down to 1 Hz. Electronic devices with LTPO display screens may use different screen refresh rates when running different applications. For example, when the electronic device runs a video application, the screen refresh rate may be 60 Hz, while when the electronic device displays the desktop, the screen refresh rate may be 120 Hz. Consequently, starting a different application may involve switching the screen refresh rate; for example, when the electronic device starts a video application from the desktop, the screen refresh rate may be switched from 120 Hz to 60 Hz. However, when the electronic device switches the screen refresh rate, visible stuttering can occur.
Disclosure of Invention
The embodiments of the present application provide a display method, an electronic device, and a storage medium, which can solve the problem of stuttering that occurs when an electronic device switches the screen refresh rate.
To achieve the above purpose, the embodiments of the present application adopt the following technical solutions.
In a first aspect, a display method is provided, the method being applied to an electronic device that supports a first refresh rate and a second refresh rate, the method comprising: the electronic device displays a first interface on the display screen at the first refresh rate; the electronic device receives a first operation from a user; in response to the first operation, the electronic device starts playing a dynamic effect; when the electronic device detects that the dynamic effect has finished playing, it switches to displaying a second interface on the display screen at the second refresh rate; the dynamic effect indicates the picture displayed while the electronic device switches from the first interface to the second interface.
Based on the first aspect, the electronic device displays a first interface on the display screen at the first refresh rate; when the electronic device receives a first operation from the user, it starts playing the dynamic effect in response to that operation; when the electronic device detects that the dynamic effect has finished playing, it switches to displaying a second interface on the display screen at the second refresh rate. Because the dynamic effect indicates the picture displayed while the electronic device switches from the first interface to the second interface, the electronic device switches from the first refresh rate to the second refresh rate and displays the second interface only after detecting that the dynamic effect has finished playing, thereby avoiding the stuttering that would be caused by switching the refresh rate while the dynamic effect is still playing.
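The ordering claimed above (display at the first rate, play the transition animation, and only then switch the rate together with the interface) can be sketched as a small state machine. The following Python model is purely illustrative and is not the patent's implementation; all names are hypothetical.

```python
class Device:
    """Toy model of the claimed flow: the refresh rate is switched only
    after the transition animation (the "dynamic effect") finishes."""

    def __init__(self, first_rate=120, second_rate=60):
        self.first_rate = first_rate
        self.second_rate = second_rate
        self.refresh_rate = first_rate   # first interface shown at rate 1
        self.interface = "first"
        self.animation_playing = False

    def on_first_operation(self):
        # In response to the user's first operation, start the animation,
        # but do NOT switch the refresh rate yet.
        self.animation_playing = True

    def on_animation_end(self):
        # Only once the animation has finished: switch rate and interface
        # together, so no animation frame is rendered mid-switch.
        self.animation_playing = False
        self.refresh_rate = self.second_rate
        self.interface = "second"

d = Device()
d.on_first_operation()
assert d.refresh_rate == 120                 # unchanged during animation
d.on_animation_end()
assert (d.refresh_rate, d.interface) == (60, "second")
```

The key design point is that `on_animation_end` is the only place the rate changes, which is exactly what prevents a mid-animation switch.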
In one possible design of the first aspect, the dynamic effect includes N consecutive frames, where N is greater than or equal to 1; the electronic device starting to play the dynamic effect comprises: the electronic device starts playing the dynamic effect according to dynamic effect attributes; the dynamic effect attributes include at least one of dynamic effect content, dynamic effect size, dynamic effect duration, or a dynamic effect start position and a dynamic effect end position; the start position indicates the position of the first of the N frames on the display screen, and the end position indicates the position of the last of the N frames on the display screen.
In this design, the electronic device starts playing the dynamic effect according to the dynamic effect attributes. Because the attributes include at least one of dynamic effect content, dynamic effect size, dynamic effect duration, or a start position and an end position, where the start position indicates the position of the first of the N frames on the display screen and the end position indicates the position of the last of the N frames, the playback of the dynamic effect by the electronic device can be improved.
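The dynamic effect attributes listed above (content, size, duration, start position, end position) can be grouped into a single record. This is a hypothetical Python sketch; the field names and values are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class EffectAttributes:
    """Attributes of a dynamic effect of N consecutive frames (N >= 1).
    start_pos / end_pos locate the first and last frame on the screen."""
    content: str        # e.g. an app-launch zoom animation (assumed label)
    size: tuple         # (width, height) of the animation surface
    duration_ms: int    # total playback duration in milliseconds
    start_pos: tuple    # position of frame 1 on the display
    end_pos: tuple      # position of frame N on the display

attrs = EffectAttributes("launch-zoom", (300, 300), 350, (40, 80), (0, 0))
assert attrs.duration_ms == 350
```

A playing component would read these attributes to drive the animation from `start_pos` to `end_pos` over `duration_ms`.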
In one possible design of the first aspect, the dynamic effect includes N consecutive frames, where N is greater than or equal to 1; the electronic device detecting that the dynamic effect has finished playing comprises: when the electronic device plays the Mth frame, if the distance between the position of the Mth frame on the display screen and a target position is smaller than a preset value, the electronic device detects that the dynamic effect has finished playing, where 1 ≤ M ≤ N; the target position indicates the position of the last of the N frames on the display screen.
In this design, when the electronic device plays the Mth frame, if the distance between the position of the Mth frame on the display screen and the target position is smaller than a preset value, the electronic device detects that the dynamic effect has finished playing. Because the target position indicates the position of the last of the N frames on the display screen, the electronic device detects the end of the dynamic effect as it reaches the last of the N frames, which reduces the power consumption of the device.
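The position-based end condition above can be written as a simple predicate. A minimal sketch, assuming a Euclidean distance metric and an arbitrary threshold (the patent does not specify either):

```python
import math

def effect_finished(frame_pos, target_pos, threshold):
    """The dynamic effect is treated as finished when the current (Mth)
    frame is within `threshold` of the target position, i.e. the position
    of the last (Nth) frame on the display screen."""
    dx = frame_pos[0] - target_pos[0]
    dy = frame_pos[1] - target_pos[1]
    return math.hypot(dx, dy) < threshold

assert not effect_finished((100, 100), (0, 0), 5)   # far from target
assert effect_finished((3, 0), (0, 0), 5)           # within threshold
```

Checking proximity rather than exact equality makes the condition robust to sub-pixel interpolation in the animation's final frames.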
In one possible design of the first aspect, the electronic device detecting that the dynamic effect has finished playing comprises: when the duration for which the electronic device has played the dynamic effect reaches a preset duration, the electronic device detects that the dynamic effect has finished playing.
In this design, when the duration for which the electronic device has played the dynamic effect reaches the preset duration, the electronic device detects that the dynamic effect has finished playing, which reduces the power consumption of the device.
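The duration-based end condition is the time analogue of the position check. A one-line sketch (the comparison direction is an assumption consistent with "meets the preset duration"):

```python
def effect_finished_by_time(elapsed_ms, preset_ms):
    """Alternative end condition: the dynamic effect has played for at
    least the preset duration."""
    return elapsed_ms >= preset_ms

assert not effect_finished_by_time(200, 350)   # still playing
assert effect_finished_by_time(350, 350)       # preset duration reached
```

An implementation could use either condition alone or whichever fires first.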
In one possible design of the first aspect, the electronic device includes a target application, and the second interface is an interface of the target application; after the electronic device starts playing the dynamic effect, the method further comprises: the electronic device acquires first information, the first information including the application package name of the target application; the electronic device determines the second refresh rate according to the application package name of the target application and a preset refresh rate switching rule; the preset refresh rate switching rule indicates a mapping between application package names and display refresh rates.
In this design, after the electronic device starts playing the dynamic effect, it can determine the second refresh rate according to the application package name of the target application and the preset refresh rate switching rule. Because the rule indicates a mapping between application package names and display refresh rates, the power consumption of the device is reduced.
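The preset switching rule described above is, in essence, a lookup table from package name to refresh rate. A hypothetical sketch (package names, rates, and the fallback default are all invented for illustration):

```python
# Hypothetical refresh-rate switching rule: a mapping from application
# package name to the refresh rate the display should use.
SWITCH_RULES = {
    "com.example.video": 60,    # video apps render at 60 Hz
    "com.example.game": 120,    # games keep the high refresh rate
}

def second_refresh_rate(package_name, default=120):
    """Look up the target refresh rate for the application that will own
    the second interface; fall back to a default if no rule matches."""
    return SWITCH_RULES.get(package_name, default)

assert second_refresh_rate("com.example.video") == 60
assert second_refresh_rate("com.example.unknown") == 120
```

Because the lookup is cheap, it can run while the animation is still playing, so the target rate is ready the moment the animation ends.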
In one possible design of the first aspect, the electronic device includes a target application; the first interface is the desktop of the electronic device, and the second interface is an interface displayed after the target application is started; the dynamic effect includes the picture displayed while the electronic device starts the target application.
In one possible design of the first aspect, when the target application is not running in the background of the electronic device, the second interface is the main interface of the target application; when the target application is running in the background, the second interface is the interface of the target application as it was running in the background.
In one possible design of the first aspect, the dynamic effect includes N consecutive frames, where N is greater than or equal to 1, and each frame has a different size; while starting the target application, the electronic device displays the N frames in sequence, with the sizes of the frames increasing from the first frame to the Nth frame.
In this design, while starting the target application the electronic device displays the N frames in sequence, with the frame sizes increasing from the first frame to the Nth frame, which improves the visual appeal of the dynamic effect played during application startup.
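The monotonically increasing frame sizes can be produced by interpolating between a start size and an end size. This sketch assumes linear interpolation and invented example sizes (the patent does not specify the interpolation curve); reversing the list would give the shrinking sequence used when exiting an application.

```python
def frame_sizes(n, start=(60, 60), end=(1080, 2340)):
    """Sizes of the N frames of a launch animation: linearly interpolated
    so each frame is larger than the previous one (assumed curve)."""
    return [
        (round(start[0] + (end[0] - start[0]) * i / (n - 1)),
         round(start[1] + (end[1] - start[1]) * i / (n - 1)))
        for i in range(n)
    ]

sizes = frame_sizes(5)
assert sizes[0] == (60, 60) and sizes[-1] == (1080, 2340)
assert all(sizes[i][0] < sizes[i + 1][0] for i in range(4))
```

A real implementation would more likely apply an easing curve than a straight line, but the strictly increasing size per frame is the property the design describes.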
In one possible design of the first aspect, the electronic device includes a target application; the first interface is an interface of the target application, and the second interface is the desktop of the electronic device; the dynamic effect includes the picture displayed while the electronic device exits the target application.
In one possible design of the first aspect, the dynamic effect includes N consecutive frames, where N is greater than or equal to 1, and each frame has a different size; while exiting the target application, the electronic device displays the N frames in sequence, with the sizes of the frames decreasing from the first frame to the Nth frame.
In this design, while exiting the target application the electronic device displays the N frames in sequence, with the frame sizes decreasing from the first frame to the Nth frame, which improves the visual appeal of the dynamic effect played while exiting the application.
In one possible design of the first aspect, the electronic device includes a source application and a target application; the first interface and the second interface are multitasking interfaces of the electronic device; the first interface includes the interface of the source application as shown in the recent tasks, and the second interface includes the interface of the target application as shown in the recent tasks.
In one possible design of the first aspect, the electronic device includes a desktop launcher, a dynamic effect identification module, and a dynamic effect playing component; the electronic device starting to play the dynamic effect in response to the first operation comprises: the desktop launcher, in response to the first operation, sends a first dynamic effect notification message to the dynamic effect identification module, the message notifying the identification module that the dynamic effect is starting; the identification module sends the dynamic effect playing component a first target message, which indicates the dynamic effect attributes; and the playing component starts playing the dynamic effect according to the first target message.
In this design, the desktop launcher, in response to the first operation, sends a first dynamic effect notification message to the dynamic effect identification module to notify it that the dynamic effect is starting; the identification module sends the playing component a first target message indicating the dynamic effect attributes; and the playing component starts playing the dynamic effect according to that message, which helps reduce the power consumption of the device.
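The launcher → identification module → playing component message flow can be modelled with three small classes. All class names, message shapes, and the attribute payload below are assumptions made for illustration; the patent only specifies which component notifies which.

```python
class EffectPlayer:
    """Dynamic effect playing component: plays according to the first
    target message, which carries the dynamic effect attributes."""
    def __init__(self):
        self.playing = False
        self.attrs = None

    def play(self, target_message):
        self.attrs = target_message
        self.playing = True

class EffectRecognizer:
    """Dynamic effect identification module: turns the launcher's
    notification into a target message for the playing component."""
    def __init__(self, player):
        self.player = player

    def on_effect_start(self, notification):
        # Attribute values here are invented placeholders.
        self.player.play({"duration_ms": 350, "content": notification})

class Launcher:
    """Desktop launcher: reacts to the user's first operation by sending
    the first dynamic effect notification message."""
    def __init__(self, recognizer):
        self.recognizer = recognizer

    def on_first_operation(self):
        self.recognizer.on_effect_start("launch-zoom")

player = EffectPlayer()
launcher = Launcher(EffectRecognizer(player))
launcher.on_first_operation()
assert player.playing and player.attrs["content"] == "launch-zoom"
```

The end-of-effect path described in the next designs runs in the opposite direction: the playing component notifies the desktop manager, which sends the second notification message back to the identification module.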
In one possible design of the first aspect, the electronic device includes a desktop launcher, a dynamic effect identification module, and a dynamic effect playing component; the electronic device detecting that the dynamic effect has finished playing comprises: when the dynamic effect playing component plays the Mth frame, if the distance between the position of the Mth frame on the display screen and the target position is smaller than a preset value, the playing component informs the desktop manager that the dynamic effect has finished playing; the desktop manager sends a second dynamic effect notification message to the identification module; and the identification module detects, according to the second dynamic effect notification message, that the dynamic effect has finished playing.
In this design, when the dynamic effect playing component plays the Mth frame, if the distance between the position of the Mth frame on the display screen and the target position is smaller than a preset value, the playing component informs the desktop manager that the dynamic effect has finished playing; the desktop manager sends a second dynamic effect notification message to the identification module; and the identification module detects, according to that message, that the dynamic effect has finished playing, which reduces the power consumption of the device.
In one possible design of the first aspect, the electronic device includes a desktop launcher, a dynamic effect identification module, and a dynamic effect playing component; the electronic device detecting that the dynamic effect has finished playing comprises: when the duration for which the dynamic effect playing component has played the dynamic effect reaches the preset duration, the playing component informs the desktop manager that the dynamic effect has finished playing; the desktop manager sends a second dynamic effect notification message to the identification module; and the identification module detects, according to the second dynamic effect notification message, that the dynamic effect has finished playing.
In this design, when the duration for which the dynamic effect playing component has played the dynamic effect reaches the preset duration, the playing component informs the desktop manager that the dynamic effect has finished playing; the desktop manager sends a second dynamic effect notification message to the identification module; and the identification module detects, according to that message, that the dynamic effect has finished playing, which reduces the power consumption of the device.
In a second aspect, an electronic device is provided that has the functions of implementing the method of the first aspect. These functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In a third aspect, an electronic device is provided that includes a display screen, a memory, and one or more processors; the display screen, the memory, and the processor are coupled; the memory is configured to store computer program code, the computer program code comprising computer instructions; the computer instructions, when executed by the processor, cause the electronic device to perform the following steps: the electronic device displays a first interface on the display screen at the first refresh rate; the electronic device receives a first operation from a user; in response to the first operation, the electronic device starts playing a dynamic effect; when the electronic device detects that the dynamic effect has finished playing, it switches to displaying a second interface on the display screen at the second refresh rate; the dynamic effect indicates the picture displayed while the electronic device switches from the first interface to the second interface.
In one possible design of the third aspect, the dynamic effect includes N consecutive frames, where N is greater than or equal to 1; the computer instructions, when executed by the processor, cause the electronic device to specifically perform the following steps: the electronic device starts playing the dynamic effect according to dynamic effect attributes; the dynamic effect attributes include at least one of dynamic effect content, dynamic effect size, dynamic effect duration, or a start position and an end position; the start position indicates the position of the first of the N frames on the display screen, and the end position indicates the position of the last of the N frames on the display screen.
In one possible design of the third aspect, the dynamic effect includes N consecutive frames, where N is greater than or equal to 1; the computer instructions, when executed by the processor, cause the electronic device to specifically perform the following steps: when the electronic device plays the Mth frame, if the distance between the position of the Mth frame on the display screen and a target position is smaller than a preset value, the electronic device detects that the dynamic effect has finished playing, where 1 ≤ M ≤ N; the target position indicates the position of the last of the N frames on the display screen.
In one possible design of the third aspect, the computer instructions, when executed by the processor, cause the electronic device to specifically perform the following step: when the duration for which the electronic device has played the dynamic effect reaches a preset duration, the electronic device detects that the dynamic effect has finished playing.
In one possible design of the third aspect, the electronic device includes a target application, and the second interface is an interface of the target application; the computer instructions, when executed by the processor, cause the electronic device to further perform the following steps: the electronic device acquires first information, the first information including the application package name of the target application; the electronic device determines the second refresh rate according to the application package name of the target application and a preset refresh rate switching rule; the preset refresh rate switching rule indicates a mapping between application package names and display refresh rates.
In one possible design of the third aspect, the electronic device includes a target application; the first interface is the desktop of the electronic device, and the second interface is an interface displayed after the target application is started; the dynamic effect includes the picture displayed while the electronic device starts the target application.
In one possible design of the third aspect, when the target application is not running in the background of the electronic device, the second interface is the main interface of the target application; when the target application is running in the background, the second interface is the interface of the target application as it was running in the background.
In one possible design of the third aspect, the dynamic effect includes N consecutive frames, where N is greater than or equal to 1, and each frame has a different size; while starting the target application, the electronic device displays the N frames in sequence, with the sizes of the frames increasing from the first frame to the Nth frame.
In one possible design of the third aspect, the electronic device includes a target application; the first interface is an interface of the target application, and the second interface is the desktop of the electronic device; the dynamic effect includes the picture displayed while the electronic device exits the target application.
In one possible design of the third aspect, the dynamic effect includes N consecutive frames, where N is greater than or equal to 1, and each frame has a different size; while exiting the target application, the electronic device displays the N frames in sequence, with the sizes of the frames decreasing from the first frame to the Nth frame.
In one possible design of the third aspect, the electronic device includes a source application and a target application; the first interface and the second interface are multitasking interfaces of the electronic device; the first interface includes the interface of the source application as shown in the recent tasks, and the second interface includes the interface of the target application as shown in the recent tasks.
In one possible design of the third aspect, the electronic device includes a desktop launcher, a dynamic effect identification module, and a dynamic effect playing component; the desktop launcher, in response to the first operation, sends a first dynamic effect notification message to the dynamic effect identification module, the message notifying the identification module that the dynamic effect is starting; the identification module sends the dynamic effect playing component a first target message, which indicates the dynamic effect attributes; and the playing component starts playing the dynamic effect according to the first target message.
In one possible design of the third aspect, the electronic device includes a desktop launcher, a dynamic effect identification module, and a dynamic effect playing component; when the dynamic effect playing component plays the Mth frame, if the distance between the position of the Mth frame on the display screen and the target position is smaller than a preset value, the playing component informs the desktop manager that the dynamic effect has finished playing; the desktop manager sends a second dynamic effect notification message to the identification module; and the identification module detects, according to that message, that the dynamic effect has finished playing.
In one possible design of the third aspect, the electronic device includes a desktop launcher, a dynamic effect identification module, and a dynamic effect playing component; the electronic device detecting that the dynamic effect has finished playing comprises: when the duration for which the dynamic effect playing component has played the dynamic effect reaches the preset duration, the playing component informs the desktop manager that the dynamic effect has finished playing; the desktop manager sends a second dynamic effect notification message to the identification module; and the identification module detects, according to that message, that the dynamic effect has finished playing.
In a fourth aspect, there is provided a computer readable storage medium having stored therein computer instructions which, when run on a computer, cause the computer to perform the display method of any one of the first aspects above.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the display method of any of the first aspects above.
The technical effects of any of the designs of the second to fifth aspects may be understood with reference to the technical effects of the corresponding designs of the first aspect, and are not repeated here.
Drawings
Fig. 1 is a first schematic diagram of screen refresh rate switching according to an embodiment of the present application;
Fig. 2 is a second schematic diagram of screen refresh rate switching according to an embodiment of the present application;
Fig. 3 is a third schematic diagram of screen refresh rate switching according to an embodiment of the present application;
Fig. 4a is a first schematic diagram of interface switching according to an embodiment of the present application;
Fig. 4b is a schematic diagram of dynamic effect playing according to an embodiment of the present application;
Fig. 5 is a second schematic diagram of interface switching according to an embodiment of the present application;
Fig. 6 is a third schematic diagram of interface switching according to an embodiment of the present application;
Fig. 7 is a fourth schematic diagram of interface switching according to an embodiment of the present application;
Fig. 8 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
Fig. 9 is a schematic diagram of a software framework of an electronic device according to an embodiment of the present application;
Fig. 10 is a first schematic diagram of an interface display processing flow of an electronic device according to an embodiment of the present application;
Fig. 11 is a second schematic diagram of an interface display processing flow of an electronic device according to an embodiment of the present application;
Fig. 12 is a third schematic diagram of an interface display processing flow of an electronic device according to an embodiment of the present application;
Fig. 13 is a first schematic flowchart of a display method according to an embodiment of the present application;
Fig. 14 is a second schematic flowchart of a display method according to an embodiment of the present application;
Fig. 15 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the present application, unless otherwise specified, "/" indicates an "or" relationship between the associated objects; for example, A/B may mean A or B. The term "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. Also, unless otherwise indicated, "a plurality of" means two or more. "At least one of" the following items means any combination of these items, including any combination of single items or plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural. In addition, to clearly describe the technical solutions of the embodiments of the present application, words such as "first" and "second" are used to distinguish between identical or similar items having substantially the same function and effect. Those skilled in the art will appreciate that the words "first", "second", and the like do not limit the quantity or the order of execution, and do not indicate a necessary difference. Meanwhile, in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate an example, an illustration, or a description. Any embodiment or design described as "exemplary" or "for example" should not be construed as being preferred or advantageous over other embodiments or designs. Rather, the use of such words is intended to present related concepts in a concrete fashion that is easy to understand.
At present, when an electronic device starts an application, a start-up dynamic effect is played; likewise, an exit dynamic effect is played when the application exits. The start-up dynamic effect refers to the process from the electronic device receiving the user's start operation to the electronic device displaying the interface of the application; the exit dynamic effect refers to the process from the electronic device receiving the user's exit operation to the electronic device displaying the desktop interface. In the related art, the electronic device often switches its screen refresh rate while a dynamic effect (such as the start-up dynamic effect or the exit dynamic effect) is being played. Because a dynamic effect is a process in which the interface is changing, switching the screen refresh rate during dynamic effect playback causes the electronic device to stutter, which affects the user experience.
For example, as shown in fig. 1, in the case where the electronic device enters a video application from the desktop, the electronic device switches the screen refresh rate while playing the start-up dynamic effect of the video application. As can be seen from fig. 1, since the start-up dynamic effect of the video application is a dynamic process, switching the screen refresh rate directly from 120 Hz to 60 Hz during playback of the start-up dynamic effect causes a stutter.
Specifically, there are two main causes of stutter during dynamic effect playback. On the one hand, in the case where the electronic device uses hardware refresh, as shown in fig. 2, the hardware of the electronic device needs about two frames to switch the screen refresh rate, which causes frame loss and therefore a stutter. On the other hand, when the screen refresh rate of the electronic device is switched, the change in the frame interval may make the display feel unsmooth to the user, that is, appear stuttered, especially when switching from a high refresh rate (e.g., 120 Hz) to a low refresh rate (e.g., 60 Hz).
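The second cause — the change in frame interval — can be made concrete with a small calculation (illustrative only): dropping from 120 Hz to 60 Hz doubles the time between consecutive frames mid-animation.

```python
def frame_interval_ms(refresh_rate_hz):
    """Interval between two consecutive frames, in milliseconds,
    for a given screen refresh rate."""
    return 1000.0 / refresh_rate_hz

# At 120 Hz a frame arrives roughly every 8.33 ms; at 60 Hz, roughly
# every 16.67 ms. The abrupt doubling of the interval during an
# animation is what the user perceives as unsmooth motion.
high_interval = frame_interval_ms(120)
low_interval = frame_interval_ms(60)
```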
To solve the above technical problem, a fixed delay mechanism is proposed in the related art. In the fixed delay mechanism, the electronic device delays the switching of the screen refresh rate by a fixed time. However, because the durations of the application start-up dynamic effect and the application exit dynamic effect differ, delaying the switch by a single fixed time can avoid stutter only for one of them, not for both at the same time. Taking the start-up and exit dynamic effects of Honor applications as an example, the duration of the start-up dynamic effect is 400 ms and the duration of the exit dynamic effect is 700 ms. If the switch of the screen refresh rate is delayed by 400 ms, stutter is avoided for the start-up dynamic effect but not for the exit dynamic effect. If the switch is delayed by 700 ms, stutter is avoided for both; however, if the user performs a touch operation (such as sliding or clicking) within the 300 ms after the start-up dynamic effect ends, the deferred switch of the screen refresh rate still causes a stutter.
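The mismatch described above can be checked numerically. Assuming the 400 ms start-up and 700 ms exit durations quoted in the example, a single fixed delay either fails to cover the longer effect or keeps blocking the switch long after the shorter effect ends:

```python
def delay_covers(delay_ms, effect_duration_ms):
    """A fixed delay prevents mid-animation switching only when it is
    at least as long as the dynamic effect being played."""
    return delay_ms >= effect_duration_ms

START_MS, EXIT_MS = 400, 700  # example durations quoted above

# A 400 ms delay covers the start-up effect but not the exit effect;
# a 700 ms delay covers both, but leaves a 300 ms window after the
# start-up effect ends in which a user touch still collides with the
# deferred refresh rate switch.
leftover_after_startup_ms = EXIT_MS - START_MS
```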
In addition, because the performance of different devices differs, the switching time of the screen refresh rate may be advanced or delayed. The fixed delay method therefore requires different delays to be set for different devices, which increases the workload of developers and reduces research and development efficiency.
Based on this, an embodiment of the present application provides a display method that enables the electronic device to switch the screen refresh rate only after the dynamic effect has finished playing, thereby solving the problem of stutter when the electronic device switches the screen refresh rate.
For example, as shown in fig. 3, in the case where the electronic device enters a video application from the desktop, the electronic device switches the screen refresh rate after the start-up dynamic effect of the video application has finished playing. As can be seen from fig. 3, while the electronic device plays the start-up dynamic effect of the video application, the screen refresh rate remains 120 Hz; after the start-up dynamic effect has finished playing, the screen refresh rate is switched to 60 Hz.
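The behaviour in fig. 3 — hold the current rate while the effect plays, apply the switch only once the effect ends — can be sketched as a small state holder. The class and method names are illustrative assumptions, not names from the patent:

```python
class RefreshRateController:
    """Defers a requested refresh rate switch until the current
    dynamic effect has finished playing."""

    def __init__(self, current_hz=120):
        self.current_hz = current_hz
        self.pending_hz = None
        self.effect_playing = False

    def on_effect_start(self):
        self.effect_playing = True

    def request_rate(self, hz):
        # While a dynamic effect plays, only remember the request.
        if self.effect_playing:
            self.pending_hz = hz
        else:
            self.current_hz = hz

    def on_effect_end(self):
        # Apply the deferred switch once playback has finished.
        self.effect_playing = False
        if self.pending_hz is not None:
            self.current_hz = self.pending_hz
            self.pending_hz = None
```

With this shape, a 60 Hz request arriving mid-effect leaves the screen at 120 Hz until `on_effect_end` fires, matching the timeline shown in fig. 3.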
It should be understood that the solution of the embodiments of the present application is applicable to switching the screen refresh rate in a dynamic effect scenario. A dynamic effect scenario refers to a scenario in which the interface on the display screen of the electronic device changes; that is, when the interface of the display screen changes, the electronic device may switch the screen refresh rate. Illustratively, while displaying a first interface, the electronic device refreshes the display screen using a first refresh rate (e.g., 120 Hz); when the electronic device switches from the first interface to a second interface, the electronic device refreshes the display screen using a second refresh rate (e.g., 60 Hz). The process of switching from the first interface to the second interface is the dynamic effect scenario described in the embodiments of the present application.
In some embodiments, the dynamic effect scenario may be, for example, an application start scenario, an application exit scenario, an application switching scenario (e.g., switching from one application to another), a scenario of switching between recent tasks in a multi-tasking interface, and so on.
The following illustrates the dynamic effect scenarios described in the embodiments of the present application, taking a mobile phone as an example of the electronic device. It should be understood that the scenarios described in the following embodiments are only some examples of the embodiments of the present application and are not meant to limit the present application; other dynamic effect scenarios suitable for switching the screen refresh rate also fall within the protection scope of the embodiments of the present application.
Taking the application as a "communication" application and the dynamic effect scenario as the scenario of starting the "communication" application as an example, as shown in (a) in fig. 4a, in response to a user operation on the icon 101 of the "communication" application in the home screen interface of the mobile phone (for example, the user clicking the icon 101), the mobile phone displays the interface 102 shown in (b) in fig. 4a. The interface 102 is the interface displayed after the "communication" application is started. In some embodiments, as shown in (b) in fig. 4a, the interface 102 may be an interface of a user communication list. In the scenario of starting the "communication" application, the start-up dynamic effect of the "communication" application is the process from the user clicking the icon 101 to the mobile phone displaying the interface 102.
It should be noted that dynamic effect playing has a fixed duration (e.g., 400 ms). A series of image frames is played continuously within this fixed duration, producing the dynamic effect. Taking the above start-up dynamic effect as an example, the start-up dynamic effect is a process in which the image displayed in the interface 102 grows from small to large. Specifically, when the user clicks the icon 101, the start-up dynamic effect begins and the image displayed in the interface 102 starts playing; when the image displayed in the interface 102 covers the entire screen (or display screen), the start-up dynamic effect ends.
For example, take the case where the mobile phone continuously plays six image frames within the fixed duration of dynamic effect playing. After the user clicks the icon 101 of the "communication" application, the mobile phone continuously plays the image frames of frame 1 to frame 6 as shown in fig. 4b. The process of the mobile phone playing the image frames of frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 is the process of the start-up dynamic effect.
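The small-to-large start-up dynamic effect over six frames can be sketched as a scale interpolation. The frame count matches the example above; the start/end scale factors and the linear easing are assumptions for illustration (real launchers typically use a non-linear easing curve):

```python
def startup_scales(frames=6, start=0.2, end=1.0):
    """Scale factor of the application image for each frame of the
    start-up dynamic effect, growing linearly from `start` (small)
    to `end` (covering the full screen)."""
    step = (end - start) / (frames - 1)
    return [round(start + i * step, 3) for i in range(frames)]
```

Frame 1 draws the image small, and frame 6 draws it covering the whole display, at which point the start-up dynamic effect ends.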
In addition, in the start-up dynamic effect scenario, in some embodiments, the mobile phone refreshes the display screen using a first preset refresh rate (e.g., 120 Hz) while displaying the home screen interface; when the user starts the "communication" application and the interface of the "communication" application is displayed, the mobile phone refreshes the display screen using a second preset refresh rate (e.g., 60 Hz). In other embodiments, the mobile phone refreshes the display screen using the first preset refresh rate (e.g., 120 Hz) while displaying the home screen interface; when the home screen interface receives no user operation for a certain time (that is, the home screen interface does not change and is in a static state), the mobile phone may reduce the screen refresh rate (for example, to 60 Hz) and refresh the display screen using the reduced rate. When the mobile phone then receives a user operation (such as an operation to start the "communication" application), the mobile phone first raises the screen refresh rate to the first preset refresh rate (e.g., 120 Hz); then, after the "communication" application is started, the mobile phone refreshes the display screen using the second preset refresh rate (e.g., 60 Hz) and displays the interface of the "communication" application.
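The policies in this paragraph (drop to a low rate when the interface has been static for a while, boost back to the high rate on a user operation or while a start-up effect plays) can be sketched as a simple rate-selection function. The idle threshold and the rate values are illustrative assumptions, not values from the patent:

```python
def choose_refresh_rate(idle_ms, touch_active, in_startup_effect,
                        high_hz=120, low_hz=60, idle_threshold_ms=3000):
    """Pick a screen refresh rate from the interaction state.

    - A touch operation or a playing start-up effect forces the high rate.
    - A static interface that has been idle past the threshold drops
      to the low rate.
    - Otherwise the high rate is kept.
    """
    if touch_active or in_startup_effect:
        return high_hz
    if idle_ms >= idle_threshold_ms:
        return low_hz
    return high_hz
```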
In the start-up dynamic effect scenario, the start-up dynamic effect may be the dynamic effect played when the user starts the application for the first time, or the dynamic effect played when the user starts the application not for the first time. Here, starting for the first time means that the application is running neither in the foreground nor in the background; starting not for the first time means that the application is already running in the background, in which case the start-up dynamic effect may also be understood as the process of the application switching from background to foreground running.
Taking the application as a "communication" application and the dynamic effect scenario as the scenario of exiting the "communication" application as an example, the mobile phone displays the interface 103 shown in (a) in fig. 5, where the interface 103 is the interface displayed after the "communication" application is started. For example, as shown in (a) in fig. 5, the interface 103 may be an interface of a user communication list. Then, in response to the user's exit operation on the "communication" application, the mobile phone displays the interface 104 shown in (b) in fig. 5. The interface 104 may be, for example, the home screen interface of the mobile phone.
In some embodiments, the exit operation may be, for example, one of a gesture operation, a voice operation, or a touch operation. The touch operation may be, for example, a click operation, a slide operation, or the like. Taking the exit operation as a slide operation as an example, as shown in (b) in fig. 5, the exit operation may be, for example, an operation of the user sliding up on the interface of the "communication" application.
In the scenario of exiting the "communication" application, the exit dynamic effect of the "communication" application is the process from the user sliding up on the interface of the "communication" application to the mobile phone displaying the home screen interface. In some embodiments, in combination with the above embodiments, the exit dynamic effect is a process in which the image displayed in the interface 102 shrinks from large to small. Specifically, when the user slides up on the interface of the "communication" application, the exit dynamic effect begins and the image displayed in the interface 102 starts to shrink; when the image displayed in the interface 102 has completely exited and the mobile phone displays the home screen interface, the exit dynamic effect ends.
Taking the scenario in which the mobile phone switches from the home screen interface to the multi-tasking interface as an example, the mobile phone displays the interface 105 shown in (a) in fig. 6, where the interface 105 is the home screen interface of the mobile phone. Then, in response to the user's operation on the home screen interface, the mobile phone displays the interface 106 shown in (b) in fig. 6, where the interface 106 is the multi-tasking interface of the mobile phone. The interface 106 includes the interface of an application 1 running in the background, where application 1 is the application that has been running in the background for the shortest duration. In this dynamic effect scenario, the dynamic effect refers to the process from the user's operation on the home screen interface to the mobile phone displaying the multi-tasking interface.
It should be noted that, for the user's operation on the home screen interface, reference may be made to the illustration of the exit operation in the above embodiments, which is not repeated here. Taking this operation as a slide operation as an example, as shown in (b) in fig. 6, the operation may be, for example, the user's slide-up operation on the home screen interface.
Taking the dynamic effect scenario as the scenario of switching between recent tasks in the multi-tasking interface of the mobile phone as an example, the mobile phone displays the interface 107 shown in (a) in fig. 7, where the interface 107 is the multi-tasking interface of the mobile phone. The interface 107 includes the interface of an application 1 running in the background, where application 1 is the application that has been running in the background for the shortest duration. Then, in response to the user's operation on the interface 107, the mobile phone displays the interface 108 shown in (b) in fig. 7, where the interface 108 includes the interface of an application 2 running in the background, and application 2 has been running in the background for a longer duration than application 1. In this dynamic effect scenario, the dynamic effect refers to the process from the user's operation on the interface 107 to the mobile phone displaying the interface 108.
It should be noted that, for the user's operation on the interface 107, reference may be made to the illustration of the exit operation in the above embodiments, which is not repeated here. Taking this operation as a slide operation as an example, as shown in (b) in fig. 7, the operation may be, for example, a rightward slide operation.
The display method provided in the embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The display method provided in the embodiments of the present application may be applied to an electronic device with a display function. The electronic device may be a mobile phone, an action camera (e.g., a GoPro), a digital camera, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, a vehicle-mounted device, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), or an augmented reality (AR)/virtual reality (VR) device; the embodiments of the present application do not limit the specific form of the electronic device.
Fig. 8 is a schematic diagram of an electronic device 100. Wherein the electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the present embodiment does not constitute a specific limitation on the electronic apparatus 100. In other embodiments, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and command center of the electronic device 100. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the connection relationship between the modules illustrated in this embodiment is only an example and does not limit the structure of the electronic device. In other embodiments, the electronic device may also use an interfacing manner different from those in the foregoing embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel (or display substrate). The display panel may employ an organic light-emitting diode (OLED). In the embodiments of the present application, the display screen is a low-temperature polycrystalline oxide (LTPO) display screen; the LTPO display screen includes a display panel in which the display unit (e.g., a TFT) is an LTPO TFT. For an illustration of LTPO, reference may be made to the above embodiments, and details are not repeated here.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, and so on.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device may play or record video in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between human brain neurons, it can rapidly process input information and can also continuously self-learn. Applications such as intelligent cognition of the electronic device can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals. The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals. The microphone 170C, also referred to as a "mic" or "mike", is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, audio, video, etc. files are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code, where the program code includes instructions. The processor 110 implements the various functional applications and data processing of the electronic device by executing the instructions stored in the internal memory 121. For example, in the embodiments of the present application, the internal memory 121 may include a storage program area and a storage data area.
The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device (e.g., audio data, phonebook, etc.), and so forth. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The motor 191 may generate a vibration prompt. The motor 191 may be used for incoming-call vibration alerting as well as for touch vibration feedback. The indicator 192 may be an indicator light, and may be used to indicate a charging state, a change in charge level, a message, a missed call, a notification, and the like. The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to implement contact with and separation from the electronic device. The electronic device may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, and the like.
In some embodiments, the software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, or a cloud architecture. In this embodiment, the software structure of the electronic device 100 is illustrated by taking an Android system with a layered architecture as an example.
Fig. 9 is a software structure diagram of an electronic device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, namely, from top to bottom: an application layer (or app layer), an application framework layer, the Android runtime and system library, a hardware abstraction layer, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 9, the application package may include applications such as phone, mailbox, calendar, and camera. In some embodiments, the application layer further includes a desktop launcher (Launcher). The Launcher is the desktop launcher of the Android system, and the desktop interface (UI) of the Android system may also be collectively referred to as the Launcher. The desktop interface includes icons of the various applications installed on the electronic device. Illustratively, the desktop interface includes a phone icon, a mailbox icon, a calendar icon, a camera icon, and the like.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application layer applications. The application framework layer includes a number of predefined functions.
As shown in fig. 9, the application framework layer may include a window manager, a refresh rate switching module, a dynamic effect identification module, an image composition system, a view system, a package manager, an input manager, an activity manager, a resource manager, a dynamic effect play component, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The refresh rate switching module is used for adjusting the screen refresh rate.
The dynamic effect identification module is used to identify the dynamic effect of interface switching in a scenario where the electronic device switches interfaces. For example, it identifies the start of a dynamic effect, the end of a dynamic effect, and the like.
The image composition system is used to control image composition and to generate vertical synchronization (Vsync) signals. In some embodiments, the image composition system further comprises an image cache queue. Illustratively, the application draws an image through the view system and renders the drawn image through the image rendering library. The application then sends the drawn and rendered image to the image cache queue in the image composition system; the image cache queue is used to cache the images drawn and rendered by the application. Each time a Vsync signal arrives, the image composition system sequentially acquires one frame of image to be composed from the image cache queue and then performs image composition.
The image composition system includes a composition thread, a Vsync thread, and a buffer queue (queue buffer) thread. The composition thread is woken up by the Vsync signal to perform composition. The Vsync thread is used to generate the next Vsync signal based on a Vsync signal request. The buffer queue thread is used to store buffers, generate Vsync signal requests, wake up the composition thread, and the like.
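For ease of understanding, the cooperation between the cache queue and the Vsync-driven composition step can be sketched as a minimal Python simulation (the class and function names here are illustrative, not the actual Android implementation):

```python
from collections import deque

class BufferQueue:
    """Caches the frames that the application has drawn and rendered."""
    def __init__(self):
        self._frames = deque()

    def enqueue(self, frame):
        # Called by the application after drawing and rendering a frame.
        self._frames.append(frame)

    def acquire(self):
        # Called by the composition step; returns the oldest pending frame.
        return self._frames.popleft() if self._frames else None

def on_vsync(queue, composed):
    """Composition step performed each time a Vsync signal arrives:
    take one frame to be composed from the cache queue and compose it."""
    frame = queue.acquire()
    if frame is not None:
        composed.append(f"composed({frame})")

queue, composed = BufferQueue(), []
for f in ["frame1", "frame2"]:
    queue.enqueue(f)          # application side: enqueue rendered frames
on_vsync(queue, composed)     # first Vsync: frame1 is composed
on_vsync(queue, composed)     # second Vsync: frame2 is composed
```

The sketch only captures the ordering guarantee: frames are composed one per Vsync signal, in the order they were enqueued.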
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The package manager is used for program management within the system, for example: application installation, uninstallation, upgrade, and the like.
The input manager is used to manage the programs of input devices. For example, the input manager may identify input operations such as a mouse click operation, a keyboard input operation, and a touch slide operation.
The activity manager is used to manage the life cycle of each application and the navigation back function. It is responsible for the creation of the Android main thread and the maintenance of the life cycle of each application.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The dynamic effect playing component is used for playing dynamic effects. Illustratively, the dynamic effect playing component is used for playing dynamic effects when the application is started (such as starting dynamic effects) or dynamic effects when the application is exited (such as exiting dynamic effects).
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in virtual machines. The virtual machine executes java files of the application layer and the application framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: an image rendering library, an image synthesis library, a function library, a media library, an input processing library and the like.
The image rendering library is used for rendering two-dimensional or three-dimensional images. The image composition library is used for composition of two-dimensional or three-dimensional images.
In a possible implementation, the application renders an image through the image rendering library, and then sends the rendered image to the cache queue of the image composition system. Each time the Vsync signal arrives, the image composition system (e.g., SurfaceFlinger) sequentially acquires one frame of image to be composed from the cache queue, and then performs image composition through the image composition library.
The function library provides macros, type definitions, string operation functions, mathematical computation functions, input-output functions, and the like used in the C language.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The input processing library is a library for processing input devices, and can implement mouse, keyboard, and touch input processing, among others.
The hardware abstraction layer may include a plurality of library modules, which may be, for example, a hardware composer (HWC), a camera library module, etc. The Android system can load the corresponding library module for the device hardware, so that the application framework layer can access the device hardware. The device hardware may include, for example, the display screen and the camera of the electronic device.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a Touch Panel (TP) driver, a display driver, a Bluetooth driver, a WIFI driver, a keyboard driver, a shared memory driver, a camera driver and the like.
The hardware may be an audio device, a bluetooth device, a camera device, a sensor device, etc.
The workflow of the electronic device software and hardware is illustrated below in connection with the scenario in which the electronic device switches interfaces.
When the touch sensor in the touch panel receives a touch operation, the kernel layer processes the touch operation into an original input event (including information such as touch coordinates, touch strength, and a time stamp of the touch operation). The original input event is stored at the kernel layer. Through the input processing library, the kernel layer determines the focus application according to the information of the original input event (including the operation type, the touch-point position, and the like) and the current focus, and sends the parsed information to the focus application. The focus may be the touch point of a touch operation or the click position of a mouse click operation. The focus application is the application running in the foreground of the electronic device, or the application corresponding to the touch position of the touch operation. The focus application determines the control corresponding to the original input event according to the parsed information (such as the touch-point position) of the original input event.
Taking the case where the touch operation is a click operation and the control corresponding to the click operation is the icon of application 1 as an example: application 1 calls the image rendering library in the system library through the view system of the application framework layer to draw and render an image. Application 1 then sends the drawn and rendered image to the cache queue of the image composition system. The image composition system composes the drawn and rendered image into the interface of application 1 through the image composition library in the system library. The image composition system then invokes the display driver of the kernel layer so that the screen (display) shows the corresponding interface of application 1.
For ease of understanding, some concepts related to the embodiments of the present application are described below by way of example for reference.
1. Frame: refers to a single picture, the smallest unit in interface display. A frame can be understood as a still picture, and displaying multiple consecutive frames in rapid succession creates the illusion of motion. The frame rate refers to the number of frames refreshed in 1 second, and can also be understood as the number of times per second the image processor in the electronic device refreshes the picture. A high frame rate produces smoother and more realistic animation. The more frames per second, the smoother the displayed motion.
It should be noted that, before the frame is displayed on the interface, it is usually required to undergo processes such as drawing, rendering, and compositing.
2. Frame drawing: refers to the picture drawing of the interface display. The display interface may be composed of one or more views, and each view may be drawn by a visual control of the view system. Each view is composed of sub-views, and each sub-view corresponds to a widget in the view; for example, one sub-view corresponds to a symbol in a picture view.
3. Frame rendering: refers to performing a coloring operation on the drawn view, adding a 3D effect, or the like. For example, the 3D effect may be a light effect, a shadow effect, or a texture effect.
4. Frame composition: is the process of combining one or more rendered views into a display interface.
In order to improve display smoothness and reduce stutter, electronic devices generally display based on Vsync signals, so as to synchronize the processes of drawing, rendering, composing, and refreshing images. Those skilled in the art will appreciate that the Vsync signal is a periodic signal, and that the Vsync signal period may be set according to the refresh rate of the display screen. For example, when the refresh rate of the display screen is 60 Hz, the Vsync signal period may be 16.6 ms; that is, the electronic device generates a control signal every 16.6 ms to trigger the Vsync signal.
In addition, it should be noted that the Vsync signal may be divided into a software Vsync signal and a hardware Vsync signal. The software Vsync signal includes Vsync-APP and Vsync-SF. Vsync-APP is used to trigger the draw rendering process. Vsync-SF is used to trigger the composition process. The hardware Vsync signal (Vsync-HW) is used to trigger the screen display refresh process.
Typically, the software Vsync signal and the hardware Vsync signal remain periodically synchronized. Taking a change between 120 Hz and 60 Hz as an example, if Vsync-HW is switched from 120 Hz to 60 Hz, Vsync-APP and Vsync-SF change synchronously from 120 Hz to 60 Hz as well.
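The relationship between the refresh rate and the Vsync signal period, and the synchronization of the software Vsync signals with the hardware Vsync signal, can be sketched in a few lines of Python (the names are illustrative, not the actual Android implementation):

```python
def vsync_period_ms(refresh_rate_hz):
    # The Vsync period is the reciprocal of the refresh rate, in milliseconds:
    # 60 Hz -> ~16.6 ms, 120 Hz -> ~8.3 ms.
    return 1000.0 / refresh_rate_hz

class VsyncSignals:
    """Keeps the software Vsync signals (Vsync-APP, Vsync-SF) periodically
    synchronized with the hardware Vsync signal (Vsync-HW)."""
    def __init__(self, rate_hz):
        self.app_hz = self.sf_hz = self.hw_hz = rate_hz

    def switch_hw(self, rate_hz):
        # When Vsync-HW switches, Vsync-APP and Vsync-SF change synchronously.
        self.app_hz = self.sf_hz = self.hw_hz = rate_hz

signals = VsyncSignals(120)
signals.switch_hw(60)   # 120 Hz -> 60 Hz: all three signals follow
```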
Fig. 10 is a schematic diagram of an electronic device interface display processing flow according to an embodiment of the present application. The content displayed by the electronic device corresponds to frame 1, frame 2, and frame 3 in sequence in time order.
Specifically, taking the display of frame 1 as an example, the application of the electronic device draws and renders frame 1 through the view system of the application framework layer. After the drawing and rendering of frame 1 are completed, the application sends the rendered frame 1 to the image composition system. The image composition system composes the rendered frame 1. After frame 1 is composed, the electronic device may display the content corresponding to frame 1 on the screen (e.g., the display) by calling the display driver of the kernel layer. It should be noted that frame 2 and frame 3 are also drawn, rendered, composed, and displayed through a process similar to that of frame 1, which is not repeated here. Each frame in fig. 10 lags by 2 Vsync signal periods from drawing to display, i.e., the display of the electronic device has hysteresis.
Fig. 11 is a schematic diagram of an electronic device interface display processing flow according to an embodiment of the present application. The content displayed by the electronic device corresponds to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in sequence in time order.
Specifically, taking the display of frame 2 as an example, the application of the electronic device draws and renders frame 2 through the view system of the application framework layer. After the drawing and rendering of frame 2 are completed, the application sends the rendered frame 2 to the image composition system. The image composition system composes the rendered frame 2. After frame 2 is composed, the electronic device may display the content corresponding to frame 2 by calling the display driver of the kernel layer. Frame 3, frame 4, frame 5, and frame 6 are also drawn, rendered, composed, and displayed through a process similar to that of frame 2, which is not repeated here.
When frame 3 is being drawn and rendered, the screen refresh rate switching module of the electronic device decides to switch the refresh rate (e.g., from 120 Hz to 60 Hz); when frame 4 is being drawn and rendered, the screen refresh rate is switched, the period duration of the Vsync signal corresponding to the drawing and rendering of frame 4 is prolonged, and the switch of the screen refresh rate is completed.
As can be seen from fig. 11, each frame in fig. 11 lags by 2 Vsync signal periods from drawing to display. As can be seen from the above embodiment, the switch of the screen refresh rate of the electronic device occurs during the playing of the dynamic effect; referring to fig. 11, the display screen of the electronic device sequentially displays frame 2 and frame 3 during the playing of the dynamic effect. When the screen refresh rate is switched during the playing of the dynamic effect, the frame interval corresponding to the drawing and rendering of frame 2 is inconsistent with the frame interval corresponding to the display of frame 2 (for example, the frame interval corresponding to the display of frame 2 is longer than the frame interval corresponding to the drawing and rendering of frame 2). Likewise, the frame interval corresponding to the drawing and rendering of frame 3 is inconsistent with the frame interval corresponding to the display of frame 3, so that stutter occurs during the playing of the dynamic effect.
Fig. 12 is a schematic diagram of an electronic device interface display process according to an embodiment of the present application. The contents displayed by the electronic device correspond to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6 in sequence.
When frame 4 is being drawn and rendered, the screen refresh rate switching module of the electronic device decides to switch the refresh rate (e.g., from 120 Hz to 60 Hz); when frame 5 is being drawn and rendered, the screen refresh rate is switched, the period duration of the Vsync signal corresponding to the drawing and rendering of frame 5 is prolonged, and the switch of the screen refresh rate is completed.
As can be seen from fig. 12, the switch of the screen refresh rate of the electronic device occurs after the playing of the dynamic effect, i.e., immediately after the dynamic effect finishes playing. Referring to fig. 12, during the playing of the dynamic effect, the display screen of the electronic device displays frame 0, frame 1, and frame 2. At this time, the frame interval corresponding to the drawing and rendering of frame 1 is consistent with the frame interval corresponding to the display of frame 1. Likewise, the frame interval corresponding to the drawing and rendering of frame 2 is consistent with the frame interval corresponding to the display of frame 2, so that stutter during the playing of the dynamic effect can be avoided.
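The difference between the two switching timings in fig. 11 and fig. 12 can be modeled numerically: a frame drawn in one Vsync slot is displayed 2 slots later, so a draw/display interval mismatch arises only if the Vsync period changes within that 2-slot window. A minimal Python sketch under this simplified model (the slot values and frame indices are illustrative):

```python
def animation_stutters(slot_periods, anim_frames, lag=2):
    """A frame drawn in Vsync slot i is displayed in slot i + lag.
    The animation stutters if, for any animation frame, the Vsync period
    at drawing time differs from the Vsync period at display time."""
    return any(slot_periods[i] != slot_periods[i + lag] for i in anim_frames)

P120, P60 = 8.3, 16.6  # Vsync periods (ms) at 120 Hz and 60 Hz

# Fig. 11 style: refresh rate switched while animation frames are in flight.
mid_switch = [P120, P120, P120, P60, P60, P60]
# Fig. 12 style: refresh rate switched only after the animation's last frame.
post_switch = [P120, P120, P120, P120, P120, P60]

anim = range(1, 3)  # animation frames 1 and 2 (displayed in slots 3 and 4)
```

Under this model, `animation_stutters(mid_switch, anim)` holds while `animation_stutters(post_switch, anim)` does not, matching the conclusion drawn from the two figures.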
It should be noted that, when the screen refresh rate of the display screen is 120 Hz, the Vsync signal period may be 8.3 ms; that is, the electronic device generates a control signal every 8.3 ms to trigger the Vsync signal. When the refresh rate of the display screen is 60 Hz, the Vsync signal period may be 16.6 ms; that is, the electronic device generates a control signal every 16.6 ms to trigger the Vsync signal.
In addition, take as an example a display screen of the electronic device with a size of 1080×1980 (in pixels), that is, 1080 pixels in the horizontal direction and 1980 pixels in the vertical direction, and an application start dynamic effect with a duration of 400 ms. Then, at a 60 Hz refresh rate, the start dynamic effect requires 24 frames to be drawn. Typically, the displayed content advances 83 pixels per frame in the vertical direction and 45 pixels per frame in the horizontal direction. If the screen refresh rate of the electronic device is switched to 120 Hz, the step per frame is halved, i.e., 42 pixels per frame in the vertical direction and 23 pixels per frame in the horizontal direction.
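The frame count and per-frame displacement figures above follow from simple arithmetic, as the following sketch shows (rounding up to whole pixels is an assumption made here to match the stated values):

```python
import math

def frames_in_animation(duration_ms, refresh_hz):
    # Number of frames drawn over the animation duration at a given refresh rate.
    return duration_ms * refresh_hz // 1000

def px_per_frame(distance_px, n_frames):
    # Per-frame displacement across the animation, rounded up to whole pixels.
    return math.ceil(distance_px / n_frames)

n60 = frames_in_animation(400, 60)     # 400 ms at 60 Hz -> 24 frames
step_v60 = px_per_frame(1980, n60)     # vertical: ceil(1980 / 24) = 83 px
step_h60 = px_per_frame(1080, n60)     # horizontal: 1080 / 24 = 45 px

n120 = frames_in_animation(400, 120)   # 400 ms at 120 Hz -> 48 frames
step_v120 = px_per_frame(1980, n120)   # vertical: ceil(41.25) = 42 px
step_h120 = px_per_frame(1080, n120)   # horizontal: ceil(22.5) = 23 px
```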
For easy understanding, the following describes a process of interaction between each module involved in the display method provided in the embodiment of the present application in connection with the software architecture diagram shown in fig. 9.
Fig. 13 is a schematic diagram illustrating a process of interaction between each module in the display method according to the embodiment of the present application. As shown in fig. 13, the electronic device may include: the system comprises a desktop starter, a dynamic effect identification module, a dynamic effect manager, a dynamic effect playing component, a screen refresh rate switching module, an image synthesis system, a hardware synthesizer and a display driver. For example, the display method may include S201-S215.
S201, receiving a first operation of a user by a desktop starter.
The first operation may be one of a voice operation, a gesture operation, or a touch operation. The touch operation may be, for example, a click operation or a slide operation.
Taking the first operation as a click operation as an example, referring to fig. 4a, the desktop launcher receives a click operation of the user on the communication application, and starts the communication application in response to the click operation. After the communication application is started, the electronic device plays the start dynamic effect. Taking the first operation as a sliding operation as an example, as shown in fig. 5, the desktop launcher receives a sliding operation of the user on the communication application interface, and exits the communication application in response to the sliding operation. When the communication application exits, the electronic device plays the exit dynamic effect.
S202, the desktop starter sends a first dynamic effect notification message to the dynamic effect identification module.
The first dynamic effect notification message is used for notifying the dynamic effect identification module that the target dynamic effect is generated (or called the start of the target dynamic effect).
In some embodiments, the first action notification message further includes a first action type. The first dynamic effect type is used for indicating the dynamic effect type of the target dynamic effect. For example, when the first action type is "0", it indicates that the target action is the start action; when the first action type is "1", it indicates that the target action is the exit action.
Illustratively, the first dynamic effect notification message may be, for example, start "0" or start "1", where start indicates that the dynamic effect starts. On this basis, when the first dynamic effect notification message is start "0", the target dynamic effect is the start dynamic effect; when the first dynamic effect notification message is start "1", the target dynamic effect is the exit dynamic effect.
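Assuming the textual message format described above (start/end plus a quoted type flag — an illustrative encoding for this sketch, not a documented interface), the receiving module's parsing could look like:

```python
# Hypothetical mapping from the type flag to the dynamic effect type:
# "0" denotes the start dynamic effect, "1" denotes the exit dynamic effect.
EFFECT_TYPES = {"0": "start dynamic effect", "1": "exit dynamic effect"}

def parse_notification(message):
    """Parse a notification such as 'start "0"' or 'end "1"' into
    (event, dynamic effect type)."""
    event, flag = message.split()
    return event, EFFECT_TYPES[flag.strip('"')]

event, effect = parse_notification('start "0"')
```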
In some embodiments, after the desktop initiator receives the first operation of the user, the target application is initiated in response to the first operation. The dynamic effect played by the target application in the starting process is called a target dynamic effect. On this basis, for example, when the desktop initiator detects that the user's finger leaves the desktop (i.e., the user's finger leaves the screen of the electronic device), the target action (e.g., start action or exit action) starts playing. In addition, in some embodiments, when the target dynamic effect starts to play, the desktop initiator sends a first dynamic effect notification message to the dynamic effect identification module; that is, when the desktop initiator detects that the user's finger leaves the desktop, the desktop initiator sends a first action notification message to the action recognition module.
S203, the dynamic effect identification module registers a callback to the desktop starter.
In some embodiments, when the action recognition module receives a first action notification message sent by the desktop initiator, the action recognition module registers a callback with the desktop initiator to notify the desktop initiator that the target action is to begin.
S204, the desktop starter informs the activity manager that the current active effect starts.
S205, the activity manager sends a first message to the screen refresh rate switching module.
The first message is used for informing the screen refresh rate switching module that the current active effect starts. In some embodiments, the first message includes an application package name of the target application.
S206, the screen refresh rate switching module determines the preset refresh rate of the target application according to the first message and the preset refresh rate switching rule.
The preset refresh rate switching rule is stored in the screen refresh rate switching module and is used for indicating the corresponding relation between the application package name of the target application and the preset refresh rate of the target application.
For example, the application package name of the target application is com.tent.qqlove, and the preset refresh rate corresponding to the application package name of the target application is 60Hz.
It should be noted that, if the dynamic effect scenario is one in which a dynamic effect is played when an application is started, the first message includes the package name of the started application (i.e., the target application). If the dynamic effect scenario is one in which a dynamic effect is played when the first application is switched to the second application, the first message may be a package-name switching message, that is, a message indicating the switch from the application package name of the first application to the application package name of the second application. On this basis, the screen refresh rate switching module can determine the preset refresh rate of the second application according to the package-name switching message.
It should be noted that, in the related art, after the screen refresh rate switching module receives the first message sent by the activity manager, the screen refresh rate switching module determines a preset refresh rate of the target application; and then, the screen refresh rate switching module switches the current screen refresh rate to the preset refresh rate of the target application. However, in the embodiment of the present application, after the screen refresh rate switching module receives the first message sent by the activity manager, the screen refresh rate switching module determines that the current active effect starts according to the first message. And then, the screen refresh rate switching module determines the preset refresh rate of the target application according to the application packet name of the target application carried by the first message and the screen refresh rate switching rule, and stores the preset refresh rate in the screen refresh rate switching module. And when the screen refresh rate switching module receives the message of ending the dynamic effect, the screen refresh rate switching module switches the current screen refresh rate to the preset refresh rate of the target application.
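The deferred-switching behavior that distinguishes this embodiment from the related art can be sketched as follows (the class, rule table, and package name are illustrative assumptions; the rule table entry mirrors the example above):

```python
class RefreshRateSwitcher:
    """Stores the target application's preset refresh rate when the dynamic
    effect starts, and applies it only when the dynamic effect ends."""
    # Hypothetical preset refresh rate switching rule:
    # application package name -> preset refresh rate (Hz).
    SWITCH_RULES = {"com.tent.qqlove": 60}
    DEFAULT_RATE = 120

    def __init__(self, current_rate=120):
        self.current_rate = current_rate
        self._pending_rate = None

    def on_effect_start(self, package_name):
        # The related art would switch the refresh rate here; in this
        # embodiment the preset rate is only determined and stored.
        self._pending_rate = self.SWITCH_RULES.get(package_name, self.DEFAULT_RATE)

    def on_effect_end(self):
        # Switch only after the dynamic effect ends, and only if the preset
        # rate differs from the current screen refresh rate.
        if self._pending_rate is not None and self._pending_rate != self.current_rate:
            self.current_rate = self._pending_rate
        self._pending_rate = None

switcher = RefreshRateSwitcher(current_rate=120)
switcher.on_effect_start("com.tent.qqlove")
rate_during_effect = switcher.current_rate   # still 120: no switch mid-effect
switcher.on_effect_end()
rate_after_effect = switcher.current_rate    # 60: switch applied at effect end
```

Keeping the screen at its current rate until `on_effect_end` is exactly what avoids the frame interval mismatch described for fig. 11.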
S207, the dynamic effect identification module sends a second message to the dynamic effect playing component.
The second message (or first target message) is used to indicate the dynamic effect attributes of the target dynamic effect. Illustratively, the dynamic effect attributes include one or more of the dynamic effect content, the dynamic effect size (e.g., from small to large or from large to small), the dynamic effect duration, or the dynamic effect start position and end position of the target dynamic effect.
For example, when the target dynamic effect is the start dynamic effect, the dynamic effect identification module may send one or more of the dynamic effect content, dynamic effect size, dynamic effect duration, or dynamic effect start position and end position of the start dynamic effect to the dynamic effect playing component.
The dynamic effect includes a plurality of consecutive image frames. The dynamic effect start position refers to the position of the first image frame, and the dynamic effect end position refers to the position of the last image frame.
S208, the dynamic effect playing component plays the target dynamic effect according to the second message.
Taking the case where the target dynamic effect is the start dynamic effect as an example, the dynamic effect playing component plays the target dynamic effect according to the dynamic effect content of the start dynamic effect, the dynamic effect size (e.g., from small to large), the dynamic effect duration (e.g., 400 ms), and the start position and end position of the start dynamic effect.
S209, the dynamic playing component registers a callback to the desktop starter.
Illustratively, when the dynamic effect playing component finishes playing the target dynamic effect, the dynamic effect playing component registers a callback with the desktop launcher to notify the desktop launcher that the target dynamic effect has ended.
Illustratively, the target dynamic effect is that the electronic device continuously plays a series of image frames (e.g., frames 1 through 6 shown in fig. 4b) for a fixed duration. In some embodiments, the dynamic effect playing component registers a callback with the desktop launcher when it has played the last image frame of the target dynamic effect. Illustratively, when the image frame currently played by the dynamic effect playing component reaches the dynamic effect end position (i.e., the image frame being played is the last frame), the dynamic effect playing component registers a callback with the desktop launcher. In other embodiments, when the duration for which the dynamic effect playing component has played the target dynamic effect reaches the fixed duration (e.g., 400 ms), the dynamic effect playing component registers a callback with the desktop launcher.
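The two end-of-playing conditions described above (last image frame reached, or fixed play duration elapsed) can be sketched as a simple predicate (the function and parameter names are illustrative):

```python
def effect_finished(current_frame, last_frame, elapsed_ms, duration_ms):
    """The dynamic effect is considered finished when the currently played
    image frame reaches the dynamic effect end position (the last frame),
    or when the fixed play duration (e.g. 400 ms) has elapsed."""
    return current_frame >= last_frame or elapsed_ms >= duration_ms

done_by_frame = effect_finished(current_frame=6, last_frame=6,
                                elapsed_ms=350, duration_ms=400)
done_by_time = effect_finished(current_frame=5, last_frame=6,
                               elapsed_ms=400, duration_ms=400)
still_playing = effect_finished(current_frame=3, last_frame=6,
                                elapsed_ms=200, duration_ms=400)
```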
S210, the desktop starter sends a second dynamic effect notification message to the dynamic effect identification module.
The second dynamic effect notification message is used for notifying the dynamic effect recognition module that the target dynamic effect is ended.
In some embodiments, the second dynamic effect notification message further includes a second dynamic effect type. The second dynamic effect type is used to indicate the type of the target dynamic effect that has ended. For example, when the second dynamic effect type is "0", it indicates that the target dynamic effect is the start dynamic effect; when the second dynamic effect type is "1", it indicates that the target dynamic effect is the exit dynamic effect.
Illustratively, the second dynamic effect notification message may be, for example, end "0" or end "1", where end indicates that the dynamic effect ends. On this basis, when the second dynamic effect notification message is end "0", it indicates that the start dynamic effect has ended; when the second dynamic effect notification message is end "1", it indicates that the exit dynamic effect has ended.
S211, the dynamic effect identification module informs the screen refresh rate switching module of ending the target dynamic effect.
Illustratively, after the dynamic effect recognition module notifies the screen refresh rate switching module that the target dynamic effect has ended, the screen refresh rate switching module switches the current screen refresh rate to the previously stored preset refresh rate of the target application (e.g., from 120 Hz to 60 Hz). It should be noted that if the current screen refresh rate is already the same as the preset refresh rate of the target application, the screen refresh rate switching module does not switch the refresh rate.
It should be noted that the preset refresh rate of the target application in the embodiments of the present application refers to the screen refresh rate of the display screen of the electronic device, that is, the number of times per second that the display screen refreshes its picture. In other words, the electronic device displays the interface of the target application on the display screen at the preset refresh rate of the target application.
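Illustratively, the switching rule above may be sketched as follows. The class and method names, and the per-package storage of preset rates, are assumptions for illustration only:

```python
# Illustrative sketch: on dynamic effect end, switch to the stored preset
# refresh rate of the target application, but skip the switch entirely when
# the screen is already at that rate (as the text requires).
class ScreenRefreshRateSwitcher:
    def __init__(self, current_hz):
        self.current_hz = current_hz
        self.preset_hz = {}  # application package name -> preset refresh rate

    def store_preset(self, package, hz):
        self.preset_hz[package] = hz

    def on_effect_end(self, package):
        """Return True if a switch actually happened, False if skipped."""
        target = self.preset_hz[package]
        if target == self.current_hz:
            return False          # same rate: no switch
        self.current_hz = target  # e.g. 120 Hz -> 60 Hz
        return True
```

For example, with the screen at 120 Hz and a stored preset of 60 Hz, the first end notification triggers a switch to 60 Hz; a second one is a no-op.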
It should be noted that when the desktop launcher sends the first dynamic effect notification message and the second dynamic effect notification message to the dynamic effect recognition module, a delay may occur. Taking the first dynamic effect notification message as an example: the desktop launcher may send it only some time (e.g., 3 ms) after receiving the first operation of the user, i.e., the sending is delayed. Such a delay can cause the electronic device to stutter while playing the dynamic effect. For this reason, in the embodiments of the present application the desktop launcher may send the first dynamic effect notification message and the second dynamic effect notification message a certain amount of time in advance, effectively avoiding the stutter caused by the delay.
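Illustratively, the advance-send compensation above amounts to handing the message off earlier by the expected delay (e.g., the 3 ms figure mentioned in the text) so that it arrives when the event occurs. The function names and the simple timing model are assumptions:

```python
# Illustrative sketch of sending a notification message "in advance":
# scheduling the send earlier by the expected delivery delay cancels the
# delay out, so the message arrives on time.
def scheduled_send_time(event_time_ms, expected_delay_ms):
    """Send early by the expected delay so arrival coincides with the event."""
    return event_time_ms - expected_delay_ms

def arrival_time(send_time_ms, actual_delay_ms):
    """Time at which the dynamic effect recognition module receives the message."""
    return send_time_ms + actual_delay_ms
```

For example, for an event at t = 100 ms with an expected 3 ms delay, sending at t = 97 ms makes the message arrive exactly at t = 100 ms.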
S212, the screen refresh rate switching module sends a preset refresh rate of the target application to the image synthesis system.
S213, the image synthesis system synthesizes the target image according to the preset refresh rate of the target application.
Illustratively, the image synthesis system triggers the target application to draw and render image data. For example, the target application draws image data through the view system and renders the image data through the image rendering system. The target application then sends the drawn and rendered image data to the image synthesis system, and the image synthesis system synthesizes the image data to obtain the target image.
S214, the image synthesis system sends the synthesized target image to a hardware synthesizer.
S215, the hardware synthesizer sends the target image to the display driver.
Specifically, after the hardware synthesizer sends the synthesized target image to the display driver, the display driver drives the display screen to display the target image.
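Illustratively, the S213-S215 pipeline above (application draws and renders, the image synthesis system synthesizes a target image, and the hardware synthesizer hands it to the display driver) may be sketched as follows. All names and the string-based stand-ins for image data are assumptions for illustration only:

```python
# Illustrative sketch of the display pipeline: drawn/rendered layers are
# synthesized into one target image, which is then handed to the display
# driver (represented here by tagging the composed string).
def display_pipeline(layers):
    """layers: list of strings standing in for drawn/rendered image data."""
    target_image = "+".join(layers)             # image synthesis system
    sent_to_driver = f"display:{target_image}"  # hardware synthesizer -> driver
    return sent_to_driver
```

For example, synthesizing a wallpaper layer and an application layer yields one composed target image that the display driver drives the display screen to show.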
For example, in the embodiments of the present application, the target image may be an image displayed after the target application is started (such as an image of the main interface of the target application, or an image of the interface of the target application running in the background).
In summary, in the embodiments of the present application, the electronic device can identify the start and end of the target dynamic effect through the dynamic effect recognition module. After identifying the end of the target dynamic effect, the electronic device switches the refresh rate of the display screen through the screen refresh rate switching module, thereby avoiding stutter while the electronic device plays the target dynamic effect.
Fig. 14 is a schematic flow chart of a display method according to an embodiment of the present application. The display method is applied to the electronic equipment, and the electronic equipment supports a first refresh rate and a second refresh rate. The display method comprises the following steps:
S301, the electronic device displays a first interface on a display screen at the first refresh rate.
In some embodiments, as shown in fig. 4a and 6, the first interface is a desktop of the electronic device; in this case, the first refresh rate is the preset refresh rate of the desktop. In other embodiments, as shown in fig. 5, the first interface is an interface of a target application (e.g., a communication application); the first refresh rate is then the preset refresh rate of the target application. In still other embodiments, the first interface is a multitasking interface of the electronic device, and the first interface comprises the interface of the source application in the recent tasks list; the first refresh rate is then the preset refresh rate of the source application. Illustratively, as shown in fig. 7 (a), the first interface is the interface of the source application (or application 1) in the recent tasks list.
As described in connection with the above embodiments, in the embodiments of the present application, the source application may also be referred to as a first application (or referred to as application 1), and the target application may also be referred to as a second application (or referred to as application 2).
S302, the electronic equipment receives a first operation of a user.
For the first operation, reference may be made to the first operation illustrated in S201 of the above embodiments; the examples are not repeated here.
S303, in response to the first operation, the electronic device starts playing the dynamic effect.
In some embodiments, the dynamic effect (or target dynamic effect) includes N consecutive frames of pictures (or image frames), where N ≥ 1. For example, the electronic device starts playing the dynamic effect according to a dynamic effect attribute; the dynamic effect attribute includes at least one of dynamic effect content, dynamic effect size, dynamic effect duration, or a dynamic effect start position and a dynamic effect end position. The dynamic effect start position indicates the position of the first of the N frames on the display screen, and the dynamic effect end position indicates the position of the last of the N frames on the display screen.
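Illustratively, the dynamic effect attribute above may be sketched as a simple structure. The field names are assumptions; positions are (x, y) screen coordinates:

```python
# Illustrative sketch of the dynamic effect attribute described above:
# content, size, duration, and the start/end positions of the N frames.
from dataclasses import dataclass

@dataclass
class DynamicEffectAttribute:
    content: str           # what the dynamic effect shows
    size: tuple            # (width, height) of the dynamic effect
    duration_ms: int       # total playback duration, e.g. 400 ms
    start_position: tuple  # position of the first frame on the display screen
    end_position: tuple    # position of the last frame on the display screen
```

For example, a start dynamic effect might grow from the application icon's position toward a full-screen end position over 400 ms.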
S304, when the electronic device detects that the dynamic effect playback has finished, it switches to displaying a second interface on the display screen at the second refresh rate.
The dynamic effect indicates the picture displayed in the process of the electronic device switching from the first interface to the second interface.
In some embodiments, as shown in fig. 4a, the second interface is an interface of the target application (e.g., a communication application); in this case, the second refresh rate is the preset refresh rate of the target application. In other embodiments, as shown in fig. 5, the second interface is a desktop of the electronic device; the second refresh rate is then the preset refresh rate of the desktop. In other embodiments, as shown in fig. 6, the second interface is the interface of the source application (e.g., application 1) in the recent tasks list; the second refresh rate is then the preset refresh rate of the source application. In still other embodiments, as shown in fig. 7, the second interface is the interface of the target application (e.g., application 2) in the recent tasks list; the second refresh rate is then the preset refresh rate of the target application.
In some embodiments, when the electronic device plays the Mth frame of picture, if the distance between the position of the Mth frame on the display screen and the target position is smaller than a preset value, the electronic device detects that the dynamic effect playback has finished, where 1 ≤ M ≤ N. The target position indicates the position of the last of the N frames on the display screen.
It should be noted that, the preset value may be set according to actual requirements, which is not limited in this embodiment of the present application.
For example, when the electronic device plays the Mth frame, if the position of the Mth frame on the display screen is the same as the target position, the electronic device detects that the dynamic effect playback has finished. In other words, when the electronic device plays the last of the N frames, it detects that the dynamic effect playback has finished.
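Illustratively, the position-based end check above may be sketched as follows: playback is considered finished when the current frame's position is within the preset value of the target (last-frame) position. The function name and the Euclidean distance metric are assumptions:

```python
# Illustrative sketch of the position-based end-of-playback check: finished
# when the Mth frame's distance to the target position is below a preset value.
import math

def effect_finished(frame_pos, target_pos, preset_value):
    """frame_pos / target_pos are (x, y) positions on the display screen."""
    dx = frame_pos[0] - target_pos[0]
    dy = frame_pos[1] - target_pos[1]
    return math.hypot(dx, dy) < preset_value
```

When the Mth frame sits exactly at the target position, the distance is 0 and the check succeeds for any positive preset value, matching the example above.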
In other embodiments, when the duration for which the electronic device has played the dynamic effect reaches a preset duration, the electronic device detects that the dynamic effect playback has finished.
It should be noted that, the preset duration may be set according to actual requirements, which is not limited in this embodiment of the present application.
Taking a preset duration of 400 ms as an example: when the duration for which the electronic device has played the dynamic effect reaches 400 ms, the electronic device detects that the dynamic effect playback has finished.
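Illustratively, the duration-based end check above may be sketched as follows, using the 400 ms preset duration from the text. The function name and millisecond timestamps are assumptions:

```python
# Illustrative sketch of the duration-based end-of-playback check: finished
# once the elapsed playback time reaches the preset duration (e.g., 400 ms).
def effect_finished_by_duration(start_ms, now_ms, preset_duration_ms=400):
    return (now_ms - start_ms) >= preset_duration_ms
```

For example, a dynamic effect that started at t = 0 ms is detected as finished at t = 400 ms but not at t = 399 ms.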
In the embodiments of the present application, the electronic device displays a first interface on the display screen at a first refresh rate; when the electronic device receives a first operation of the user, it starts playing the dynamic effect in response to the first operation; and when the electronic device detects that the dynamic effect playback has finished, it switches to displaying a second interface on the display screen at a second refresh rate. Because the dynamic effect indicates the picture displayed while the electronic device switches from the first interface to the second interface, the electronic device switches from the first refresh rate to the second refresh rate and displays the second interface only after detecting that the dynamic effect playback has finished, thereby avoiding the stutter caused by switching the refresh rate while the dynamic effect is playing.
The embodiment of the application provides electronic equipment, which can comprise: a display screen (e.g., a touch screen), a memory, and one or more processors. The display, memory, and processor are coupled. The memory is for storing computer program code, the computer program code comprising computer instructions. When the processor executes the computer instructions, the electronic device may perform the functions or steps performed by the mobile phone in the above-described method embodiments. The structure of the electronic device may refer to the structure of the electronic device 100 shown in fig. 8.
Embodiments of the present application also provide a chip system, as shown in fig. 15, the chip system 1800 includes at least one processor 1801 and at least one interface circuit 1802. The processor 1801 may be the processor 110 shown in fig. 8 in the above embodiment. Interface circuit 1802 may be, for example, an interface circuit between processor 110 and external memory 120; or as an interface circuit between the processor 110 and the internal memory 121.
The processor 1801 and the interface circuit 1802 described above may be interconnected by wires. For example, the interface circuit 1802 may be used to receive signals from other devices (e.g., a memory of the electronic device). For another example, the interface circuit 1802 may be used to send signals to other devices (e.g., the processor 1801). The interface circuit 1802 may, for example, read instructions stored in a memory and send the instructions to the processor 1801. The instructions, when executed by the processor 1801, may cause the electronic device to perform the various steps in the embodiments described above. Of course, the chip system may also include other discrete devices; this is not specifically limited in the embodiments of the present application.
The embodiment of the application also provides a computer readable storage medium, which comprises computer instructions, when the computer instructions run on the electronic device, the electronic device is caused to execute the functions or steps executed by the mobile phone in the embodiment of the method.
The present application also provides a computer program product, which when run on a computer, causes the computer to perform the functions or steps performed by the mobile phone in the above-mentioned method embodiments.
It will be apparent to those skilled in the art from the foregoing description that, for convenience and brevity, only the division into the functional modules described above is illustrated. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely a specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

1. A display method, characterized by being applied to an electronic device, the electronic device supporting a first refresh rate and a second refresh rate; the method comprises the following steps:
the electronic equipment displays a first interface on a display screen at the first refresh rate;
the electronic equipment receives a first operation of a user;
the electronic equipment responds to the first operation and starts playing the dynamic effect;
the electronic equipment detects that the dynamic effect playing is finished, and switches to display a second interface on the display screen at the second refresh rate;
the mobile effect is used for indicating a picture displayed in the process of switching the electronic equipment from the first interface to the second interface.
2. The method of claim 1, wherein the dynamic effect comprises a succession of N frames, wherein N is greater than or equal to 1; the electronic device starts playing the dynamic effect, including:
the electronic equipment starts playing the dynamic effect according to the dynamic effect attribute; the dynamic effect attribute comprises at least one of dynamic effect content, dynamic effect size, dynamic effect duration, or a dynamic effect starting position and a dynamic effect ending position; the dynamic effect starting position is used for indicating the position of the first frame picture in the N frame pictures on the display screen, and the dynamic effect ending position is used for indicating the position of the last frame picture in the N frame pictures on the display screen.
3. The method according to claim 1 or 2, wherein the dynamic effect comprises a succession of N frames of pictures, wherein N is equal to or greater than 1; the electronic device detecting that the dynamic effect playing is finished comprises:
when the electronic equipment plays an Mth frame of picture, if the distance between the position of the Mth frame of picture on the display screen and the target position is smaller than a preset value, the electronic equipment detects that the dynamic effect playing is finished; m is more than or equal to 1 and less than or equal to N;
the target position is used for indicating the position of the last frame of the N frames of frames on the display screen.
4. The method according to claim 1 or 2, wherein the electronic device detecting that the dynamic effect playing is finished comprises:
when the duration for which the electronic equipment plays the dynamic effect meets a preset duration, the electronic equipment detects that the dynamic effect playing is finished.
5. The method of any of claims 1-4, wherein the electronic device comprises a target application, and the second interface is an interface of the target application; after the electronic device starts playing the dynamic effect, the method further comprises the following steps:
the electronic equipment acquires first information; the first information comprises an application package name of the target application;
the electronic equipment determines the second refresh rate according to the application package name of the target application and a preset refresh rate switching rule; the preset refresh rate switching rule is used for indicating the mapping relation between the application package name and the refresh rate of the display screen.
6. The method of any of claims 1-5, wherein the electronic device comprises a target application;
the first interface is a desktop of the electronic device, and the second interface is an interface after the target application is started;
the dynamic effect comprises a picture displayed in the process of starting the target application by the electronic equipment.
7. The method of claim 6, wherein,
when the target application is not running in the background of the electronic equipment, the second interface is a main interface of the target application; or,
when the target application runs in the background of the electronic equipment, the second interface is the interface of the target application running in the background.
8. The method according to claim 6 or 7, wherein the dynamic effect comprises consecutive N frames of pictures, N being equal to or greater than 1; in the N frames of pictures, the size of each frame of picture is different;
and in the process of starting the target application, the electronic equipment sequentially displays the N frames of pictures, wherein the sizes of the first frame through the Nth frame increase sequentially.
9. The method of any of claims 1-5, wherein the electronic device comprises a target application;
the first interface is an interface of the target application, and the second interface is a desktop of the electronic device;
the dynamic effect comprises a picture displayed in the process that the electronic equipment exits the target application.
10. The method of claim 9, wherein the dynamic effect comprises consecutive N frames of pictures, N being greater than or equal to 1; in the N frames of pictures, the size of each frame of picture is different;
and in the process of exiting the target application, the electronic equipment sequentially displays the N frames of pictures, wherein the sizes of the first frame through the Nth frame decrease sequentially.
11. The method of any of claims 1-5, wherein the electronic device comprises a source application and a target application;
the first interface and the second interface are multitasking interfaces of the electronic equipment; the first interface comprises the interface of the source application in the recent tasks list, and the second interface comprises the interface of the target application in the recent tasks list.
12. The method of claim 2, wherein the electronic device comprises a desktop launcher, a dynamic effect recognition module, and a dynamic effect play component; the electronic equipment responds to a first operation, starts playing the dynamic effect, and comprises the following steps:
the desktop launcher, in response to the first operation, sends a first dynamic effect notification message to the dynamic effect identification module; the first dynamic effect notification message is used for notifying the dynamic effect identification module that the dynamic effect starts;
the dynamic effect identification module sends a first target message to the dynamic effect playing component according to the first dynamic effect notification message; the first target message is used for indicating the dynamic effect attribute;
the dynamic effect playing component starts playing the dynamic effect according to the first target message.
13. The method of claim 3, wherein the electronic device comprises a desktop launcher, a dynamic effect recognition module, and a dynamic effect play component; the electronic device detecting that the playing of the dynamic effect is finished comprises:
when the dynamic effect playing component plays the Mth frame of picture, if the distance between the position of the Mth frame of picture on the display screen and the target position is smaller than a preset value, the dynamic effect playing component notifies the desktop launcher that the dynamic effect playing is finished;
the desktop launcher sends a second dynamic effect notification message to the dynamic effect identification module;
and the dynamic effect identification module detects that the dynamic effect playing is finished according to the second dynamic effect notification message.
14. The method of claim 4, wherein the electronic device comprises a desktop launcher, a dynamic effect recognition module, and a dynamic effect play component; the electronic device detecting that the playing of the dynamic effect is finished comprises:
when the duration for which the dynamic effect playing component plays the dynamic effect meets the preset duration, the dynamic effect playing component notifies the desktop launcher that the dynamic effect playing is finished;
the desktop launcher sends a second dynamic effect notification message to the dynamic effect identification module;
and the dynamic effect identification module detects that the dynamic effect playing is finished according to the second dynamic effect notification message.
15. An electronic device, the electronic device comprising: a display screen, a memory, and one or more processors; the display screen, the memory and the processor are coupled;
the memory is used for storing computer program codes, and the computer program codes comprise computer instructions; the computer instructions, when executed by the processor, cause the electronic device to perform the method of any of claims 1-14.
16. A computer-readable storage medium comprising computer instructions; the computer instructions, when run on the electronic device, cause the electronic device to perform the method of any of claims 1-14.
CN202310382648.9A 2022-01-10 2022-01-10 Display method, electronic equipment and storage medium Pending CN116501210A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310382648.9A CN116501210A (en) 2022-01-10 2022-01-10 Display method, electronic equipment and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210023829.8A CN114518817B (en) 2022-01-10 2022-01-10 Display method, electronic device and storage medium
CN202310382648.9A CN116501210A (en) 2022-01-10 2022-01-10 Display method, electronic equipment and storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202210023829.8A Division CN114518817B (en) 2022-01-10 2022-01-10 Display method, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN116501210A true CN116501210A (en) 2023-07-28

Family

ID=81597576

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310382648.9A Pending CN116501210A (en) 2022-01-10 2022-01-10 Display method, electronic equipment and storage medium
CN202210023829.8A Active CN114518817B (en) 2022-01-10 2022-01-10 Display method, electronic device and storage medium

Country Status (1)

Country Link
CN (2) CN116501210A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116991274A (en) * 2023-09-28 2023-11-03 荣耀终端有限公司 Upper sliding effect exception handling method and electronic equipment

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116052618B (en) * 2022-08-24 2023-11-07 荣耀终端有限公司 Screen refresh rate switching method and electronic equipment
CN117742835A (en) * 2022-09-14 2024-03-22 荣耀终端有限公司 Method for requesting vSync signal and electronic device
CN116684677A (en) * 2022-09-20 2023-09-01 荣耀终端有限公司 Electronic equipment dynamic effect playing method, electronic equipment and storage medium
CN116701307A (en) * 2022-12-20 2023-09-05 荣耀终端有限公司 Interface display method of reading application and terminal equipment
CN117130698A (en) * 2023-03-29 2023-11-28 荣耀终端有限公司 Menu display method and electronic equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112256219A (en) * 2020-10-13 2021-01-22 北京小米移动软件有限公司 Display method and device, terminal and storage medium
US20210065658A1 (en) * 2019-08-28 2021-03-04 Beijing Xiaomi Mobile Software Co., Ltd. Method for controlling frame refresh rate of screen, apparatus and storage medium
CN112667340A (en) * 2020-12-31 2021-04-16 努比亚技术有限公司 Screen refresh control method, mobile terminal and computer readable storage medium
CN112689168A (en) * 2020-12-09 2021-04-20 四川金熊猫新媒体有限公司 Dynamic effect processing method, dynamic effect display method and dynamic effect processing device
US20210201732A1 (en) * 2019-12-31 2021-07-01 Micron Technology, Inc. Intelligent adjustment of screen refresh rate
CN113362783A (en) * 2020-03-06 2021-09-07 华为技术有限公司 Refresh rate switching method and electronic equipment
CN113438552A (en) * 2021-05-19 2021-09-24 荣耀终端有限公司 Refresh rate adjusting method and electronic equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7499043B2 (en) * 2006-05-30 2009-03-03 Intel Corporation Switching of display refresh rates
CN106791212B (en) * 2017-03-10 2019-07-02 Oppo广东移动通信有限公司 A kind of control method, device and the mobile terminal of mobile terminal refresh rate
US10964262B1 (en) * 2018-08-30 2021-03-30 Apple Inc. Systems and methods for reducing visual artifacts in displays due to refresh rate
CN110377251A (en) * 2019-06-06 2019-10-25 努比亚技术有限公司 A kind of screen refresh rate method of adjustment, terminal and computer readable storage medium


Also Published As

Publication number Publication date
CN114518817B (en) 2023-04-07
CN114518817A (en) 2022-05-20


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination