CN114661263B - Display method, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114661263B
Authority
CN
China
Prior art keywords
layer
display
refresh rate
electronic device
image
Prior art date
Legal status
Active
Application number
CN202210181930.6A
Other languages
Chinese (zh)
Other versions
CN114661263A (en
Inventor
李时进
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202210181930.6A
Publication of CN114661263A
Application granted
Publication of CN114661263B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/147: Digital output to display device; cooperation and interconnection of the display device with other functional units using display panels
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Abstract

The application provides a display method, an electronic device, and a storage medium, relating to the field of terminal technologies and directed at solving the problem of stuttering when the electronic device displays a dynamically changed interface. The method includes the following steps: the electronic device controls the display screen to display a first picture at a first refresh rate, where the first refresh rate is less than a first threshold and the first picture corresponds to at least one display layer; when the electronic device detects that a display layer of the first picture has been updated, the electronic device controls the display screen to display a second picture at a second refresh rate, where the second picture includes the updated display layer and the second refresh rate is greater than the first refresh rate.

Description

Display method, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to a display method, an electronic device, and a storage medium.
Background
A low-temperature polycrystalline oxide (LTPO) display screen is an organic light-emitting diode (OLED) display screen with an additional oxide layer on its substrate; the oxide layer reduces the energy required to drive the pixels, which lowers the power consumption of the electronic device during display.
An LTPO display screen can support multiple refresh rates (e.g., from 120 hertz (Hz) down to 1 Hz), and its refresh rate can be adjusted dynamically while the electronic device runs applications. Different applications may use different refresh rates. For example, when the electronic device runs application 1, the refresh rate of its LTPO display screen is refresh rate A (e.g., 60 Hz); when it runs application 2, the refresh rate is refresh rate B (e.g., 120 Hz). In the related art, when the interface of a running application changes dynamically, the electronic device may stutter when displaying the dynamically changed interface.
Disclosure of Invention
The embodiments of the present application provide a display method, an electronic device, and a storage medium, which are used to solve the problem of stuttering when the electronic device displays a dynamically changed interface.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
in a first aspect, a display method is provided. The method is applied to an electronic device that supports a first refresh rate and a second refresh rate, and includes the following steps: the electronic device controls the display screen to display a first picture at the first refresh rate, where the first refresh rate is less than a first threshold and the first picture corresponds to at least one display layer; when the electronic device detects that a display layer of the first picture has been updated, the electronic device controls the display screen to display a second picture at the second refresh rate, where the second picture includes the updated display layer and the second refresh rate is greater than the first refresh rate.
Based on the first aspect, the first refresh rate is less than the first threshold, i.e., a low refresh rate. When the electronic device, while controlling the display screen to display the first picture at the first refresh rate, detects that a display layer of the first picture has been updated (that is, a display layer has changed), it controls the display screen to display the second picture, which includes the updated display layer, at the second refresh rate. Because the second refresh rate is greater than the first refresh rate, the electronic device raises the refresh rate of the display screen when displaying the updated display layer, thereby solving the problem of stuttering when the updated display layer is displayed.
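The core logic of the first aspect can be sketched as follows. This is a minimal Python illustration, not the patent's implementation; all names (`DisplayController`, `on_layer_update`) and the 10 Hz / 60 Hz / 120 Hz values are hypothetical.

```python
# Hypothetical sketch of the first-aspect method: while idling at a low
# "first refresh rate" (below the first threshold), a detected display
# layer update raises the screen to the higher "second refresh rate".

FIRST_THRESHOLD_HZ = 60  # assumed value of the "first threshold"

class DisplayController:
    def __init__(self, first_rate_hz=10, second_rate_hz=120):
        self.refresh_rate_hz = first_rate_hz  # first refresh rate (< threshold)
        self.second_rate_hz = second_rate_hz  # second refresh rate

    def on_layer_update(self) -> int:
        """Called when a display layer of the first picture is updated."""
        if self.refresh_rate_hz < FIRST_THRESHOLD_HZ:
            # Raise the refresh rate so the second picture (with the
            # updated layer) is displayed without stuttering.
            self.refresh_rate_hz = self.second_rate_hz
        return self.refresh_rate_hz
```

Note that the rate is raised only when the current rate is below the threshold; a screen already running at a high rate is left untouched.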
In a possible implementation manner of the first aspect, the electronic device detecting the display layer update of the first picture includes: the electronic device detects a first instruction, where the first instruction is used to trigger the electronic device to display a first layer, and the first layer is different from the at least one display layer; and the electronic device, in response to the first instruction, detects the display layer update of the first picture.
In this implementation manner, the first instruction triggers the electronic device to display the first layer, which is different from the at least one display layer; that is, the first layer is a newly added display layer. Therefore, by detecting the display layer update of the first picture (i.e., the newly added display layer) only in response to the first instruction, the power consumption of the device can be reduced.
In a possible implementation manner of the first aspect, the electronic device controlling the display screen to display the second picture at the second refresh rate includes: the electronic device creates the first layer according to attribute information of the first layer, and controls the display screen to display the second picture at the second refresh rate; the attribute information of the first layer includes at least one of a window size, a window position, or a window name of the first layer.
In this implementation manner, the electronic device creates the first layer according to its attribute information and, after the first layer is created, controls the display screen to display the second picture at the second refresh rate, which further mitigates stuttering when the electronic device displays the first layer.
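The layer-creation step above can be sketched like this. The `Display`, `LayerAttributes`, and `create_first_layer` names are hypothetical, as is the representation of a layer as a plain dictionary; the sketch only shows the ordering the text describes: build the layer from its attribute info, then raise the refresh rate.

```python
from dataclasses import dataclass, field

@dataclass
class Display:
    refresh_rate_hz: int = 10         # first refresh rate
    second_rate_hz: int = 120         # second refresh rate
    layers: list = field(default_factory=list)

@dataclass
class LayerAttributes:
    window_name: str
    window_size: tuple       # (width, height)
    window_position: tuple   # (x, y)

def create_first_layer(display, attrs):
    """Create the first layer from its attribute info (window size,
    position, name), then raise the refresh rate so the second picture
    is displayed smoothly."""
    layer = {"name": attrs.window_name,
             "size": attrs.window_size,
             "position": attrs.window_position}
    display.layers.append(layer)
    display.refresh_rate_hz = display.second_rate_hz
    return layer
```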
In a possible implementation manner of the first aspect, the electronic device detecting the display layer update of the first picture includes: the electronic device obtains image change information, where the image change information is used to trigger the electronic device to display a second layer; the second layer is hidden under a third layer, and the second layer and the third layer are two of the at least one display layer; and the electronic device detects the display layer update of the first picture according to the image change information.
In this implementation manner, the image change information triggers the electronic device to display the second layer, which is hidden under the third layer; both are display layers among the at least one display layer. In other words, the image change information can cause the electronic device to switch from displaying the third layer to displaying the second layer, i.e., the display layer changes. The electronic device can therefore detect the display layer update of the first picture from the image change information, which helps reduce the power consumption of the device.
In a possible implementation manner of the first aspect, the electronic device controlling the display screen to display the second picture at the second refresh rate includes: when the electronic device determines that the second layer needs to be displayed, the electronic device controls the display screen to display the second picture at the second refresh rate.
In this implementation manner, the electronic device raises the refresh rate only when it determines that the second layer needs to be displayed, which reduces the power consumption of the device while still mitigating stuttering.
In a possible implementation manner of the first aspect, the electronic device determining that the second layer needs to be displayed includes: the electronic device compares the attribute information of the second layer with the attribute information of the third layer to determine that the second layer needs to be displayed; the attribute information includes at least one of a layer size, a layer position, or a layer transparency.
In this implementation manner, the electronic device compares the attribute information of the second layer (i.e., the new layer) with that of the third layer (i.e., the old layer) to determine whether the second layer needs to be displayed, which reduces the risk of erroneously deciding to display the second layer.
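The attribute comparison above can be sketched as follows. The function name and the dictionary keys (`size`, `position`, `transparency`) are hypothetical stand-ins for the layer size, layer position, and layer transparency named in the text.

```python
def second_layer_needs_display(second_layer, third_layer):
    """Compare attribute info (layer size, position, transparency) of the
    hidden second layer against the visible third layer; any difference
    indicates the second layer needs to be displayed."""
    keys = ("size", "position", "transparency")
    return any(second_layer.get(k) != third_layer.get(k) for k in keys)
```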
In a possible implementation manner of the first aspect, the electronic device includes: an image synthesizing system; the electronic device creates a first layer according to the attribute information of the first layer, including: the image composition system creates a first layer based on attribute information of the first layer.
In a possible implementation manner of the first aspect, the electronic device includes: an image cache queue; the electronic device obtains image change information, including: the image cache queue acquires image change information; the electronic device detecting display layer update of the first picture according to the image change information, including: and the image cache queue detects the update of the display layer of the first picture according to the image change information.
In a possible implementation manner of the first aspect, the electronic device includes an image composition system; the electronic device determining that the second layer needs to be displayed includes: and the image synthesis system compares the attribute information of the second layer with the attribute information of the third layer to determine that the second layer needs to be displayed.
In a possible implementation manner of the first aspect, the first layer is used for displaying a volume adjustment control; or, the first layer is used for displaying screenshot; or, the first layer is used for displaying a popup window; or, the first layer is used for displaying a message unread prompt; alternatively, the first layer is used to display the application red envelope.
In this implementation, the first layer is used to display a volume adjustment control, a screenshot, a pop-up window, a message unread prompt, or an application red envelope; that is, the first layer displays newly added content in scenarios where the user is not touching the display screen. Therefore, when the electronic device detects the display layer update of the first picture while the user is not touching the display screen, it raises the refresh rate of the display screen, which further mitigates stuttering and improves the user experience.
In a possible implementation manner of the first aspect, the first screen and the second screen include a target window; and the target window circularly displays the third layer and the second layer in a carousel mode.
In this implementation manner, the target window cyclically displays the third layer and the second layer in a carousel mode, that is, the picture shown in the target window changes at regular intervals. In this scenario, the electronic device raises the refresh rate of the display screen while the target window cycles through different display layers, which further improves the user experience.
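The carousel behavior can be sketched with standard-library iterators; the function name and the string labels for the layers are hypothetical.

```python
from itertools import cycle, islice

def carousel_frames(layers, n_frames):
    """Return the sequence of layers the target window shows when it
    cycles through its display layers in carousel mode."""
    return list(islice(cycle(layers), n_frames))
```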
In a possible implementation manner of the first aspect, the electronic device controlling the display screen to display the first picture at the first refresh rate includes: the electronic device controls the display screen to display a third picture at a third refresh rate; and when the electronic device determines that the third picture has not changed within a preset time period, it controls the display screen to display the first picture at the first refresh rate.
In this implementation manner, when the electronic device determines that the third picture has not changed within the preset time period, it controls the display screen to display the first picture at the first refresh rate. Because the first refresh rate is less than the first threshold, i.e., a low refresh rate, the power consumption of the device is reduced.
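The idle fallback can be sketched as below. The class name, the 2-second preset duration, and the rates are hypothetical; time is passed in explicitly to keep the sketch deterministic. Raising the rate back up is handled by the layer-update path described earlier, so this sketch only lowers it.

```python
class IdleRefreshGovernor:
    """Drop from the 'third refresh rate' to the low 'first refresh rate'
    once the displayed picture stays unchanged for a preset duration."""

    def __init__(self, preset_seconds=2.0, first_rate_hz=10, third_rate_hz=60):
        self.preset_seconds = preset_seconds
        self.first_rate_hz = first_rate_hz
        self.refresh_rate_hz = third_rate_hz   # start at the third refresh rate
        self._last_change_at = 0.0

    def on_frame(self, changed, now):
        if changed:
            self._last_change_at = now
        elif now - self._last_change_at >= self.preset_seconds:
            # Third picture unchanged for the preset time period:
            # fall back to the low first refresh rate to save power.
            self.refresh_rate_hz = self.first_rate_hz
        return self.refresh_rate_hz
```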
In a possible implementation manner of the first aspect, the electronic device detecting the first instruction includes: the electronic device detects the first instruction in response to a user operation on a target control, where the target control includes a volume key or a screen locking key; or the electronic device detects the first instruction after receiving a message of a first application.
In this implementation manner, the electronic device detects the first instruction in response to a user operation on the target control, or after receiving a message of the first application. Because the target control includes the volume key or the screen locking key, the first instruction is an instruction detected without the user touching the display screen. The electronic device thus detects the display layer change from the first instruction and raises the refresh rate of the display screen, improving the user experience.
In a second aspect, an electronic device is provided, which has the functions of implementing the first aspect. These functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In a third aspect, an electronic device is provided that supports a first refresh rate and a second refresh rate; the electronic device includes a display screen, a memory, and one or more processors; the display screen, the memory and the processor are coupled; the memory is used for storing computer program codes; the computer program code includes computer instructions; the computer instructions, when executed by the processor, cause the electronic device to perform the steps of: the electronic equipment controls the display screen to display a first picture at a first refresh rate; the first refresh rate is less than a first threshold; the first picture corresponds to at least one display layer; when the electronic equipment detects that the display layer of the first picture is updated, the electronic equipment controls the display screen to display a second picture at a second refresh rate; the second picture includes an updated display layer, and the second refresh rate is greater than the first refresh rate.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is caused to specifically perform the following steps: the electronic equipment detects a first instruction; the first instruction is used for triggering the electronic equipment to display a first layer, and the first layer is different from at least one display layer; and the electronic equipment responds to the first instruction and detects the display layer update of the first picture.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is caused to specifically perform the following steps: the electronic equipment creates a first layer according to the attribute information of the first layer, and controls the display screen to display a second picture at a second refresh rate; the attribute information of the first layer includes at least one of a window size, a window position, or a window name of the first layer.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is caused to specifically perform the following steps: the electronic equipment acquires image change information; the image change information is used for triggering the electronic equipment to display a second image layer; the second layer is hidden under the third layer; the second layer and the third layer are two display layers in the at least one display layer; and the electronic equipment detects the display layer update of the first picture according to the image change information.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is caused to specifically perform the following steps: when the electronic equipment determines that the second image layer needs to be displayed, the electronic equipment controls the display screen to display a second picture at a second refresh rate.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is caused to specifically perform the following steps: the electronic equipment compares the attribute information of the second layer with the attribute information of the third layer to determine that the second layer needs to be displayed; the attribute information includes at least one of a layer size, a layer position, or a layer transparency.
In a possible implementation manner of the third aspect, the electronic device includes an image composition system; when the processor executes the computer instructions, the electronic device is caused to specifically perform the steps of: the image composition system creates a first layer based on attribute information of the first layer.
In a possible implementation manner of the third aspect, the electronic device includes an image buffer queue; when the processor executes the computer instructions, the electronic device is caused to specifically perform the steps of: the image cache queue acquires image change information; when the processor executes the computer instructions, the electronic device is caused to specifically perform the steps of: and the image cache queue detects the update of the display layer of the first picture according to the image change information.
In a possible implementation manner of the third aspect, the electronic device includes an image composition system; when the processor executes the computer instructions, the electronic device is caused to specifically perform the steps of: and the image synthesis system compares the attribute information of the second layer with the attribute information of the third layer to determine that the second layer needs to be displayed.
In a possible implementation manner of the third aspect, the first layer is used for displaying a volume adjustment control; or, the first layer is used for displaying screenshot; or, the first layer is used for displaying a popup window; or, the first layer is used for displaying a message unread prompt; alternatively, the first layer is used to display the application red envelope.
In a possible implementation manner of the third aspect, the first screen and the second screen include a target window; and the target window circularly displays the third layer and the second layer in a carousel mode.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is caused to specifically perform the following steps: the electronic equipment controls the display screen to display a third picture at a third refresh rate; the electronic equipment determines that the third picture is unchanged within the preset time period, and controls the display screen to display the first picture at the first refresh rate.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is caused to specifically perform the following steps: the electronic equipment responds to the operation of a user on a target control, and detects a first instruction; the target control comprises a volume key or a screen locking key; or after receiving the message of the first application, the electronic device detects the first instruction.
In a fourth aspect, there is provided a computer readable storage medium having stored therein computer instructions which, when run on a computer, cause the computer to perform the display method of any one of the first aspects above.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the display method of any of the first aspects above.
For the technical effects of any design manner of the second aspect to the fifth aspect, reference may be made to the technical effects of the corresponding design manners of the first aspect, which are not repeated here.
Drawings
Fig. 1a is a first schematic diagram of an interface for refresh rate switching of a display screen according to an embodiment of the present application;
Fig. 1b is a second schematic diagram of an interface for refresh rate switching of a display screen according to an embodiment of the present application;
Fig. 1c is a schematic diagram of display layers corresponding to a picture shown on a display screen according to an embodiment of the present application;
Fig. 2 is a first schematic diagram of an application scenario provided in an embodiment of the present application;
Fig. 3 is a second schematic diagram of an application scenario provided in an embodiment of the present application;
Fig. 4 is a third schematic diagram of an application scenario provided in an embodiment of the present application;
Fig. 5 is a fourth schematic diagram of an application scenario provided in an embodiment of the present application;
Fig. 6 is a fifth schematic diagram of an application scenario provided in an embodiment of the present application;
Fig. 7 is an interface schematic diagram of a home page of a video application according to an embodiment of the present application;
Fig. 8 is a sixth schematic diagram of an application scenario provided in an embodiment of the present application;
Fig. 9 is a seventh schematic diagram of an application scenario provided in an embodiment of the present application;
Fig. 10 is an eighth schematic diagram of an application scenario provided in an embodiment of the present application;
Fig. 11 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
Fig. 12 is a schematic diagram of a software framework of an electronic device according to an embodiment of the present application;
Fig. 13 is a schematic flow chart of an interface display process of an electronic device according to an embodiment of the present application;
Fig. 14 is a first schematic flow chart of a display method according to an embodiment of the present application;
Fig. 15 is a second schematic flow chart of a display method according to an embodiment of the present application;
Fig. 16 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings. In the description of the present application, unless otherwise specified, "/" indicates an "or" relationship between the associated objects; for example, A/B may mean A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. Unless otherwise indicated, "a plurality of" means two or more. "At least one of" the following items means any combination of these items, including any combination of a single item or plural items; for example, at least one of a, b, or c may represent: a, b, c, a and b, a and c, b and c, or a, b, and c, where a, b, and c may be singular or plural. In addition, to clearly describe the technical solutions of the embodiments, the words "first", "second", and the like are used to distinguish between identical or similar items having substantially the same function and effect. Those skilled in the art will appreciate that these words do not limit quantity or order of execution, and do not necessarily denote a difference. Meanwhile, words such as "exemplary" or "such as" are used to serve as examples, illustrations, or descriptions; any embodiment or design described as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of such words is intended to present related concepts in a concrete fashion that is readily understood.
For ease of understanding, related terms and concepts related to the embodiments of the present application are described below.
(1) Refresh rate of display screen
The refresh rate of a display screen refers to the number of times the display screen is refreshed per second, in Hz.
(2) Output frame rate
The output frame rate is the number of frames of image data per second that an application outputs to the display screen, measured in frames per second (FPS). A frame is the smallest unit of an image animation: a single frame is a still picture, and consecutive frames form the animation. In some embodiments, the output frame rate of an application differs depending on the service scenario in which it runs. For example, when the service scenario is static (such as displaying a static interface), the output frame rate of the application may be 30 FPS; when the service scenario is dynamic (such as playing a video), the output frame rate may be 60 FPS.
(3) Display frame rate
The display frame rate refers to the frame rate at which an application's frames are actually shown on the display screen, also measured in FPS.
Wherein the display frame rate is determined by the refresh rate of the display screen and the output frame rate.
When the output frame rate is less than the refresh rate of the display screen, the display frame rate is approximately equal to the output frame rate. For example, suppose the output frame rate is 30 FPS and the refresh rate of the display screen is 60 Hz. To keep the displayed picture continuous, some of the 60 frames shown per second are repeated frames. Repeated frames are not counted when computing the display frame rate, so the effective picture actually displayed is still 30 frames per second, i.e., the display frame rate is about 30 FPS.
When the output frame rate equals the refresh rate of the display screen, the display frame rate is approximately equal to the output frame rate. For example, if the output frame rate is 90 FPS and the refresh rate is 90 Hz, the application outputs exactly 90 frames per second and the display screen refreshes 90 times per second, so the display frame rate is about 90 FPS.
When the output frame rate is greater than the refresh rate of the display screen, the display frame rate is approximately equal to the refresh rate. For example, if the output frame rate is 120 FPS and the refresh rate is 90 Hz, the application outputs 120 frames per second but the display screen can refresh at most 90 times per second and cannot show all 120 frames. The electronic device will discard or merge some of the 120 frames output per second and finally display 90 frames per second, i.e., a display frame rate of about 90 FPS.
It should be noted that the display frame rate is approximately equal to, rather than exactly equal to, the output frame rate or the refresh rate of the display screen, because in practical applications the timing of the display screen refresh and the timing of the application's frame output cannot be guaranteed to be perfectly aligned. As a result, not every output frame arrives exactly when the screen refreshes, frames may be dropped, and the display frame rate may fluctuate slightly. For example, even if the output frame rate is 90FPS and the refresh rate of the display screen is 90Hz, the final actual display frame rate may continuously fluctuate between 80FPS and 90FPS.
In electronic devices, the output frame rate of most applications varies with the refresh rate of the display screen set in the system. For example, when the refresh rate of the display screen is set to 60Hz, the output frame rate of such an application is adjusted to 60FPS; when the refresh rate of the display screen is set to 90Hz, the output frame rate of such an application is adjusted to 90FPS. However, for some applications the output frame rate is relatively independent, such as gaming applications. These applications have their own timer and output frame rate, which are not controlled by the refresh rate of the display screen set in the system. For example, suppose the output frame rate of one such application is set to 120FPS. If the refresh rate of the display screen is set to 60Hz, the output frame rate of this application does not follow the system refresh rate down to 60FPS, but continues to maintain 120FPS. If the refresh rate of the display screen is set to 144Hz, the output frame rate of this application likewise remains 120FPS. In general, the higher the output frame rate, the higher the display frame rate. However, if the refresh rate of the display screen is insufficient, the final display frame rate can at most reach the refresh rate of the display screen, no matter how high the output frame rate is. For example, if the refresh rate of a display screen is 60Hz, the final display frame rate can only reach 60FPS at most even if the output frame rate of the application is 100FPS.
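The relationship above, where repeated frames are not counted and excess frames are dropped or merged, means the display frame rate is roughly the smaller of the output frame rate and the refresh rate. A minimal sketch (illustrative only; the function name is ours, and real panels fluctuate slightly, as noted above):

```python
def display_frame_rate(output_fps: float, refresh_hz: float) -> float:
    """Approximate effective display frame rate (ignores the small
    fluctuation caused by imperfect alignment of refresh and output)."""
    # Below the refresh rate: gaps are padded with repeated frames, which
    # are not counted. Above it: excess frames are dropped or merged.
    # Either way the smaller of the two values wins.
    return min(output_fps, refresh_hz)

# Examples from the text:
print(display_frame_rate(30, 60))    # ~30 FPS: repeated frames not counted
print(display_frame_rate(90, 90))    # ~90 FPS: rates match exactly
print(display_frame_rate(120, 90))   # ~90 FPS: excess frames dropped/merged
print(display_frame_rate(100, 60))   # ~60 FPS: refresh rate is the ceiling
```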
It should be noted that the electronic device includes a display screen, which is an important component of the electronic device. In some embodiments, a display screen includes a substrate and a display unit disposed on the substrate, the display unit including a thin film transistor (thin film transistor, TFT). In the embodiment of the present application, the display screen included in the electronic device is an LTPO display screen, i.e., the TFTs in its display unit are LTPO TFTs. LTPO is a hybrid of low temperature poly-silicon (low temperature poly-silicon, LTPS) and indium gallium zinc oxide (indium gallium zinc oxide, IGZO). In other words, an LTPO display screen replaces part of the TFTs in the display unit with IGZO TFTs. The low leakage current of IGZO TFTs allows the LTPO display screen to reach a lower refresh rate, reducing power consumption when the electronic device is displaying, while avoiding the abnormal display at low refresh rates caused by LTPS leakage. In addition, IGZO TFTs have better uniformity, which can reduce problems such as uneven display (mura) and color cast when the LTPO display screen is displayed at low brightness.
LTPO displays can support adaptive refresh rates, i.e., LTPO displays can support multiple refresh rates (e.g., from 120Hz to 1 Hz); the refresh rate of the LTPO display can be dynamically adjusted as the electronic device runs the application. When the electronic device runs different applications, the refresh rate of the LTPO display of the electronic device is different. For example, when the electronic device runs application 1, the refresh rate of the LTPO display of the electronic device is refresh rate a (e.g., 60 Hz); when the electronic device runs application 2, the refresh rate of the LTPO display of the electronic device is refresh rate B (e.g., 120 Hz).
Taking the electronic device running application 1 as an example, in some embodiments, after the electronic device starts application 1, the electronic device refreshes the LTPO display screen at an initial refresh rate (e.g., 60Hz) corresponding to application 1 and displays the interface of application 1. If the interface of application 1 does not change within a certain period of time, the refresh rate of the LTPO display screen may be reduced (e.g., to 10Hz or 1Hz) to reduce power consumption when the electronic device is displaying. However, when the interface of application 1 changes dynamically, if the refresh rate of the LTPO display screen is still low (e.g., 10Hz), the electronic device may stutter when displaying the dynamically changing interface of application 1.
Here, a dynamic change of the interface of application 1 means that a new layer is added to the interface of application 1, or that a layer of the interface of application 1 changes. A layer is composed of a plurality of pixels, and one or more layers are stacked to form the whole displayed image. By way of example, each layer may be likened to a piece of "transparent glass": if nothing is drawn on the "transparent glass", it is a completely transparent blank layer (or transparent layer); if the "transparent glass" has an image on it, it may be referred to as a non-transparent layer.
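The "transparent glass" analogy can be sketched as follows (a hypothetical model, not the actual composition code of the electronic device): each layer is a grid of pixels in which `None` marks a fully transparent pixel, and layers are superimposed bottom-to-top so that the topmost non-transparent pixel at each position is the one displayed.

```python
def composite(layers):
    """Superimpose layers listed bottom-to-top into one displayed image;
    the topmost non-transparent pixel at each position is what is shown."""
    rows, cols = len(layers[0]), len(layers[0][0])
    image = [[None] * cols for _ in range(rows)]
    for layer in layers:                      # bottom layer first
        for r in range(rows):
            for c in range(cols):
                if layer[r][c] is not None:   # opaque pixel covers those below
                    image[r][c] = layer[r][c]
    return image

background = [["B", "B"], ["B", "B"]]     # a non-transparent layer
popup      = [[None, "P"], [None, None]]  # mostly "transparent glass"
print(composite([background, popup]))     # popup pixel shows on top
```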
By way of example, a scene with a newly added layer may be, for example, a scene of adjusting the volume of the electronic device, a screenshot scene, a scene of displaying a notification message, or a scene of displaying a sidebar (or a slide-down notification bar). A scene with a layer change may be, for example, a scene in which a communication application displays a red envelope (or displays an unread message prompt) while the electronic device runs the communication application, or a scene in which the scrolling page of the main interface of a video application switches while the electronic device runs the video application.
Taking the electronic device being a mobile phone and application 1 being a communication application as an example, the mobile phone starts the communication application and displays the interface 10A shown in (1) in fig. 1a. The display screen may display the interface 10A at a refresh rate of 60Hz. When the interface 10A does not change for a certain period of time (e.g., 30s), the display screen of the mobile phone displays the interface 10B shown in (2) in fig. 1a. The display screen may display the interface 10B at a refresh rate of 10Hz. The interface 10C shown in (3) in fig. 1a illustrates an example of an added layer (e.g., a pop-up window) on the interface 10B. The interface 10C includes a pop-up window, but the display screen still displays the interface 10C at a refresh rate of 10Hz, causing the interface 10C to stutter while the newly added layer is being displayed.
Based on the above, the embodiment of the present application provides a display method applied to an electronic device, where the electronic device supports different refresh rates. After the electronic device starts application 1, the electronic device displays the interface of application 1 at the initial refresh rate (e.g., 60Hz) of application 1. When the interface of application 1 is unchanged within a certain time (or preset time), the electronic device can reduce the refresh rate of the display screen (e.g., to 10Hz). When the electronic device determines that the interface of application 1 changes dynamically (e.g., a new layer is added to the interface of application 1, or a layer of the interface of application 1 changes), the electronic device can raise the refresh rate of the display screen (e.g., to 60Hz). This solves the problem of the electronic device stuttering while displaying the changed interface of application 1, i.e., the electronic device does not stutter while displaying the new layer or the changed layer.
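The policy just described can be summarized in a small sketch (an illustration under assumed names and values, not the actual implementation of the method): start at the application's initial refresh rate, drop to a low rate once the interface has been unchanged for the preset time, and raise the rate again as soon as a layer is added or changed.

```python
class RefreshRatePolicy:
    """Illustrative model of the refresh-rate policy described above."""

    def __init__(self, initial_hz=60, low_hz=10, boost_hz=60, idle_timeout_s=30):
        self.rate_hz = initial_hz       # e.g. 60 Hz when application 1 starts
        self.low_hz = low_hz            # e.g. 10 Hz for a static interface
        self.boost_hz = boost_hz        # e.g. 60 Hz once the interface changes
        self.idle_timeout_s = idle_timeout_s
        self.idle_s = 0.0

    def on_idle(self, elapsed_s):
        # Interface unchanged: accumulate idle time; after the preset
        # duration, lower the refresh rate to reduce power consumption.
        self.idle_s += elapsed_s
        if self.idle_s >= self.idle_timeout_s:
            self.rate_hz = self.low_hz

    def on_layer_update(self):
        # A new layer appeared or an existing layer changed: raise the
        # refresh rate so the updated picture is displayed without stutter.
        self.idle_s = 0.0
        self.rate_hz = self.boost_hz

policy = RefreshRatePolicy()
policy.on_idle(30)          # interface static for 30 s -> drop to 10 Hz
print(policy.rate_hz)       # 10
policy.on_layer_update()    # pop-up window appears -> back to 60 Hz
print(policy.rate_hz)       # 60
```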
Still taking the electronic device being a mobile phone and application 1 being a communication application as an example, the mobile phone starts the communication application and displays the interface 20A shown in (1) in fig. 1b. The display screen may display the interface 20A at a refresh rate of 60Hz. When the interface 20A has not changed for a certain period of time (e.g., 30s), the display screen of the mobile phone displays the interface 20B shown in (2) in fig. 1b. The display screen may display the interface 20B at a refresh rate of 10Hz. The interface 20C shown in (3) in fig. 1b illustrates an example of an added layer (e.g., a pop-up window) appearing on the interface 20B. The interface 20C includes a pop-up window, and the display screen displays the interface 20C at a refresh rate of 60Hz, thereby solving the problem of the electronic device stuttering when displaying the interface 20C, i.e., the electronic device does not stutter while displaying the interface 20C (or displaying the pop-up window).
In some embodiments, when the electronic device runs application 1, the electronic device controls the display screen to display interface one at refresh rate one (or a third refresh rate) (the picture displayed in interface one may be, for example, a third picture). When interface one is unchanged within a preset duration, the electronic device controls the display screen to display interface two at refresh rate two (or a first refresh rate) (the picture displayed in interface two may be, for example, a first picture), where the first picture corresponds to at least one display layer. When the electronic device determines that interface two changes dynamically (or that a display layer of the first picture is updated), the electronic device controls the display screen to display interface three at refresh rate three (or a second refresh rate) (the picture displayed in interface three may be, for example, a second picture).
Here, the dynamic change of interface two means that the picture displayed by interface three (i.e., the second picture) includes a newly added layer (i.e., an updated display layer) relative to the picture displayed by interface two (i.e., the first picture); or a layer in the picture displayed by interface three has changed relative to the picture displayed by interface two. Additionally, in this embodiment, refresh rate three (i.e., the second refresh rate) is greater than refresh rate two (i.e., the first refresh rate). For example, refresh rate three is 60Hz and refresh rate two is 10Hz.
In this embodiment, the magnitude relation between refresh rate one and refresh rate three is not limited: refresh rate one may be greater than, less than, or equal to refresh rate three.
As can be seen from the above embodiments, when the refresh rate of the display screen differs, the frame rate actually displayed on the display screen also differs (i.e., the display frame rate of application 1 differs). And when application 1 runs different business scenarios, the output frame rate of application 1 also differs. For example, taking refresh rate one as 60Hz, when the electronic device controls the display screen to display interface one at refresh rate one, the output frame rate of application 1 is 60FPS and the display frame rate of application 1 is 60FPS. For another example, taking refresh rate two as 10Hz, when the electronic device controls the display screen to display interface two at refresh rate two, the output frame rate of application 1 is 10FPS and the display frame rate of application 1 is 10FPS. For another example, taking refresh rate three as 60Hz, when the electronic device controls the display screen to display interface three at refresh rate three, the output frame rate of application 1 is 60FPS and the display frame rate of application 1 is 60FPS.
In the following, taking the electronic device being a mobile phone and application 1 being a communication application as an example, the dynamic change of interface two is illustrated with reference to the accompanying drawings.
Interface two may be an interface in which the user chats with other users through the communication application. For example, as shown in fig. 1c, interface two includes a plurality of collapsed chat windows, such as a collapsed chat window with user 1, a collapsed chat window with user 2, a collapsed chat window with user 3, a collapsed chat window with user 4, a collapsed chat window with user 5, a collapsed chat window with user 6, and a collapsed chat window with user n, etc. The picture displayed by interface two is synthesized by superimposing a plurality of layers. For example, as shown in fig. 1c, the picture displayed by interface two is composed of layer 101, layer 102 and layer 103 (where layer 101, layer 102 and layer 103 may be, for example, display layers). It should be noted that the picture displayed by interface two may further include other transparent or non-transparent layers for superimposed synthesis, which is not limited in the embodiment of the present application. In some embodiments, when the mobile phone receives a target instruction (or a first instruction), the mobile phone determines that interface two changes dynamically and controls the display screen to display interface three at refresh rate three. The target instruction may be a touch operation input by the user, or a push message (or message) of another application (such as a first application), which is not limited in the embodiment of the present application. The touch operation may be an operation of the user on a mobile phone key, or an operation of the user on the mobile phone screen (or display screen). The operation may be one of a key-press operation, a touch operation, a voice operation, or a gesture operation. The touch operation may be, for example, a click operation or a slide operation.
It should be noted that the above embodiments are merely examples of touch operations, and are not meant to limit the present application, and other suitable touch operations are also included in the protection scope of the embodiments of the present application.
It should be noted that the message of the first application may be a message sent by an application installed in the mobile phone (i.e., the device itself); alternatively, the message of the first application may be a message sent by an application installed in another mobile phone. For example, the first application sends a message to the device via a server.
Illustratively, the mobile phone displays the interface 201 shown in (1) in fig. 2, where the interface 201 is composed of layer 101, layer 102 and layer 103 superimposed. On this basis, in response to the user's operation of the volume key 202, the mobile phone displays the interface 203 shown in (2) in fig. 2. The interface 203 is composed of layer 101, layer 102, layer 103 and layer 204. As can be seen from fig. 2, the interface 203 has an added layer 204 relative to the interface 201. For example, the picture displayed by the layer 204 may be a picture showing the volume being increased.
Also illustratively, the mobile phone displays the interface 201 shown in (1) in fig. 3, where the interface 201 is composed of layer 101, layer 102 and layer 103 superimposed. On this basis, in response to the user's operation of the lock key 205 (e.g., the user presses and holds the lock key 205, or the user presses a combination of the lock key 205 and a volume key), the mobile phone displays the interface 206 shown in (2) in fig. 3. The interface 206 is composed of layer 101, layer 102, layer 103 and layer 207. As can be seen from fig. 3, the interface 206 has an added layer 207 relative to the interface 201. For example, the picture displayed by the layer 207 may be a screenshot taken by the mobile phone.
Also illustratively, the mobile phone displays the interface 201 shown in (1) in fig. 4, where the interface 201 is composed of layer 101, layer 102 and layer 103 superimposed. On this basis, the mobile phone receives a message from another user (or a push message from another application) and displays the interface 208 shown in (2) in fig. 4. The interface 208 is composed of layer 101, layer 102, layer 103 and layer 209. As can be seen from fig. 4, the interface 208 has an added layer 209 relative to the interface 201. For example, the picture displayed by the layer 209 may be a pop-up window (such as a message sent by another user), or a notification bar (such as a push message of another application).
Also illustratively, the mobile phone displays the interface 201 shown in (1) in fig. 5, the interface 201 being composed of layer 101, layer 102 and layer 103 superimposed. On this basis, after the mobile phone receives a message from another user, the mobile phone displays the interface 210 shown in (2) in fig. 5. The interface 210 is composed of layer 101, layer 102 and layer 103 superimposed, and the layer 103 of the interface 210 has changed relative to the layer 103 of the interface 201. For example, a message unread prompt 211 is included in the layer 103 in the interface 210.
Also by way of example, the mobile phone displays the interface 212 shown in (1) in fig. 6, which may be, for example, an interface in which the user chats with other users through the communication application. The interface 212 may be composed of, for example, layer 104, layer 105 and layer 106. On this basis, after the mobile phone receives a message from another user, the mobile phone displays the interface 213 shown in (2) in fig. 6. The interface 213 is composed of layer 104, layer 105 and layer 106 superimposed, and a layer of the interface 213 has changed relative to the interface 212. For example, the layer 106 in the interface 213 includes a message 214 (e.g., an application red envelope) sent by another user.
It should be noted that the first picture in the embodiment of the present application may be, for example, the pictures displayed by the interface 201 and the interface 212 in fig. 2-6 and fig. 9-10; the second picture may be, for example, the picture displayed by the interface 203, the interface 206, the interface 208, the interface 210, the interface 213, the interface 218, or the interface 220 in fig. 2-6 and fig. 9-10. In addition, the at least one display layer may be, for example, layer 101, layer 102, and layer 103 shown in fig. 2-5 and fig. 9-10; or layer 104, layer 105, and layer 106 shown in fig. 6; alternatively, the at least one display layer may be, for example, layer 107, layer 108, and layer 109 shown in fig. 8.
In some embodiments, the first layer is used to display a volume adjustment control; or the first layer is used to display a screenshot; or the first layer is used to display a pop-up window; or the first layer is used to display a message unread prompt; or the first layer is used to display an application red envelope. As an example, as shown in fig. 2-6 and fig. 9-10, the first layer may be, for example, the layer 204, the layer 207, the layer 209, the layer 211, the layer 214, the layer 219, or the layer 221 in fig. 2-6 and fig. 9-10.
Taking the electronic device being a mobile phone and application 1 being a video application as an example, after the mobile phone starts the video application, the mobile phone displays the interface 215 shown in fig. 7. The interface 215 may be the main interface (or home page) of the video application; the interface 215 includes a plurality of video windows, such as video window A, video window B, video window C, and video window D. For example, in response to the user's click operation on any one of the plurality of video windows, the mobile phone may display the video content of the video window corresponding to the click operation. In some embodiments, as also shown in fig. 7, video window A (or the target window) is typically displayed in the interface 215 in the form of a carousel, i.e., the picture displayed by video window A may change periodically.
In this embodiment, the first screen may be, for example, a screen displayed on the interface 216 shown in (1) in fig. 8, and the second screen may be, for example, a screen displayed on the interface 217 shown in (2) in fig. 8. As can be seen from fig. 8, the first picture and the second picture include a target window (e.g., video window a), and the target window cyclically displays a third layer (e.g., layer 2 shown in (1) of fig. 8) and a second layer (e.g., layer 1 shown in (2) of fig. 8) in a carousel manner.
In some embodiments, the picture displayed by video window A is composed of a plurality of layers superimposed. For example, as shown in fig. 7, the picture displayed by video window A is composed of layer 1 and layer 2 superimposed. When layer 2 is overlaid on layer 1, the picture displayed by video window A is the picture in layer 2; correspondingly, when layer 1 is overlaid on layer 2, the picture displayed by video window A is the picture in layer 1. Taking layer 2 overlaying layer 1 as an example, since layer 1 is hidden under layer 2, layer 1 is not displayed in video window A, and the picture displayed by video window A is the picture in layer 2.
In connection with the interface 215 shown in fig. 7, by way of example, the mobile phone displays the interface 216 shown in (1) in fig. 8, where the interface 216 is composed of layer 107, layer 108 and layer 109 superimposed, and the picture displayed by the layer 109 is the picture of layer 2. Then, when the mobile phone detects that the duration for which the layer 109 has displayed the picture of layer 2 reaches a certain time, the mobile phone displays the interface 217 shown in (2) in fig. 8. The interface 217 is composed of layer 107, layer 108 and layer 109, and the picture displayed by the layer 109 is the picture of layer 1. Referring to fig. 8, it can be seen that the interface 217 has undergone a layer change relative to the interface 216; for example, the layer 109 of the interface 217 has changed relative to the interface 216.
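The carousel behavior of the target window can be sketched in a few lines (illustrative only; the names and the two-layer cycle are assumptions drawn from the example above): after the dwell time for the current layer elapses, the window advances to the next layer in its cycle and wraps around at the end.

```python
def carousel_next(layer_cycle, current):
    """Index of the layer shown once the dwell time for the current
    layer elapses; wraps back to the first layer at the end."""
    return (current + 1) % len(layer_cycle)

cycle = ["layer 2", "layer 1"]   # pictures video window A alternates between
shown = 0                        # layer 2 on top first (interface 216)
shown = carousel_next(cycle, shown)
print(cycle[shown])              # layer 1 on top next (interface 217)
shown = carousel_next(cycle, shown)
print(cycle[shown])              # wraps back to layer 2
```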
Still taking the electronic device being a mobile phone and application 1 being a communication application as an example, the mobile phone displays the interface 201 shown in (1) in fig. 9, where the interface 201 is composed of layer 101, layer 102 and layer 103 superimposed. On this basis, in response to a touch operation of the user on the display screen (e.g., a leftward sliding operation), the mobile phone displays the interface 218 shown in (2) in fig. 9, where the interface 218 is composed of layer 101, layer 102, layer 103 and layer 219 superimposed. Referring to fig. 9, it can be seen that the interface 218 has an added layer 219 relative to the interface 201. For example, the layer 219 in the interface 218 may be a sidebar.
Also illustratively, the mobile phone displays the interface 201 shown in (1) in fig. 10, the interface 201 being composed of layer 101, layer 102 and layer 103 superimposed. On this basis, in response to a touch operation of the user on the display screen (e.g., a downward sliding operation), the mobile phone displays the interface 220 shown in (2) in fig. 10, where the interface 220 is composed of layer 101, layer 102, layer 103 and layer 221 superimposed. Referring to fig. 10, it can be seen that the interface 220 has an added layer 221 relative to the interface 201. For example, the layer 221 in the interface 220 may be a slide-down notification bar.
It should be noted that the foregoing merely takes application 1 being a communication application or a video application as examples; of course, application 1 may also be another application, such as a live-streaming application, a short-video application, or a browser application. The embodiment of the present application does not specifically limit application 1.
The display method provided by the embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The electronic device in the embodiment of the present application may be an electronic device including an LTPO display screen. For example, the electronic device may be a mobile phone, an action camera (e.g., GoPro), a digital camera, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, a vehicle-mounted device, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (personal digital assistant, PDA), an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, etc. The embodiment of the present application does not particularly limit the specific form of the electronic device.
Fig. 11 is a schematic structural diagram of the electronic device 100. Wherein the electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, positioning module 181, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc.
It is to be understood that the structure illustrated in the present embodiment does not constitute a specific limitation on the electronic apparatus 100. In other embodiments, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the connection relationship between the modules illustrated in this embodiment is only illustrative, and does not limit the structure of the electronic device. In other embodiments, the electronic device may also use different interfacing manners in the foregoing embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel (or display substrate). The display panel may employ an organic light-emitting diode (OLED). In the embodiment of the application, the display screen is an LTPO display screen; the LTPO display includes a display panel in which a display unit (e.g., TFT) is an LTPO TFT. For an example of LTPO, reference may be made to the above embodiments, and details are not repeated here.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened and light is transmitted through the lens to the camera's photosensitive element, where the optical signal is converted into an electrical signal; the photosensitive element transmits the electrical signal to the ISP for processing, and it is converted into an image visible to the naked eye. The ISP can also optimize the noise, brightness and skin color of the image, as well as parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; besides digital image signals, it can also process other digital signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and so on.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device may play or record video in a variety of encoding formats, such as: moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between human brain neurons, it can rapidly process input information and can also continuously self-learn. Applications such as intelligent cognition of the electronic device can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The receiver 170B, also referred to as an "earpiece," is used to convert audio electrical signals into sound signals. The microphone 170C, also referred to as a "mic" or "mouthpiece," is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, audio, video, etc. files are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 implements the various functional applications and data processing of the electronic device by executing the instructions stored in the internal memory 121. For example, in an embodiment of the present application, the internal memory 121 may include a storage program area and a storage data area.
The storage program area may store an operating system and an application program required for at least one function (such as a sound playing function, an image playing function, etc.). The storage data area may store data created during use of the electronic device (e.g., audio data, a phonebook, etc.). In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The keys 190 include a power key, a volume key, etc. The keys 190 may be mechanical keys or touch keys. The motor 191 may generate a vibration alert, and may be used for incoming-call vibration alerts as well as touch vibration feedback. The indicator 192 may be an indicator light, and may be used to indicate a charging state, a change in battery level, a message, a missed call, a notification, etc. The SIM card interface 195 is used to connect a SIM card. A SIM card may be inserted into or removed from the SIM card interface 195 to make contact with or be separated from the electronic device. The electronic device may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like.
In some embodiments, the software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, or a cloud architecture. In this embodiment, taking a layered architecture Android system as an example, a software structure of the electronic device 100 is illustrated.
Fig. 12 is a software structure diagram of an electronic device according to an embodiment of the present application.
It will be appreciated that the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system may include an application (APP) layer, an application framework (FWK) layer, a system library, a hardware abstraction layer (HAL), and a kernel layer. For ease of understanding, in the embodiment of the present application, the software architecture diagram shown in fig. 12 also includes some hardware structures of the electronic device in fig. 11, such as the LTPO display screen.
The application layer may include a series of application packages. By way of example, as shown in FIG. 12, an application package may include application 1, application 2, and so on.
The application framework layer provides an application programming interface (API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions, and provides programming services to application-layer calls through the API interface. Illustratively, as shown in FIG. 7, the application framework layer includes an activity management service (activity manager service, AMS) and a window management service (window manager service, WMS). The activity management service is used to manage the life cycle of each application program and the navigation back function; it is also responsible for creating the Android main thread and maintaining the life cycle of each application program. The window management service is used to manage windows, specifically including the adding and deleting of windows, and their size, direction, position, animation, focus, and the like.
The system library may include a plurality of functional modules, for example: a frame rate decision module, an image composition system (e.g., SurfaceFlinger), a view system, and an image rendering system. The frame rate decision module is used to adjust the refresh rate of the LTPO display screen. The image composition system is used to control image composition and to generate vertical synchronization (Vsync) signals. In some embodiments, the image composition system further comprises an image cache queue. Illustratively, the application draws an image through the view system and renders the drawn image through the image rendering system; the application then sends the drawn and rendered image to the image cache queue in the image composition system, which caches the images drawn and rendered by the application. Each time a Vsync signal arrives, the image composition system takes the next frame to be composed from the image cache queue and composes it.
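For ease of understanding, the producer-consumer relationship described above may be sketched as follows. This is an illustrative model only, not the actual SurfaceFlinger implementation; the class and method names are assumptions made for the sketch. The application enqueues drawn-and-rendered frames, and the composition system dequeues one frame each time a Vsync signal arrives.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Minimal model of the image cache queue: the app enqueues rendered frames,
// and on each Vsync tick the composition system dequeues the next frame to
// compose. Names here are illustrative, not Android's actual API.
public class BufferQueueSketch {
    private final Queue<String> bufferQueue = new ArrayDeque<>();

    // Called by the application after drawing and rendering a frame.
    public void enqueueRenderedFrame(String frame) {
        bufferQueue.add(frame);
    }

    // Called by the composition system once per Vsync signal;
    // returns the next frame to compose, or null if none is pending.
    public String onVsync() {
        return bufferQueue.poll();
    }

    public static void main(String[] args) {
        BufferQueueSketch sf = new BufferQueueSketch();
        sf.enqueueRenderedFrame("frame1");
        sf.enqueueRenderedFrame("frame2");
        System.out.println(sf.onVsync()); // frame1
        System.out.println(sf.onVsync()); // frame2
    }
}
```

Frames are consumed in first-in, first-out order, matching the text's "sequentially acquires one frame of image to be composed" per Vsync.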
The hardware abstraction layer is an interface layer between the kernel layer and the hardware, and can be used to abstract the hardware. Illustratively, as shown in FIG. 7, the hardware abstraction layer includes a hardware compositor (e.g., HwComposer, HWC). The hardware compositor is used to send the image composed by the image composition system to the hardware (e.g., the LTPO display screen) for display.
The kernel layer provides the underlying drivers for the various hardware of the electronic device. Illustratively, as shown in FIG. 12, the kernel layer includes display drivers.
The workflow of the electronic device's software and hardware is illustrated below in connection with a scene in which a layer is added to, or a layer change occurs in, an interface of an application program displayed by the electronic device.
In some embodiments, when the electronic device runs application 1, the electronic device controls the display screen to display interface one at refresh rate one (e.g., 60 Hz); when interface one is unchanged for a preset duration, the electronic device controls the display screen to display interface two at refresh rate two (e.g., 10 Hz); and then, when the electronic device determines that interface two changes dynamically, the electronic device controls the display screen to display interface three at refresh rate three (e.g., 60 Hz).
Taking, as an example of the dynamic change of interface two, the case in which interface two has a newly added layer: application 1 notifies the AMS to create the activity of the newly added layer, and the AMS starts that activity. Then, the AMS notifies the WMS to create a target window; the WMS manages the window size, position, window name, etc. of the target window, and notifies the image composition system to create a new layer. Illustratively, the WMS calls a create-layer interface (e.g., a createLayer() interface) to notify the image composition system to create the newly added layer; the image composition system receives the layer creation information of the newly added layer (e.g., the size, position, layer name, etc. of the newly added layer) in the createLayer() interface. The image composition system then notifies the frame rate decision module that the refresh rate of the LTPO display screen needs to be increased (e.g., from 10 Hz to 60 Hz). The frame rate decision module adjusts the refresh rate of the LTPO display screen to refresh rate three (e.g., 60 Hz) and sends refresh rate three to the image composition system. The image composition system can then compose the layers according to refresh rate three, and the composed newly added layer is finally displayed.
Illustratively, the image composition system sends the layer creation information to application 1 to trigger application 1 to draw and render the newly added layer. Application 1 draws the image data through the view system, renders it through the image rendering system, and then sends the drawn and rendered image data to the image composition system. The image composition system performs layer composition according to the drawn and rendered image data to obtain the newly added layer. Then, the image composition system sends the composed newly added layer to the hardware compositor (HWC), and the hardware compositor sends it to the display driver; the display driver drives the LTPO display screen to display the newly added layer.
Taking, as an example of the dynamic change of interface two, the case in which a layer of interface two changes: for example, when the electronic device determines that a layer in the display image of interface two has changed (e.g., a layer change occurs in part of a window of interface two), application 1 writes image change information into the image cache queue. The image change information includes one or more of a layer displacement, the size of the window in which the layer is located, a layer color, or a layer animation effect (e.g., becoming larger or smaller). Then, the image cache queue notifies the image composition system that a layer change has occurred in the current interface, and the image composition system generates a processing transaction. The transaction is used to judge whether the changed layer needs to be displayed. By processing the transaction, the image composition system recognizes that the changed layer is to be displayed. The image composition system then notifies the frame rate decision module that the refresh rate of the LTPO display screen needs to be increased (e.g., from 10 Hz to 60 Hz). The frame rate decision module adjusts the refresh rate of the LTPO display screen to refresh rate three (e.g., 60 Hz) and sends refresh rate three to the image composition system. The image composition system may then compose the layers according to refresh rate three, and the changed layer is finally displayed. In some embodiments, the image composition system determines that a layer change has occurred in part of a window of the current interface through a layer listening interface (e.g., SurfaceFlinger::setTransactionState()).
The image composition system sends the changed layer information (such as the layer displacement, the size of the window in which the layer is located, the layer color, and the layer animation effect) to application 1 to trigger the application to draw and render the changed layer. Application 1 draws the image data through the view system, renders it through the image rendering system, and then sends the drawn and rendered image data to the image composition system. The image composition system performs layer composition according to the drawn and rendered image data to obtain the changed layer. Then, the image composition system sends the composed layer to the hardware compositor (HWC), and the hardware compositor sends it to the display driver; the display driver drives the LTPO display screen to display the changed layer.
In combination with the above embodiments, in the case where the electronic device displays a newly added layer or a changed layer, the electronic device needs to compose the newly added or changed layer (e.g., the second layer) with the image composition system and then display it.
The specific processing flow by which the image composition system composes the second layer is described in detail below. For ease of understanding, some technical terms related to layer composition are first explained for reference.
1. Frame: a single picture, the minimum unit in interface display. A frame can be understood as a still picture; displaying a number of consecutive frames in rapid succession creates the illusion of motion. The frame rate is the number of frames refreshed in 1 second, which can also be understood as the number of times per second the image processor in the electronic device refreshes the picture. A high frame rate produces smoother and more realistic animation; the more frames per second, the smoother the displayed motion.
It should be noted that, before the frame is displayed on the interface, it is usually required to undergo processes such as drawing, rendering, and compositing.
2. Frame drawing: the picture drawing of the interface display. A display interface may be composed of one or more views, and each view may be drawn by a visual control of the view system. Each view is composed of sub-views, and one sub-view corresponds to a widget in the view; for example, one sub-view corresponds to a symbol in a picture view.
3. And (3) frame rendering: the rendered view is subjected to coloring operation, 3D effect is added, or the like. For example: the 3D effect may be a light effect, a shadow effect, a texture effect, etc.
4. Frame composition: the process of combining one or more rendered views into a display interface.
In order to improve display smoothness and reduce display stuttering, electronic devices generally display on the basis of Vsync signals, so as to synchronize the drawing, rendering, composition, and refresh-display of images. Those skilled in the art will appreciate that the Vsync signal is a periodic signal, and the Vsync signal period may be set according to the refresh rate of the display screen. For example, when the refresh rate of the display screen is 60 Hz, the Vsync signal period may be 16.6 ms; that is, the electronic device generates a control signal every 16.6 ms to trigger the Vsync signal.
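The relationship between refresh rate and Vsync period stated above can be expressed as period (ms) = 1000 / refresh rate (Hz). The following is a sketch of this arithmetic only; a real display driver programs the timing in hardware.

```java
// Vsync period derived from the refresh rate: period (ms) = 1000 / rate (Hz).
public class VsyncPeriod {
    public static double periodMs(int refreshRateHz) {
        return 1000.0 / refreshRateHz;
    }

    public static void main(String[] args) {
        // ~16.7 ms at 60 Hz (the "16.6 ms" in the text is a rounded value)
        System.out.printf("60 Hz -> %.1f ms%n", periodMs(60));
        // 100.0 ms at 10 Hz, the low-power rate used in the embodiments
        System.out.printf("10 Hz -> %.1f ms%n", periodMs(10));
    }
}
```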
In some embodiments, fig. 13 shows a schematic diagram of a layer composition process flow provided in an embodiment of the present application. The content displayed by the electronic device corresponds to frame 1, frame 2, and frame 3 in chronological order.
Taking the display of frame 1 as an example, the application of the electronic device draws frame 1 through the view system and renders frame 1 through the image rendering system. After the rendering of frame 1 is completed, the application sends the rendered frame 1 to the image composition system, which composes it. After frame 1 is composed, the electronic device may display the content corresponding to frame 1 on the screen (such as the LTPO display screen) by calling the kernel-layer display driver. It should be noted that frame 2 and frame 3 are composed and displayed through processes similar to that of frame 1, which will not be repeated here. In fig. 13, each frame lags by 2 Vsync signal periods from drawing to display; that is, the display of the electronic device has a degree of latency.
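The two-period lag described above can be modeled as follows. This is an illustrative timeline only: a frame drawn and rendered during Vsync tick n is composed during tick n+1 and reaches the screen at tick n+2.

```java
// Model of the pipeline lag in fig. 13: drawing, composition, and display
// each occupy one Vsync period, so display trails drawing by two ticks.
public class PipelineLag {
    public static int displayTick(int drawTick) {
        int composeTick = drawTick + 1; // composition one Vsync period after drawing
        return composeTick + 1;         // scan-out one Vsync period after composition
    }

    public static void main(String[] args) {
        System.out.println(displayTick(0)); // a frame drawn at tick 0 appears at tick 2
    }
}
```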
In the embodiment of the present application, after the electronic device starts the first application, the electronic device refreshes the LTPO display screen at a preset refresh rate (e.g., 60 Hz) and displays the first interface; when the first interface is unchanged for a preset duration, the electronic device refreshes the LTPO display screen at the first refresh rate (e.g., 10 Hz). It can be seen that when the user is not operating the first application, the first interface remains static for a long time, and the electronic device can reduce the refresh rate of the LTPO display screen to reduce device power consumption. In this case, if the first interface has a newly added layer to be displayed, or a layer of the first interface changes and the changed layer needs to be displayed, the electronic device refreshes the LTPO display screen at the second refresh rate (e.g., 60 Hz) while displaying the newly added or changed layer.
For ease of understanding, the interaction between the modules involved in the display method, in which the LTPO display screen is refreshed at refresh rate three while the newly added or changed layer is displayed, is described below with reference to fig. 12.
Fig. 14 and fig. 15 are schematic views of a process of interaction between each module in the display method according to the embodiment of the present application. As shown in fig. 14 and 15, the system may include: application 1, an Activity Management Service (AMS), a Window Management Service (WMS), an image composition System (SF), an image cache queue, a frame rate decision module, a hardware compositor (HWC), a display driver, and an LTPO display screen.
In the embodiment of the present application, when the electronic device runs application 1, the electronic device refreshes the LTPO display screen at refresh rate one (e.g., 60 Hz) and displays interface one; when interface one is unchanged for a preset duration, the electronic device refreshes the LTPO display screen at refresh rate two (e.g., 10 Hz) and displays interface two. It can be seen that when the user is not operating application 1, the interface remains static for a long time, and the electronic device can reduce the refresh rate of the LTPO display screen to reduce device power consumption. In this case, if interface two changes dynamically, the electronic device refreshes the LTPO display screen at refresh rate three (e.g., 60 Hz) and displays interface three. Here, the dynamic change of interface two means that interface two has a newly added layer (or first layer), or that a layer of interface two changes (e.g., from the third layer to the second layer).
In some embodiments, taking the case in which interface two has a newly added layer as an example, as shown in fig. 12 and fig. 14, the display method may include steps A1 to A14.
Step A1: application 1 receives a target instruction.
The target instruction (or first instruction) may be, for example, a touch operation input by the user, or a push message from another application. The embodiment of the present application does not limit the target instruction. For examples of the touch operation, reference may be made to the above embodiments, which will not be repeated here.
The target instruction is used to trigger the electronic device to display the newly added layer. The newly added layer may appear, for example, in a scene of adjusting the volume of the electronic device, a screen-capture scene, a scene of displaying a notification message, or a scene of displaying a sidebar (or pull-down notification bar).
Step A2: application 1 sends a first message to the AMS.
The first message is used for notifying the AMS to create the activity of the newly added layer. The first message carries attribute information of the newly added layer. Illustratively, the attribute information includes one or more of an activity name, a window size, a window position, or a window name.
Step A3: the AMS starts the activity of the newly added layer.
Step A4: the AMS sends a second message to the WMS.
The second message is used for notifying the WMS to manage the window of the newly added layer. The second message carries attribute information of the newly added layer.
Illustratively, the WMS manages window size, window location, window name, etc. of the newly added layer.
Step A5: the WMS notifies the image composition system to create the newly added layer.
Illustratively, the WMS calls a create-layer interface (e.g., a createLayer() interface) to notify the image composition system to create the newly added layer. The image composition system receives the layer creation information in the createLayer() interface. The layer creation information may be, for example, the size, position, and layer name of the newly added layer.
Step A6: the image composition system determines that the current refresh rate (e.g., the first refresh rate) is lower than a threshold (e.g., the first threshold).
The threshold value may be, for example, 10Hz.
In some embodiments, when the image composition system determines that the current refresh rate is above the threshold, the image composition system notifies the frame rate decision module to adjust the refresh rate. The threshold value in this embodiment may be, for example, 60Hz.
Step A7: the image composition system notifies the frame rate decision module to adjust the refresh rate.
Step A8: the frame rate decision module determines that the current refresh rate is lower than the threshold, and adjusts the current refresh rate from refresh rate two (e.g., the first refresh rate) to refresh rate three (e.g., the third refresh rate).
The threshold value may be, for example, 10Hz. The refresh rate two may be, for example, 10Hz and the refresh rate three may be, for example, 60Hz.
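The decision in steps A6-A8 may be sketched as follows, using the example values from the text (threshold 10 Hz, refresh rate three 60 Hz). Note that the text says the current rate is "lower than the threshold" while its example values are equal (both 10 Hz), so this sketch treats the condition as "at or below"; the exact comparison, like the class and method names, is an assumption of the sketch.

```java
// Sketch of the frame rate decision module's response when a newly added
// (or changed) layer must be displayed. Values are the text's examples.
public class FrameRateDecision {
    static final int THRESHOLD_HZ = 10;  // example threshold from the text
    static final int RATE_THREE_HZ = 60; // refresh rate three

    // Returns the refresh rate to use while displaying the layer.
    public static int decide(int currentHz) {
        // At or below the threshold: raise to refresh rate three so the new
        // layer displays without stuttering; otherwise keep the current rate.
        return currentHz <= THRESHOLD_HZ ? RATE_THREE_HZ : currentHz;
    }

    public static void main(String[] args) {
        System.out.println(decide(10)); // 60: raised from the low-power rate
        System.out.println(decide(60)); // 60: already high enough, unchanged
    }
}
```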
In the event that the image composition system determines that the current refresh rate is above a threshold (e.g., 60 Hz), the frame rate decision module determines that the current refresh rate is above the threshold (e.g., 60 Hz) and the frame rate decision module adjusts the current refresh rate from a high refresh rate (e.g., 60 Hz) to a low refresh rate (e.g., 10 Hz).
Step A9: the frame rate decision module sends refresh rate three to the image composition system.
Step A10: the frame rate decision module sends refresh rate three to the display driver.
Step A11: the image composition system performs layer composition according to refresh rate three.
Illustratively, the image composition system sends a Vsync signal to application 1 according to refresh rate three; application 1 draws and renders the layer at the cadence of refresh rate three, and then sends the drawn and rendered image data to the image cache queue. The image composition system performs layer composition on the drawn and rendered image data in the image cache queue at the cadence of refresh rate three to obtain the composed layer (e.g., the newly added layer). For example, when refresh rate three is 60 Hz, application 1 draws and renders the layer at the cadence of refresh rate three; that is, the output frame rate of application 1 is 60 FPS.
Step A12: the image composition system sends the composed layer to the hardware compositor.
Step A13: the hardware compositor sends the layer to the display driver.
Step A14: the display driver drives the LTPO display screen to display the layer (e.g., the first layer) at refresh rate three.
Illustratively, the display driver drives the LTPO display screen to display interface three, which includes the composed layer (e.g., the newly added layer). For example, the display driver drives the LTPO display screen to display the interfaces shown in figs. 2-4, 9, and 10, which include newly added layers (e.g., layer 204, layer 207, layer 209, layer 219, and layer 221).
In this embodiment, in the case where the electronic device refreshes the LTPO display screen at a lower refresh rate (e.g., refresh rate two in the above embodiment, i.e., 10 Hz), if the electronic device receives the target instruction, the electronic device refreshes the LTPO display screen at refresh rate three (e.g., 60 Hz) and displays interface three. Because the target instruction instructs the electronic device to display the newly added layer, and because refresh rate three is greater than refresh rate two, the electronic device increases the refresh rate of the display screen while displaying the newly added layer, thereby avoiding stuttering during the display of the newly added layer and improving the user experience.
In other embodiments, taking the case in which a layer of interface two changes as an example, with reference to fig. 12 and fig. 15, the display method may include steps S1 to S13.
Step S1: application 1 receives image change information.
The image change information is used to indicate that a layer change occurs in the interface (e.g., interface two) currently displayed by the application 1. For example, the image change information is used to trigger the electronic device to display the second layer (e.g., change from the third layer to the second layer). The second layer is hidden below the third layer, and the second layer and the third layer are two display layers in the at least one display layer. The image change information may include one or more of a layer displacement, a size of a window in which the layer is located, a layer color, or a layer dynamic effect (e.g., increasing or decreasing).
Step S2: application 1 writes the image change information into the image cache queue.
Accordingly, the image cache queue caches the image change information.
Step S3: the image cache queue notifies the image composition system of the layer change.
The image composition system registers a listener with the image cache queue through a layer listening interface (e.g., SurfaceFlinger::setTransactionState()); when a layer changes, the image cache queue calls the layer listening interface to notify the image composition system that a layer change has occurred in part of a window of the current interface.
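The register-then-notify relationship described above is an observer pattern and may be sketched as follows. The class and method names are illustrative assumptions; in Android the actual notification path goes through SurfaceFlinger::setTransactionState().

```java
import java.util.ArrayList;
import java.util.List;

// Minimal observer-pattern model: the composition system registers with the
// image cache queue, and the queue calls back when a layer change is written.
public class LayerChangeNotifier {
    interface LayerListener { void onLayerChanged(String changeInfo); }

    static class ImageCacheQueue {
        private final List<LayerListener> listeners = new ArrayList<>();

        // The composition system registers its listener once, up front.
        void registerListener(LayerListener l) { listeners.add(l); }

        // The application writes image change info (step S2); the queue then
        // notifies every registered listener (step S3).
        void writeChange(String changeInfo) {
            for (LayerListener l : listeners) l.onLayerChanged(changeInfo);
        }
    }

    public static void main(String[] args) {
        ImageCacheQueue queue = new ImageCacheQueue();
        List<String> received = new ArrayList<>();
        queue.registerListener(received::add); // stand-in for the composition system
        queue.writeChange("layer displacement");
        System.out.println(received); // [layer displacement]
    }
}
```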
Step S4: the image composition system determines whether the changed layer needs to be displayed.
Illustratively, the image composition system generates a processing transaction; the transaction is used to judge whether the changed layer needs to be displayed. For example, in processing the transaction, the image composition system may determine whether the changed layer needs to be displayed by comparing whether the attribute information (including at least one of the layer size, layer position, or layer transparency) of the layer before the change (e.g., the third layer) and the changed layer (e.g., the second layer) has changed.
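The attribute comparison described for step S4 may be sketched as follows. The LayerInfo record and its fields are hypothetical, chosen to match the attributes named in the text (size, position, transparency).

```java
import java.util.Objects;

// Sketch of step S4: the changed layer needs to be displayed if any of its
// attributes differs from the layer before the change.
public class LayerChangeCheck {
    // Hypothetical attribute bundle: size (width/height), position (x/y),
    // and transparency (alpha), as listed in the text.
    record LayerInfo(int width, int height, int x, int y, float alpha) {}

    static boolean needsDisplay(LayerInfo before, LayerInfo after) {
        // Records compare field by field, so any attribute change is detected.
        return !Objects.equals(before, after);
    }

    public static void main(String[] args) {
        LayerInfo before = new LayerInfo(100, 40, 0, 0, 1.0f);
        LayerInfo moved  = new LayerInfo(100, 40, 0, 20, 1.0f);
        System.out.println(needsDisplay(before, moved));  // true: position changed
        System.out.println(needsDisplay(before, before)); // false: nothing changed
    }
}
```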
Step S5: when the image composition system determines that the changed layer needs to be displayed, the image composition system determines that the current refresh rate is lower than the threshold.
The threshold value may be, for example, 10Hz.
Step S6: the image composition system notifies the frame rate decision module to adjust the refresh rate.
In some embodiments, when the image composition system determines that the current refresh rate is above the threshold, the image composition system notifies the frame rate decision module to adjust the refresh rate. The threshold value in this embodiment may be, for example, 60Hz.
Step S7: the frame rate decision module determines that the current refresh rate is lower than the threshold, and adjusts the current refresh rate from refresh rate two to refresh rate three.
The threshold value may be, for example, 10Hz. The refresh rate two may be, for example, 10Hz and the refresh rate three may be, for example, 60Hz.
In the case where the image composition system determines that the current refresh rate is above a threshold (e.g., 60 Hz), the frame rate decision module likewise determines that the current refresh rate is above the threshold and adjusts the current refresh rate from a high refresh rate (e.g., 60 Hz) to a low refresh rate (e.g., 10 Hz).
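Steps S5 to S7 and the variant just described can be sketched as a simple threshold decision. This is a hedged reading of the text, not the actual frame rate decision module: the function name is invented, the thresholds are the example values (10 Hz and 60 Hz) given above, and the two branches correspond to the two different scenarios the text describes (raising a low rate to show a changed layer, and lowering an already-high rate).

```python
# Illustrative sketch of the frame rate decision in steps S5-S7 and the
# variant above; thresholds and rates are the example values from the text.
LOW_THRESHOLD_HZ = 10    # threshold in steps S5/S7 (example value)
HIGH_THRESHOLD_HZ = 60   # threshold in the step S6 variant (example value)

def decide_refresh_rate(current_hz: int) -> int:
    """Return the refresh rate the display should switch to."""
    if current_hz <= LOW_THRESHOLD_HZ:
        # steps S5-S7: a changed layer must be shown while the rate is low,
        # so raise refresh rate two (10 Hz) to refresh rate three (60 Hz)
        return HIGH_THRESHOLD_HZ
    if current_hz >= HIGH_THRESHOLD_HZ:
        # variant: the rate is already high, so it can be lowered (e.g. to
        # save power once the high rate is no longer needed)
        return LOW_THRESHOLD_HZ
    return current_hz

print(decide_refresh_rate(10))  # 60
print(decide_refresh_rate(60))  # 10
```

In the patent these two branches are triggered by different notifications from the image composition system; the sketch collapses them into one function purely for illustration.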
Step S8: the frame rate decision module sends refresh rate three to the image composition system.
Step S9: the frame rate decision module sends refresh rate three to the display driver.
Step S10: the image composition system composes the layers according to refresh rate three.
Illustratively, the image composition system sends a Vsync signal to application 1 according to refresh rate three; application 1 draws and renders the layer at the cadence of refresh rate three, and then sends the rendered image data to the image buffer queue. The image composition system composes the rendered image data in the image buffer queue at the cadence of refresh rate three to obtain a composite layer (e.g., the changed layer). For example, when refresh rate three is 60 Hz, application 1 draws and renders the layer at the cadence of refresh rate three, that is, the output frame rate of application 1 is 60 FPS.
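The cadence relationship stated above can be made concrete with a small calculation. This is a sketch of the arithmetic only (the function name is invented): when the compositor emits one Vsync per refresh period and the application renders one frame per Vsync, the application's output frame rate equals the refresh rate, and the Vsync period is the reciprocal of that rate.

```python
# Sketch of the Vsync cadence arithmetic: at refresh rate three the Vsync
# period is 1 / rate, and an app that renders one frame per Vsync outputs
# frames at exactly that rate. Values are the examples from the text.
def vsync_period_ms(refresh_rate_hz: float) -> float:
    """Time between consecutive Vsync signals, in milliseconds."""
    return 1000.0 / refresh_rate_hz

refresh_rate_three = 60  # Hz, example value from the text
print(round(vsync_period_ms(refresh_rate_three), 2))  # 16.67 (ms per frame)

refresh_rate_two = 10    # Hz, example value from the text
print(round(vsync_period_ms(refresh_rate_two), 2))    # 100.0 (ms per frame)
```

So switching from refresh rate two to refresh rate three shortens the frame interval from 100 ms to about 16.67 ms, which is why the changed layer animates without visible stutter.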
Step S11: the image composition system sends the composed layer to the hardware composer.
Step S12: the hardware composer sends the layer to the display driver.
Step S13: the display driver drives the LTPO display screen to display the layers at refresh rate three.
Illustratively, the display driver drives the LTPO display screen to display interface three, which includes the composed layer (e.g., the changed layer). For example, the display driver drives the LTPO display screen to display the interfaces shown in figs. 5, 6, and 8. The interfaces shown in figs. 5, 6, and 8 include the changed layers (e.g., layer 211, layer 214, and layer 109).
In this embodiment, when the electronic device refreshes the LTPO display screen at a lower refresh rate (e.g., refresh rate two in the above embodiment, i.e., 10 Hz), if the electronic device receives image change information, it refreshes the LTPO display screen at refresh rate three (e.g., 60 Hz) and displays interface three. The image change information indicates that a layer has changed in the interface currently displayed by the electronic device, and refresh rate three is greater than refresh rate two. Because the electronic device raises the refresh rate of the display screen while displaying the changed layer, the stutter that would otherwise occur when the electronic device displays the newly added layer can be avoided, improving the user experience.
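The behavior summarized above can be sketched end to end as a tiny state machine. This is illustrative only; the class and method names are invented, and the rates are the example values from the embodiment (refresh rate two = 10 Hz, refresh rate three = 60 Hz).

```python
# End-to-end sketch of the summarized behavior: the device idles at the low
# refresh rate; on receiving image change information it switches to the
# higher refresh rate three and displays interface three. Names invented.
class Device:
    def __init__(self):
        self.refresh_rate_hz = 10          # refresh rate two: static display
        self.interface = "interface two"   # currently displayed interface

    def on_image_change(self, new_interface: str):
        # image change information: a layer in the current interface changed,
        # so raise the refresh rate before displaying the new interface
        if self.refresh_rate_hz < 60:
            self.refresh_rate_hz = 60      # refresh rate three
        self.interface = new_interface

d = Device()
d.on_image_change("interface three")
print(d.refresh_rate_hz, d.interface)  # 60 interface three
```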
An embodiment of the present application provides an electronic device, which may include a display screen (e.g., a touch screen), a memory, and one or more processors. The display screen, the memory, and the processors are coupled. The memory stores computer program code comprising computer instructions. When the processors execute the computer instructions, the electronic device can perform the functions or steps performed by the mobile phone in the above method embodiments. For the structure of the electronic device, reference may be made to the electronic device 100 shown in fig. 11.
An embodiment of the present application also provides a chip system. As shown in fig. 16, the chip system 1800 includes at least one processor 1801 and at least one interface circuit 1802. The processor 1801 may be the processor 110 shown in fig. 11 in the above embodiment. The interface circuit 1802 may be, for example, an interface circuit between the processor 110 and the external memory 120, or an interface circuit between the processor 110 and the internal memory 121.
The processor 1801 and the interface circuit 1802 may be interconnected by wires. For example, the interface circuit 1802 may be used to receive signals from other devices (e.g., the memory of the electronic device), or to send signals to other devices (e.g., the processor 1801). Illustratively, the interface circuit 1802 may read instructions stored in the memory and send the instructions to the processor 1801. When executed by the processor 1801, the instructions may cause the electronic device to perform the steps performed by the mobile phone in the above embodiments. Of course, the chip system may also include other discrete devices, which is not specifically limited in this embodiment of the present application.
An embodiment of the present application also provides a computer-readable storage medium comprising computer instructions which, when run on the electronic device, cause the electronic device to perform the functions or steps performed by the mobile phone in the above method embodiments.
An embodiment of the present application also provides a computer program product which, when run on a computer, causes the computer to perform the functions or steps performed by the mobile phone in the above method embodiments.
It will be apparent to those skilled in the art from the above description that, for convenience and brevity, only the division into the above functional modules is illustrated. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If implemented in the form of a software functional unit and sold or used as a stand-alone product, the integrated unit may be stored in a readable storage medium. Based on this understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely a specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. A display method, characterized by being applied to an electronic device, the electronic device supporting a first refresh rate and a second refresh rate; the method comprises the following steps:
the electronic equipment controls the display screen to display a first picture at a first refresh rate; the first refresh rate is less than a first threshold; the first picture corresponds to at least one display layer;
the electronic equipment acquires image change information; the image change information is used for triggering the electronic equipment to display a second image layer, the second image layer is hidden below a third image layer, and the second image layer and the third image layer are two display image layers in the at least one display image layer;
the electronic device detects, according to the image change information, that the display layer of the first picture is updated, and controls the display screen to display a second picture at a second refresh rate; the second picture comprises the second layer; the first picture and the second picture comprise a target window, and the target window cyclically displays the third layer and the second layer in a carousel manner; the second refresh rate is greater than the first refresh rate.
2. The method of claim 1, wherein the electronic device controlling the display screen to display a second picture at a second refresh rate comprises:
and when the electronic equipment determines that the second image layer needs to be displayed, the electronic equipment controls the display screen to display a second picture at a second refresh rate.
3. The method of claim 2, wherein the electronic device determining that the second layer needs to be displayed comprises:
the electronic equipment compares the attribute information of the second layer with the attribute information of the third layer to determine that the second layer needs to be displayed; the attribute information includes at least one of a layer size, a layer position, or a layer transparency.
4. The method of claim 1, wherein the electronic device comprises: an image cache queue;
the electronic device obtains image change information, including:
the image cache queue acquires the image change information;
the electronic device detecting, according to the image change information, that the display layer of the first picture is updated includes:
and the image cache queue detects the update of the display layer of the first picture according to the image change information.
5. A method according to claim 2 or 3, wherein the electronic device comprises an image synthesis system; the electronic device determining that the second layer needs to be displayed includes:
and the image synthesis system compares the attribute information of the second layer with the attribute information of the third layer to determine that the second layer needs to be displayed.
6. The method of any of claims 1-4, wherein the electronic device controlling the display to display the first picture at the first refresh rate comprises:
the electronic equipment controls the display screen to display a third picture at a third refresh rate;
and the electronic equipment determines that the third picture is unchanged within a preset time period, and controls the display screen to display the first picture at the first refresh rate.
7. An electronic device, wherein the electronic device supports a first refresh rate and a second refresh rate; the electronic device includes a display screen, a memory, and one or more processors; the display screen, the memory and the processor are coupled;
the memory is for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any of claims 1-6.
8. A computer-readable storage medium comprising computer instructions; the computer instructions, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-6.
CN202210181930.6A 2022-02-25 2022-02-25 Display method, electronic equipment and storage medium Active CN114661263B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210181930.6A CN114661263B (en) 2022-02-25 2022-02-25 Display method, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN114661263A CN114661263A (en) 2022-06-24
CN114661263B true CN114661263B (en) 2023-06-20

Family

ID=82027023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210181930.6A Active CN114661263B (en) 2022-02-25 2022-02-25 Display method, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114661263B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116684677A (en) * 2022-09-20 2023-09-01 荣耀终端有限公司 Electronic equipment dynamic effect playing method, electronic equipment and storage medium
CN115690269B (en) * 2022-10-31 2023-11-07 荣耀终端有限公司 View object processing method and electronic equipment
CN117698554B (en) * 2024-02-06 2024-04-16 深圳市欧冶半导体有限公司 Projection control method, device, equipment and system of intelligent car lamp system

Citations (1)

Publication number Priority date Publication date Assignee Title
CN109242943A (en) * 2018-08-21 2019-01-18 腾讯科技(深圳)有限公司 A kind of image rendering method, device and image processing equipment, storage medium

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US7987431B2 (en) * 1999-10-29 2011-07-26 Surfcast, Inc. System and method for simultaneous display of multiple information sources
CN103473051B (en) * 2013-09-02 2017-03-15 小米科技有限责任公司 A kind of method and apparatus for saving power consumption of terminal
CN106933328B (en) * 2017-03-10 2020-04-17 Oppo广东移动通信有限公司 Method and device for controlling frame rate of mobile terminal and mobile terminal
CN109388461A (en) * 2018-09-27 2019-02-26 青岛海信电器股份有限公司 Display methods, device and the display terminal of object are identified in screen-picture screenshot
CN111767013A (en) * 2020-06-01 2020-10-13 Oppo(重庆)智能科技有限公司 Control method, control device, electronic device, computer-readable storage medium
CN112652263A (en) * 2020-12-25 2021-04-13 深圳传音控股股份有限公司 Refreshing method, terminal and storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant