CN114661263A - Display method, electronic equipment and storage medium - Google Patents

Display method, electronic device and storage medium

Info

Publication number
CN114661263A
Authority
CN
China
Prior art keywords
layer
display
electronic device
refresh rate
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210181930.6A
Other languages
Chinese (zh)
Other versions
CN114661263B (en)
Inventor
李时进
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210181930.6A priority Critical patent/CN114661263B/en
Publication of CN114661263A publication Critical patent/CN114661263A/en
Application granted granted Critical
Publication of CN114661263B publication Critical patent/CN114661263B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/147 Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides a display method, an electronic device, and a storage medium, relates to the field of terminal technologies, and is used to solve the problem of stuttering when a dynamically changing interface is displayed on the electronic device. The method includes the following steps: the electronic device controls the display screen to display a first picture at a first refresh rate, where the first refresh rate is less than a first threshold and the first picture corresponds to at least one display layer; when the electronic device detects that a display layer of the first picture is updated, the electronic device controls the display screen to display a second picture at a second refresh rate, where the second picture includes the updated display layer and the second refresh rate is greater than the first refresh rate.

Description

Display method, electronic device and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a display method, an electronic device, and a storage medium.
Background
A low-temperature polycrystalline oxide (LTPO) display screen is formed by adding an extra oxide layer to the substrate of an organic light-emitting diode (OLED) display screen, which reduces the energy needed to drive the pixels and therefore reduces the display power consumption of the electronic device.
The LTPO display screen can support multiple refresh rates (e.g., from 120 hertz (Hz) down to 1 Hz), and its refresh rate can be dynamically adjusted according to the application the electronic device is running. When the electronic device runs different applications, the refresh rate of its LTPO display screen differs. For example, when the electronic device runs application 1, the refresh rate of the LTPO display screen is refresh rate A (e.g., 60 Hz); when the electronic device runs application 2, the refresh rate is refresh rate B (e.g., 120 Hz). In the related art, when the electronic device runs an application whose interface changes dynamically, the display may stutter while the dynamically changing interface of the application is shown.
Disclosure of Invention
The embodiments of the application provide a display method, an electronic device, and a storage medium, which are used to solve the problem of stuttering when the electronic device displays a dynamically changing interface.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
in a first aspect, a display method is provided, which is applied to an electronic device, wherein the electronic device supports a first refresh rate and a second refresh rate; the method comprises the following steps: the electronic equipment controls the display screen to display a first picture at a first refresh rate; the first refresh rate is less than a first threshold; the first picture corresponds to at least one display layer; when the electronic equipment detects that the display layer of the first picture is updated, the electronic equipment controls the display screen to display a second picture at a second refresh rate; the second picture comprises an updated display layer, and the second refresh rate is greater than the first refresh rate.
Based on the first aspect, since the first refresh rate is less than the first threshold, when the electronic device, while controlling the display screen to display the first picture at the first refresh rate, detects that a display layer of the first picture is updated (that is, the display layer of the first picture has changed), the electronic device controls the display screen to display a second picture at a second refresh rate, where the second picture includes the updated display layer. Because the second refresh rate is greater than the first refresh rate, the electronic device raises the refresh rate of the display screen when displaying the updated display layer, thereby solving the problem of stuttering when the updated display layer is displayed.
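For illustration only, the behavior described in this aspect can be sketched as a small controller in Python; the rates, threshold value, and layer names below are assumptions for the sketch, not values specified in this application:

```python
LOW_REFRESH_HZ = 1       # first refresh rate, below the first threshold
HIGH_REFRESH_HZ = 120    # second refresh rate
FIRST_THRESHOLD_HZ = 10  # first threshold (assumed value)

class DisplayController:
    """Minimal sketch: raise the refresh rate when a display layer of the
    currently shown picture is updated."""

    def __init__(self, layers):
        self.refresh_rate = LOW_REFRESH_HZ
        self.layers = set(layers)  # display layers of the first picture

    def on_layers_changed(self, new_layers):
        new_layers = set(new_layers)
        if new_layers != self.layers and self.refresh_rate < FIRST_THRESHOLD_HZ:
            # A layer was added, removed, or replaced: show the second
            # picture at the higher second refresh rate.
            self.refresh_rate = HIGH_REFRESH_HZ
        self.layers = new_layers

ctrl = DisplayController({"wallpaper", "status_bar"})
ctrl.on_layers_changed({"wallpaper", "status_bar", "volume_control"})
print(ctrl.refresh_rate)  # 120
```

Any change in the set of display layers while the screen runs below the first threshold triggers the switch to the second refresh rate.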
In a possible implementation manner of the first aspect, the detecting, by an electronic device, a display layer update of a first screen includes: the electronic equipment detects a first instruction; the first instruction is used for triggering the electronic equipment to display a first layer, and the first layer is different from at least one display layer; the electronic equipment responds to the first instruction and detects that the display layer of the first picture is updated.
In this implementation, when the electronic device detects the first instruction, because the first instruction is used to trigger the electronic device to display the first layer, and the first layer is different from the at least one display layer (that is, it is a newly added display layer), the electronic device, in response to the first instruction, detects that the display layer of the first picture is updated (that is, a display layer has been newly added), which helps reduce the power consumption of the device.
In a possible implementation manner of the first aspect, the controlling, by the electronic device, the display screen to display the second picture at the second refresh rate includes: the electronic equipment creates a first layer according to the attribute information of the first layer and controls the display screen to display a second picture at a second refresh rate; the attribute information of the first layer includes at least one of a window size, a window position, or a window name of the first layer.
In this implementation, the electronic device may create the first layer according to the attribute information of the first layer, and after the first layer is created, control the display screen to display the second picture at the second refresh rate, so that the stuttering that would otherwise occur when the electronic device displays the first layer can be further reduced.
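As an informal sketch of this implementation, creating the first layer from its attribute information might look like the following; the field names and defaults are assumptions for illustration:

```python
def create_first_layer(attribute_info):
    """Build a layer object from its attribute info; per the implementation
    above, the info carries at least one of the window size, window position,
    or window name of the first layer."""
    return {
        "window_size": attribute_info.get("window_size", (0, 0)),
        "window_position": attribute_info.get("window_position", (0, 0)),
        "window_name": attribute_info.get("window_name", ""),
    }

layer = create_first_layer({"window_name": "volume_control",
                            "window_size": (300, 80)})
print(layer["window_name"])  # volume_control
```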
In a possible implementation manner of the first aspect, the detecting, by an electronic device, a display layer update of a first screen includes: the electronic equipment acquires image change information; the image change information is used for triggering the electronic equipment to display a second image layer; the second layer is hidden under the third layer; the second layer and the third layer are two display layers in at least one display layer; and the electronic equipment detects that the display layer of the first picture is updated according to the image change information.
In this implementation, when the electronic device acquires the image change information, the electronic device is triggered to display the second layer, because the image change information is used to trigger display of the second layer, which is hidden under the third layer; the second layer and the third layer are two of the at least one display layer. That is to say, the image change information can trigger the electronic device to switch from displaying the third layer to displaying the second layer, meaning the display layer of the electronic device has changed; the electronic device can therefore detect, from the image change information, that the display layer of the first picture is updated, which helps reduce the power consumption of the device.
In a possible implementation manner of the first aspect, the controlling, by the electronic device, the display screen to display the second picture at the second refresh rate includes: and when the electronic equipment determines that the second image layer needs to be displayed, the electronic equipment controls the display screen to display a second picture at a second refresh rate.
In this implementation, when the electronic device determines that the second layer needs to be displayed, the electronic device controls the display screen to display the second picture at the second refresh rate, so the power consumption of the device can be reduced and the stuttering problem can be better solved.
In a possible implementation manner of the first aspect, the determining, by the electronic device, that the second image layer needs to be displayed includes: the electronic equipment compares the attribute information of the second layer with the attribute information of the third layer to determine that the second layer needs to be displayed; the attribute information includes at least one of a layer size, a layer position, or a layer transparency.
In this implementation, the electronic device compares the attribute information of the second layer (i.e., the new layer) with the attribute information of the third layer (i.e., the old layer) to determine that the second layer needs to be displayed, which reduces the risk of wrongly deciding whether to display the second layer.
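A minimal sketch of the attribute comparison, assuming that any difference in layer size, position, or transparency means the second layer must be displayed (the exact decision rule is not specified in this application):

```python
def second_layer_needs_display(second_attrs, third_attrs):
    """Compare the new (second) layer against the old (third) layer on the
    attribute fields named above; any difference implies the hidden second
    layer must be brought to display."""
    keys = ("layer_size", "layer_position", "layer_transparency")
    return any(second_attrs.get(k) != third_attrs.get(k) for k in keys)

print(second_layer_needs_display(
    {"layer_transparency": 0.0},
    {"layer_transparency": 1.0}))  # True
```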
In one possible implementation manner of the first aspect, an electronic device includes: an image synthesis system; the electronic equipment creates a first layer according to the attribute information of the first layer, and the creating comprises the following steps: and the image synthesis system creates the first image layer according to the attribute information of the first image layer.
In one possible implementation manner of the first aspect, an electronic device includes: an image buffer queue; the electronic equipment acquires image change information, and comprises the following steps: the image buffer queue acquires image change information; the electronic equipment detects that the display layer of the first picture is updated according to the image change information, and the method comprises the following steps: and the image cache queue detects that the display layer of the first picture is updated according to the image change information.
In one possible implementation form of the first aspect, the electronic device comprises an image composition system; the electronic device determines that the second layer needs to be displayed, and the determining includes: and the image synthesis system compares the attribute information of the second layer with the attribute information of the third layer to determine that the second layer needs to be displayed.
In a possible implementation manner of the first aspect, the first layer is used to display a volume adjustment control; or the first layer is used to display a screenshot; or the first layer is used to display a pop-up window; or the first layer is used to display an unread-message prompt; or the first layer is used to display an in-app red envelope.
In this implementation, the first layer is used to display a volume adjustment control, a screenshot, a pop-up window, an unread-message prompt, or an in-app red envelope; in each of these cases, a display layer is newly added to the displayed picture without the user touching the display screen. Therefore, when no user touch occurs and the electronic device detects that the display layer of the first picture is updated, raising the refresh rate of the display screen further relieves the stuttering problem and improves user experience.
In one possible implementation manner of the first aspect, the first screen and the second screen include a target window; and circularly displaying a third layer and the second layer by the target window in a carousel mode.
In this implementation, the target window cyclically displays the third layer and the second layer in carousel fashion, that is, the picture shown in the target window changes periodically; in this scenario, the electronic device can raise the refresh rate of the display screen while the target window cycles through different display layers, which further improves user experience.
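The carousel behavior can be illustrated with a simple cycle; the layer names below are placeholders, and each switch between layers counts as a display-layer update that can trigger the refresh-rate increase:

```python
import itertools

# The target window cycles the third and second layers in carousel fashion.
carousel = itertools.cycle(["third_layer", "second_layer"])
shown = [next(carousel) for _ in range(4)]
print(shown)  # ['third_layer', 'second_layer', 'third_layer', 'second_layer']
```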
In one possible implementation manner of the first aspect, the controlling, by the electronic device, the display screen to display the first picture at the first refresh rate includes: the electronic equipment controls the display screen to display a third picture at a third refresh rate; and the electronic equipment determines that the third picture does not change within the preset time length, and controls the display screen to display the first picture at a first refresh rate.
In this implementation, when the electronic device determines that the third picture does not change within the preset duration, the electronic device controls the display screen to display the first picture at the first refresh rate; since the first refresh rate is less than the first threshold (that is, it is a low refresh rate), the power consumption of the device can be reduced.
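This fallback to a low refresh rate after a period without picture changes can be sketched as follows; the preset duration and rate values are illustrative assumptions:

```python
def select_refresh_rate(static_seconds, preset_seconds=5.0,
                        first_rate=1, third_rate=60):
    """Once the third picture has stayed unchanged for the preset duration,
    fall back from the third refresh rate to the low first refresh rate."""
    return first_rate if static_seconds >= preset_seconds else third_rate

print(select_refresh_rate(10.0))  # 1
print(select_refresh_rate(1.0))   # 60
```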
In a possible implementation manner of the first aspect, the detecting, by the electronic device, the first instruction includes: the electronic equipment responds to the operation of a user on the target control and detects a first instruction; the target control comprises a volume key or a screen locking key; or after the electronic equipment receives the message of the first application, the first instruction is detected.
In this implementation, the electronic device may detect the first instruction in response to the user operating the target control, or after receiving a message of the first application. Because the target control includes a volume key or a screen-lock key, the first instruction is an instruction detected without the user touching the display screen; by detecting the display-layer change from the first instruction and raising the refresh rate of the display screen, user experience is improved.
In a second aspect, an electronic device is provided, which has the function of implementing the first aspect. The function can be realized by hardware, and can also be realized by executing corresponding software by hardware. The hardware or software includes one or more modules corresponding to the functions described above.
In a third aspect, an electronic device is provided, the electronic device supporting a first refresh rate and a second refresh rate; the electronic device includes a display screen, a memory, and one or more processors; the display screen, the memory and the processor are coupled; the memory is for storing computer program code; the computer program code includes computer instructions; the computer instructions, when executed by the processor, cause the electronic device to perform the steps of: the electronic equipment controls the display screen to display a first picture at a first refresh rate; the first refresh rate is less than a first threshold; the first picture corresponds to at least one display layer; when the electronic equipment detects that the display layer of the first picture is updated, the electronic equipment controls the display screen to display a second picture at a second refresh rate; the second picture comprises an updated display layer, and the second refresh rate is greater than the first refresh rate.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is specifically caused to perform the following steps: the electronic equipment detects a first instruction; the first instruction is used for triggering the electronic equipment to display a first layer, and the first layer is different from at least one display layer; the electronic equipment responds to the first instruction and detects that the display layer of the first picture is updated.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is specifically caused to perform the following steps: the electronic equipment creates a first layer according to the attribute information of the first layer and controls the display screen to display a second picture at a second refresh rate; the attribute information of the first image layer includes at least one of a window size, a window position, or a window name of the first image layer.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is specifically caused to perform the following steps: the electronic equipment acquires image change information; the image change information is used for triggering the electronic equipment to display a second image layer; the second layer is hidden under the third layer; the second layer and the third layer are two display layers in at least one display layer; and the electronic equipment detects that the display layer of the first picture is updated according to the image change information.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is specifically caused to perform the following steps: and when the electronic equipment determines that the second image layer needs to be displayed, the electronic equipment controls the display screen to display a second picture at a second refresh rate.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is specifically caused to perform the following steps: the electronic equipment compares the attribute information of the second layer with the attribute information of the third layer to determine that the second layer needs to be displayed; the attribute information includes at least one of a layer size, a layer position, or a layer transparency.
In one possible implementation form of the third aspect, the electronic device includes an image composition system; when the processor executes the computer instructions, the electronic device is caused to specifically execute the following steps: and the image synthesis system creates the first image layer according to the attribute information of the first image layer.
In a possible implementation manner of the third aspect, the electronic device includes an image buffer queue; when the processor executes the computer instructions, the electronic device is caused to specifically execute the following steps: the image buffer queue acquires image change information; when the processor executes the computer instructions, the electronic device is specifically caused to perform the following steps: and the image cache queue detects that the display layer of the first picture is updated according to the image change information.
In one possible implementation form of the third aspect, the electronic device includes an image composition system; when the processor executes the computer instructions, the electronic device is caused to specifically execute the following steps: and the image synthesis system compares the attribute information of the second image layer with the attribute information of the third image layer to determine that the second image layer needs to be displayed.
In a possible implementation manner of the third aspect, the first layer is used to display a volume adjustment control; or the first layer is used to display a screenshot; or the first layer is used to display a pop-up window; or the first layer is used to display an unread-message prompt; or the first layer is used to display an in-app red envelope.
In a possible implementation manner of the third aspect, the first screen and the second screen include a target window; and circularly displaying a third layer and the second layer by the target window in a carousel mode.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is specifically caused to perform the following steps: the electronic equipment controls the display screen to display a third picture at a third refresh rate; and the electronic equipment determines that the third picture is not changed within a preset time length, and controls the display screen to display the first picture at a first refresh rate.
In a possible implementation manner of the third aspect, when the processor executes the computer instructions, the electronic device is specifically caused to perform the following steps: the electronic equipment responds to the operation of a user on the target control and detects a first instruction; the target control comprises a volume key or a screen locking key; or after the electronic equipment receives the message of the first application, the first instruction is detected.
In a fourth aspect, a computer-readable storage medium is provided, in which computer instructions are stored, and when the computer instructions are executed on a computer, the computer is enabled to execute the display method according to any one of the first aspect.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the display method of any of the first aspects above.
For technical effects brought by any one of the design manners in the second aspect to the fifth aspect, reference may be made to technical effects brought by different design manners in the first aspect, and details are not described herein.
Drawings
Fig. 1a is a first schematic interface diagram illustrating refresh rate switching of a display screen according to an embodiment of the present disclosure;
fig. 1b is a schematic view of an interface for switching a refresh rate of a display screen according to an embodiment of the present application;
fig. 1c is a schematic view of a display layer corresponding to a display image of a display screen according to an embodiment of the present application;
fig. 2 is a first schematic diagram of an application scenario provided in an embodiment of the present application;
fig. 3 is a schematic diagram of a second application scenario provided in the embodiment of the present application;
fig. 4 is a third schematic diagram of an application scenario provided in the embodiment of the present application;
fig. 5 is a fourth schematic diagram of an application scenario provided in the embodiment of the present application;
fig. 6 is a schematic diagram five of an application scenario provided in the embodiment of the present application;
fig. 7 is a schematic interface diagram of a home page of a video application according to an embodiment of the present application;
fig. 8 is a sixth schematic view of an application scenario provided in an embodiment of the present application;
fig. 9 is a seventh schematic diagram of an application scenario provided in the embodiment of the present application;
fig. 10 is an eighth schematic view of an application scenario provided in an embodiment of the present application;
fig. 11 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 12 is a software framework diagram of an electronic device according to an embodiment of the present application;
fig. 13 is a schematic flowchart of an electronic device interface display process according to an embodiment of the present disclosure;
fig. 14 is a first flowchart illustrating a display method according to an embodiment of the present disclosure;
fig. 15 is a flowchart illustrating a second display method according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of a chip system according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings. In the description of the present application, unless otherwise stated, "/" indicates an "or" relationship between the associated objects; for example, A/B may indicate A or B. "And/or" merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone, where A and B may be singular or plural.
Also, in the description of the present application, "a plurality" means two or more unless otherwise specified. "At least one of the following" or similar expressions refer to any combination of the listed items, including any combination of single or plural items. For example, "at least one of a, b, or c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be single or multiple.
In addition, to clearly describe the technical solutions of the embodiments of the present application, terms such as "first" and "second" are used to distinguish identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that the terms "first", "second", etc. do not denote any order, quantity, or importance. Also, in the embodiments of the present application, the words "exemplary" or "such as" are used to mean serving as an example, instance, or illustration; any embodiment or design described as "exemplary" or "such as" is not to be construed as preferred or more advantageous than other embodiments or designs. Rather, these words are intended to present relevant concepts in a concrete fashion for ease of understanding.
For ease of understanding, the related terms and concepts related to the embodiments of the present application will be described below.
(1) Refresh rate of display screen
The refresh rate of the display screen refers to the number of times a picture of the display screen is refreshed per second, and the unit is Hz.
(2) Output frame rate
The output frame rate is the number of frames of image data per second that an application outputs to the display screen, in frames per second (FPS). A frame is a single still image, the smallest unit of an animation; consecutive frames form the animation. In some embodiments, the output frame rate may differ depending on the service scenario in which the application runs. For example, when the application runs in a static scene (e.g., the interface it displays is still), its output frame rate may be 30 FPS; when it runs in a dynamic scene (e.g., video playback), its output frame rate may be 60 FPS.
(3) Display frame rate
The display frame rate refers to the frame rate of the actual display of the application program on the display screen, and the unit of the frame rate is also FPS.
The display frame rate is determined by the refresh rate and the output frame rate of the display screen.
When the output frame rate is less than the refresh rate of the display screen, the display frame rate is approximately equal to the output frame rate. For example, suppose the output frame rate is 30 FPS and the refresh rate of the display screen is 60 Hz. To keep the picture continuous, the 60 pictures the display shows each second are padded with repeated frames. Such repeated frames are not counted in the display frame rate, so only 30 effective frames are actually shown, i.e., the display frame rate is about 30 FPS.
When the output frame rate is equal to the refresh rate of the display screen, the display frame rate is approximately equal to the output frame rate. For example, the output frame rate is 90FPS and the refresh rate of the display screen is 90 Hz. Then exactly 90 frames per second are output by the application, and the display screen refreshes and displays 90 frames per second, with a display frame rate of about 90 FPS.
When the output frame rate is greater than the refresh rate of the display screen, the display frame rate is approximately equal to the refresh rate. For example, suppose the output frame rate is 120 FPS and the refresh rate is 90 Hz. The application outputs 120 frames per second, but the display screen can refresh at most 90 of them per second and cannot display all 120. The electronic device discards or merges some of the 120 frames output each second and ultimately displays 90 frames per second, i.e., the display frame rate is about 90 FPS.
It should be noted that the display frame rate is approximately, not exactly, equal to the output frame rate or the refresh rate of the display screen: in practice, the timing of frame output and screen refresh cannot always be kept perfectly aligned. An output frame that misses a screen refresh may be dropped, so the display frame rate may fluctuate slightly. For example, even with an output frame rate of 90 FPS and a refresh rate of 90 Hz, the actual display frame rate may fluctuate between 80 FPS and 90 FPS.
In the electronic device, the output frame rate of most applications changes along with the system-set refresh rate of the display screen. For example, when the refresh rate of the display screen is set to 60 Hz, the output frame rate of such an application is adjusted to 60 FPS; when the refresh rate is set to 90 Hz, the output frame rate is adjusted to 90 FPS. However, some applications, such as game applications, have a relatively independent output frame rate: they use their own timers, and their output frame rate is not controlled by the system-set refresh rate of the display screen. For example, suppose one of these applications is set to output at 120 FPS. If the refresh rate of the display screen is set to 60 Hz, the output frame rate of this application is not adjusted to 60 FPS following the system's refresh rate change; it remains 120 FPS. If the refresh rate is set to 144 Hz, the output frame rate likewise remains 120 FPS. In general, the higher the output frame rate, the higher the display frame rate; but if the refresh rate of the display screen is insufficient, the final display frame rate can at most reach the refresh rate, no matter how high the output frame rate is. For example, if the refresh rate of a display screen is 60 Hz, the final display frame rate can be at most 60 FPS even if the application outputs 100 FPS.
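The relationships above reduce to taking the minimum of the output frame rate and the refresh rate. The following Python sketch is illustrative only (the function name is ours, and the small fluctuations noted above are ignored):

```python
def display_frame_rate(output_fps: float, refresh_hz: float) -> float:
    """Approximate display frame rate.

    Repeated frames padded in to cover a slow source are not counted,
    and frames the panel cannot refresh in time are dropped or merged,
    so the displayed rate is bounded by both values.
    """
    return min(output_fps, refresh_hz)

# Examples from the text:
print(display_frame_rate(30, 60))    # 30 FPS source, 60 Hz panel  -> 30
print(display_frame_rate(120, 90))   # 120 FPS source, 90 Hz panel -> 90
print(display_frame_rate(100, 60))   # 100 FPS game, 60 Hz panel   -> 60
```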
It should be noted that the electronic device includes a display screen, an important component of the electronic device. In some embodiments, the display screen includes a substrate and a display unit disposed on the substrate, and the display unit includes thin film transistors (TFTs). In the embodiments of this application, the display screen of the electronic device is an LTPO display screen, meaning that the TFTs in its display unit are LTPO TFTs. LTPO is a hybrid of low temperature polycrystalline silicon (LTPS) and indium gallium zinc oxide (IGZO); in other words, an LTPO display screen is formed by replacing some of the TFTs in the display unit with IGZO TFTs. The low leakage of IGZO TFTs allows the LTPO display screen to reach lower refresh rates, reducing the power consumed by the electronic device during display, while avoiding the display anomalies that LTPS leakage causes at low refresh rates. In addition, IGZO TFTs have better uniformity, which reduces problems such as the dirty-screen effect and color cast when the LTPO display screen operates at low brightness.
An LTPO display screen can support an adaptive refresh rate, i.e., multiple refresh rates (e.g., from 120 Hz down to 1 Hz), and its refresh rate can be adjusted dynamically according to the application the electronic device is running. When the electronic device runs different applications, the refresh rate of its LTPO display screen differs. For example, when the electronic device runs application 1, the refresh rate of the LTPO display screen is refresh rate A (e.g., 60 Hz); when the electronic device runs application 2, the refresh rate is refresh rate B (e.g., 120 Hz).
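A per-application refresh-rate lookup of the kind just described can be sketched as follows (the table contents and names are hypothetical examples, not the patent's implementation):

```python
# Hypothetical table mapping the foreground application to the
# refresh rate used for the LTPO display screen.
APP_REFRESH_HZ = {
    "application 1": 60,   # refresh rate A in the text
    "application 2": 120,  # refresh rate B in the text
}

def refresh_rate_for(app_name: str, default_hz: int = 60) -> int:
    """Return the refresh rate to use while app_name is in the foreground."""
    return APP_REFRESH_HZ.get(app_name, default_hz)

print(refresh_rate_for("application 2"))  # 120
```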
Taking the electronic device running application 1 as an example, in some embodiments, after the electronic device starts application 1, it refreshes the LTPO display screen at the initial refresh rate corresponding to application 1 (60 Hz) and displays the interface of application 1. If the interface of application 1 does not change within a certain period of time, the refresh rate of the LTPO display screen is reduced (e.g., to 10 Hz or 1 Hz) to lower the power consumed by the electronic device during display. However, when the interface of application 1 changes dynamically, if the refresh rate of the LTPO display screen is still low (e.g., 10 Hz), the electronic device may stutter when displaying the dynamically changing interface of application 1.
A dynamic change of the interface of application 1 means that a new layer is added to the interface, or an existing layer of the interface changes. A layer is composed of a number of pixels, and one or more superimposed layers form the whole displayed picture. For example, each layer may be thought of as a sheet of "transparent glass": if nothing is drawn on the glass, it is a completely transparent empty layer (or transparent layer); if an image is drawn on it, it may be called a non-transparent layer.
For example, scenarios with a newly added layer may include adjusting the volume of the electronic device, taking a screenshot, displaying a notification message, and displaying a sidebar (or pull-down notification bar). Scenarios with a changed layer may include, for example, displaying a red packet (or an unread-message prompt) of a communication application while the electronic device runs that application, and switching the carousel page on the main interface of a video application while the electronic device runs that application.
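The "transparent glass" model above can be sketched as a minimal compositor, where `None` marks a transparent pixel (a simplification for illustration only; a real compositor also handles alpha blending, z-order, and so on):

```python
from typing import List, Optional

Pixel = Optional[str]   # None = transparent, otherwise a colour name
Layer = List[Pixel]     # a single row of pixels, for simplicity

def composite(layers: List[Layer]) -> Layer:
    """Overlay layers bottom-to-top: at each position, the topmost
    non-transparent pixel wins, like stacked sheets of glass."""
    result: Layer = [None] * len(layers[0])
    for layer in layers:              # bottom layer first, top layer last
        for i, px in enumerate(layer):
            if px is not None:
                result[i] = px
    return result

background = ["white", "white", "white"]   # non-transparent layer
popup      = [None,    "blue",  None]      # mostly transparent layer
print(composite([background, popup]))      # ['white', 'blue', 'white']
```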
Taking the electronic device being a mobile phone and application 1 being a communication application as an example, the mobile phone starts the communication application and displays the interface 10A shown in (1) in fig. 1a, where the display screen may display the interface 10A at a refresh rate of 60 Hz. When the interface 10A does not change within a certain time (e.g., 30 s), the display screen of the mobile phone displays the interface 10B shown in (2) in fig. 1a, where the display screen may display the interface 10B at a refresh rate of 10 Hz. An example of the interface 10B with a newly added layer (e.g., a pop-up window) is the interface 10C shown in (3) in fig. 1a. The interface 10C includes a pop-up window, but the display screen still displays the interface 10C at a refresh rate of 10 Hz, so the display stutters while the added layer is being shown.
Based on this, an embodiment of the present application provides a display method applied to an electronic device that supports different refresh rates. After the electronic device starts application 1, it displays the interface of application 1 at the initial refresh rate of application 1 (e.g., 60 Hz). When the interface of application 1 does not change within a certain time (or preset duration), the electronic device may reduce the refresh rate of the display screen (e.g., to 10 Hz). When the electronic device determines that the interface of application 1 changes dynamically (for example, a new layer is added to the interface, or an existing layer changes), the electronic device increases the refresh rate of the display screen (e.g., to 60 Hz). This solves the problem of the electronic device stuttering while displaying the changed interface of application 1; that is, the electronic device does not stutter while displaying the newly added or changed layer.
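The behaviour just described can be summarized in a small state sketch (names, timings, and structure are ours, purely to illustrate the idea; this is not the patent's actual implementation):

```python
class RefreshRateController:
    """Drop to a low refresh rate once the interface has been static
    for a preset duration; restore the initial rate as soon as a
    layer is added or changed."""

    def __init__(self, initial_hz: int = 60, idle_hz: int = 10,
                 idle_timeout_s: float = 30.0):
        self.initial_hz = initial_hz
        self.idle_hz = idle_hz
        self.idle_timeout_s = idle_timeout_s
        self.current_hz = initial_hz
        self.static_for_s = 0.0

    def tick(self, elapsed_s: float, interface_changed: bool) -> int:
        """Advance time by elapsed_s and report the refresh rate to use."""
        if interface_changed:                 # new layer or changed layer
            self.static_for_s = 0.0
            self.current_hz = self.initial_hz
        else:
            self.static_for_s += elapsed_s
            if self.static_for_s >= self.idle_timeout_s:
                self.current_hz = self.idle_hz
        return self.current_hz

ctrl = RefreshRateController()
print(ctrl.tick(30.0, False))  # static for 30 s -> 10 (idle rate)
print(ctrl.tick(1.0, True))    # pop-up appears  -> 60 (restored)
```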
Still taking the electronic device being a mobile phone and application 1 being a communication application as an example, the mobile phone starts the communication application and displays the interface 20A shown in (1) in fig. 1b, where the display screen may display the interface 20A at a refresh rate of 60 Hz. When the interface 20A does not change within a certain time (e.g., 30 s), the display screen of the mobile phone displays the interface 20B shown in (2) in fig. 1b, where the display screen may display the interface 20B at a refresh rate of 10 Hz. An example of the interface 20B with a newly added layer (e.g., a pop-up window) is the interface 20C shown in (3) in fig. 1b. The interface 20C includes a pop-up window, and the display screen displays the interface 20C at a refresh rate of 60 Hz, which solves the problem of the electronic device stuttering when displaying the interface 20C; that is, the electronic device does not stutter while displaying the interface 20C (or displaying the pop-up window).
In some embodiments, when the electronic device runs application 1, the electronic device controls the display screen to display interface one (the picture displayed in interface one may be, for example, the third picture) at refresh rate one (also called the third refresh rate). When interface one does not change within a preset duration, the electronic device controls the display screen to display interface two (whose picture may be, for example, the first picture) at refresh rate two (also called the first refresh rate); the picture of interface two (the first picture) corresponds to at least one display layer. Then, when the electronic device determines that interface two changes dynamically (also described as an update to a display layer of the first picture), the electronic device controls the display screen to display interface three (whose picture may be, for example, the second picture) at refresh rate three (also called the second refresh rate).
A dynamic change of interface two means that the picture displayed by interface three (i.e., the second picture) includes a new layer (i.e., an updated display layer) relative to the picture displayed by interface two (i.e., the first picture); or that a layer in the picture displayed by interface three has changed relative to the picture displayed by interface two. Additionally, in this embodiment, refresh rate three (i.e., the second refresh rate) is greater than refresh rate two (i.e., the first refresh rate). For example, refresh rate three is 60 Hz and refresh rate two is 10 Hz.
In this embodiment, the magnitude relationship between refresh rate one and refresh rate three is not limited: refresh rate one may be greater than, less than, or equal to refresh rate three.
In combination with the above embodiments, when the refresh rate of the display screen differs, the frame rate at which the application is actually displayed on the display screen also differs (i.e., the display frame rate of application 1 differs); and when application 1 runs different service scenarios, its output frame rate also differs. For example, taking refresh rate one as 60 Hz, when the electronic device controls the display screen to display interface one at refresh rate one, the output frame rate of application 1 is 60 FPS and its display frame rate is 60 FPS. As another example, taking refresh rate two as 10 Hz, when the electronic device controls the display screen to display interface two at refresh rate two, the output frame rate of application 1 is 10 FPS and its display frame rate is 10 FPS. As another example, taking refresh rate three as 60 Hz, when the electronic device controls the display screen to display interface three at refresh rate three, the output frame rate of application 1 is 60 FPS and its display frame rate is 60 FPS.
Taking the electronic device being a mobile phone and application 1 being a communication application as an example, the dynamic change of interface two is illustrated below with reference to the drawings of the specification.
Illustratively, interface two may be an interface used when the user chats with other users through the communication application. For example, as shown in fig. 1c, interface two includes a plurality of chat windows, such as a chat window with user 1, a chat window with user 2, a chat window with user 3, a chat window with user 4, a chat window with user 5, a chat window with user 6, and a chat window with user n. The picture displayed by interface two is composited by superimposing a plurality of layers. For example, as shown in fig. 1c, the picture displayed by interface two is composited by superimposing layer 101, layer 102, and layer 103 (where layer 101, layer 102, and layer 103 may be, for example, the display layers). It should be noted that the picture displayed by interface two may further include other transparent or non-transparent layers in the composition, which is not limited in the embodiments of the present application. In some embodiments, when the mobile phone receives a target instruction (or first instruction), the mobile phone determines that interface two changes dynamically and controls the display screen to display interface three at refresh rate three. The target instruction may be a touch operation input by the user, or a push message (or message) of another application (e.g., a first application); the target instruction is not limited in the embodiments of the present application. The touch operation may be an operation on a key of the mobile phone or an operation on the screen (or display screen) of the mobile phone; it may be one of a key operation, a touch operation, a voice operation, or a gesture operation, for example, a click operation or a slide operation.
It should be noted that the above is merely an example of touch operations and does not limit the application; other suitable touch operations also fall within the scope of the embodiments of the application.
It should be noted that the message of the first application may be a message sent by an application installed on the mobile phone itself (i.e., this device); alternatively, it may be a message sent by an application installed on another mobile phone. For example, the first application sends a message to this device via a server.
Illustratively, the mobile phone displays the interface 201 shown in (1) in fig. 2, where the interface 201 is composited by superimposing layer 101, layer 102, and layer 103. On this basis, the mobile phone displays the interface 203 shown in (2) in fig. 2 in response to the user's operation of the volume key 202. The interface 203 is composited by superimposing layer 101, layer 102, layer 103, and layer 204. As can be seen from fig. 2, the interface 203 adds layer 204 relative to the interface 201. The picture displayed in layer 204 may be, for example, a volume-up or volume-down picture.
As another example, the mobile phone displays the interface 201 shown in (1) in fig. 3, composited by superimposing layer 101, layer 102, and layer 103. On this basis, the mobile phone displays the interface 206 shown in (2) in fig. 3 in response to the user's operation of the lock key 205 (e.g., the user long-presses the lock key 205, or presses the lock key 205 and a volume key in combination). The interface 206 is composited by superimposing layer 101, layer 102, layer 103, and layer 207. As can be seen from fig. 3, the interface 206 adds layer 207 relative to the interface 201. The picture displayed by layer 207 may be, for example, a screenshot taken by the mobile phone.
As another example, the mobile phone displays the interface 201 shown in (1) in fig. 4, composited by superimposing layer 101, layer 102, and layer 103. On this basis, the mobile phone receives a message from another user (or a push message from another application), and displays the interface 208 shown in (2) in fig. 4. The interface 208 is composited by superimposing layer 101, layer 102, layer 103, and layer 209. As can be seen from fig. 4, the interface 208 adds layer 209 relative to the interface 201. The picture displayed by layer 209 may be, for example, a pop-up window (e.g., a message sent by another user) or a notification bar (e.g., a push message from another application).
As another example, the mobile phone displays the interface 201 shown in (1) in fig. 5, composited by superimposing layer 101, layer 102, and layer 103. On this basis, after the mobile phone receives a message from another user, the mobile phone displays the interface 210 shown in (2) in fig. 5. The interface 210 is composited by superimposing layer 101, layer 102, and layer 103, and its layer 103 has changed relative to that of the interface 201. For example, layer 103 in the interface 210 includes a message-unread prompt 211.
As another example, the mobile phone displays the interface 212 shown in (1) in fig. 6, where the interface 212 may be, for example, an interface through which the user chats with other users via the communication application. The interface 212 may be composited, for example, by superimposing layer 104, layer 105, and layer 106. On this basis, after the mobile phone receives a message from another user, the mobile phone displays the interface 213 shown in (2) in fig. 6. The interface 213 is composited by superimposing layer 104, layer 105, and layer 106, and its layers have changed relative to the interface 212. For example, layer 106 in the interface 213 includes a message 214 sent by another user (e.g., an application red packet).
It should be noted that the first picture described in the embodiments of the present application may be, for example, the pictures displayed by the interface 201 and the interface 212 in fig. 2 to fig. 6 and fig. 9 to fig. 10; the second picture may be, for example, the picture displayed by the interface 203, the interface 206, the interface 208, the interface 210, the interface 213, the interface 218, or the interface 220 in fig. 2 to fig. 6 and fig. 9 to fig. 10. In addition, the at least one display layer may be, for example, layer 101, layer 102, and layer 103 shown in fig. 2 to fig. 5 and fig. 9 to fig. 10; or layer 104, layer 105, and layer 106 shown in fig. 6; alternatively, the at least one display layer may be, for example, layer 107, layer 108, and layer 109 shown in fig. 8.
In some embodiments, the first layer is used to display a volume adjustment control; or the first layer is used to display a screenshot; or the first layer is used to display a pop-up window; or the first layer is used to display a message-unread prompt; or the first layer is used to display an application red packet. For example, with reference to fig. 2 to fig. 6 and fig. 9 to fig. 10, the first layer may be, for example, layer 204, layer 207, layer 209, layer 211, layer 214, layer 219, or layer 221 in those figures.
Taking the electronic device being a mobile phone and application 1 being a video application as an example, after the mobile phone starts the video application, the mobile phone displays the interface 215 shown in fig. 7. The interface 215 may be the main interface (or home page) of the video application and includes a plurality of video windows, such as video window A, video window B, video window C, and video window D. For example, the mobile phone may respond to the user's click operation on any one of the plurality of video windows by displaying the video content of the window corresponding to the click operation. In some embodiments, as also shown in fig. 7, video window A (or the target window) is typically displayed in the interface 215 as a carousel of advertisements, i.e., the picture displayed by video window A may change periodically.
In this embodiment, the first picture may be, for example, the picture displayed by the interface 216 shown in (1) in fig. 8, and the second picture may be, for example, the picture displayed by the interface 217 shown in (2) in fig. 8. As can be seen from fig. 8, the first picture and the second picture both include the target window (e.g., video window A), and the target window cyclically displays the third layer (e.g., layer 2 shown in (1) in fig. 8) and the second layer (e.g., layer 1 shown in (2) in fig. 8) in carousel fashion.
In some embodiments, the picture displayed by video window A is composited by superimposing a plurality of layers. For example, as shown in fig. 7, the picture displayed by video window A is composited by superimposing layer 1 and layer 2. When layer 2 covers layer 1, the picture displayed by video window A is the picture in layer 2; correspondingly, when layer 1 covers layer 2, the picture displayed is the picture in layer 1. This is because the covered layer is hidden beneath the covering layer and is therefore not displayed in video window A.
With reference to the interface 215 shown in fig. 7, as shown in fig. 8, for example, the mobile phone displays the interface 216 shown in (1) in fig. 8, composited by superimposing layer 107, layer 108, and layer 109, where the picture displayed by layer 109 is the picture of layer 2. Then, when the mobile phone detects that the picture of layer 2 has been displayed in layer 109 for a certain duration, the mobile phone displays the interface 217 shown in (2) in fig. 8. The interface 217 is composited by superimposing layer 107, layer 108, and layer 109, where the picture displayed by layer 109 is now the picture of layer 1. As can be seen from fig. 8, a layer of the interface 217 has changed relative to the interface 216; specifically, layer 109 has changed.
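The periodic swap of layer 2 and layer 1 in the target window can be sketched as a simple carousel generator (illustrative only; the names and tick-based timing are ours):

```python
import itertools

def carousel(frames, dwell_ticks):
    """Cycle through frames forever, holding each one for
    dwell_ticks refresh ticks before switching to the next."""
    for frame in itertools.cycle(frames):
        for _ in range(dwell_ticks):
            yield frame

window_a = carousel(["picture of layer 2", "picture of layer 1"], dwell_ticks=2)
print([next(window_a) for _ in range(5)])
# ['picture of layer 2', 'picture of layer 2',
#  'picture of layer 1', 'picture of layer 1', 'picture of layer 2']
```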
Still taking the electronic device being a mobile phone and application 1 being a communication application as an example, the mobile phone displays the interface 201 shown in (1) in fig. 9, composited by superimposing layer 101, layer 102, and layer 103. On this basis, in response to the user's touch operation on the display screen (e.g., a leftward slide), the mobile phone displays the interface 218 shown in (2) in fig. 9, composited by superimposing layer 101, layer 102, layer 103, and layer 219. As can be seen from fig. 9, the interface 218 adds layer 219 relative to the interface 201. For example, layer 219 in the interface 218 may be a sidebar.
As another example, the mobile phone displays the interface 201 shown in (1) in fig. 10, composited by superimposing layer 101, layer 102, and layer 103. On this basis, in response to the user's touch operation on the display screen (e.g., a downward slide), the mobile phone displays the interface 220 shown in (2) in fig. 10, composited by superimposing layer 101, layer 102, layer 103, and layer 221. As can be seen from fig. 10, the interface 220 adds layer 221 relative to the interface 201. For example, layer 221 in the interface 220 may be a pull-down notification bar.
It should be noted that the above takes application 1 being a communication application or a video application only as examples; application 1 may of course also be another application, such as a live-streaming application, a short-video application, or a browser application. Application 1 is not specifically limited in the embodiments of the present application.
The display method provided by the embodiments of the present application will be described in detail below with reference to the drawings of the specification.
For example, the electronic device in the embodiments of the present application may be an electronic device including an LTPO display screen, such as a mobile phone, an action camera (e.g., a GoPro), a digital camera, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, a vehicle-mounted device, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), or an augmented reality (AR)/virtual reality (VR) device. The embodiments of the present application do not particularly limit the specific form of the electronic device.
Fig. 11 is a schematic structural diagram of the electronic device 100. The electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a positioning module 181, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 100. In other embodiments, the electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be independent devices or may be integrated into one or more processors.
The controller may be the neural center and command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal, to control instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses repeatedly. If the processor 110 needs those instructions or data again, it can fetch them directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
It should be understood that the interface connection relationship between the modules illustrated in this embodiment is only an exemplary illustration, and does not constitute a limitation on the structure of the electronic device. In other embodiments, the electronic device may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 141 may be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel (or display substrate). The display panel may employ an organic light-emitting diode (OLED). In the embodiments of this application, the display screen is an LTPO display screen; that is, the display cells (e.g., TFTs) in the display panel of the LTPO display screen are LTPO TFTs. For an illustration of LTPO, reference may be made to the above embodiments, and details are not repeated here.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP can also perform algorithmic optimization of image noise, brightness, and skin tone, and can optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent cognition of electronic equipment, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output, and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also called a "horn", is used to convert an audio electrical signal into a sound signal. The receiver 170B, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as audio, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 121. For example, in the embodiment of the present application, the processor 110 may execute instructions stored in the internal memory 121, and the internal memory 121 may include a program storage area and a data storage area.
The storage program area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during use of the electronic device (such as audio data and a phone book). In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc. The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to and detached from the electronic device by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The electronic equipment can support 1 or N SIM card interfaces, and N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc.
In some embodiments, the software system of the electronic device 100 may employ a hierarchical architecture, an event-driven architecture, a micro-core architecture, or a cloud architecture. In the embodiment of the present application, a layered architecture Android system is taken as an example to exemplarily illustrate a software structure of the electronic device 100.
Fig. 12 is a software structure diagram of an electronic device according to an embodiment of the present application.
It will be appreciated that the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system may include an application (APP) layer, an application framework (FWK) layer, a system library, a hardware abstraction layer (HAL), and a kernel layer. For ease of understanding, the software structure diagram shown in fig. 12 in this embodiment also includes some hardware structures of the electronic device in fig. 11, such as the LTPO display screen.
The application layer may include a series of application packages. Illustratively, as shown in fig. 12, the application package may include application 1, application 2, and the like.
The application framework layer provides an application programming interface (API) and a programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions and provides programming services that the application layer calls through API interfaces. Illustratively, as shown in fig. 12, the application framework layer includes an activity manager service (AMS) and a window manager service (WMS). The activity management service is used to manage the life cycle of each application program and the navigation back function; it is also responsible for creating the Android main thread and maintaining the life cycle of each application program. The window management service is used to manage windows, specifically including adding and deleting windows and managing window size, direction, position, animation, focus, and the like.
The system library may include a plurality of functional modules, for example: a frame rate decision module, an image synthesis system (e.g., SurfaceFlinger), a view system, an image rendering system, and the like. The frame rate decision module is used to adjust the refresh rate of the LTPO display screen. The image synthesis system is used to control image synthesis and to generate vertical synchronization (Vsync) signals. In some embodiments, the image synthesis system further comprises an image buffer queue. Illustratively, an application draws an image through the view system and renders the drawn image through the image rendering system; the application then sends the rendered image to the image buffer queue in the image synthesis system, which buffers the images drawn and rendered by the application. Each time a Vsync signal arrives, the image synthesis system acquires one frame of image to be synthesized from the image buffer queue and performs image synthesis.
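The Vsync-paced hand-off between the application, the image buffer queue, and the image synthesis system described above can be sketched as follows. This is an illustrative Python simulation; the class and method names are hypothetical and not taken from any real Android API.

```python
from collections import deque

class ImageBufferQueue:
    """Buffers frames that the application has finished drawing and rendering."""
    def __init__(self):
        self._frames = deque()

    def push(self, frame):
        self._frames.append(frame)

    def pop(self):
        # The compositor takes at most one pending frame per Vsync tick.
        return self._frames.popleft() if self._frames else None

class ImageSynthesisSystem:
    """On every Vsync signal, composites one frame from the buffer queue."""
    def __init__(self, queue):
        self.queue = queue
        self.composited = []

    def on_vsync(self):
        frame = self.queue.pop()
        if frame is not None:
            self.composited.append(f"composited({frame})")

queue = ImageBufferQueue()
compositor = ImageSynthesisSystem(queue)
for n in (1, 2, 3):
    queue.push(f"frame{n}")   # application sends a rendered frame
    compositor.on_vsync()     # one composition per Vsync period
print(compositor.composited)
```

Each Vsync tick drains exactly one frame, which is why the composition cadence follows the refresh rate set by the frame rate decision module.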
The hardware abstraction layer is an interface layer located between the kernel layer and the hardware, and can be used to abstract the hardware. Illustratively, as shown in fig. 12, the hardware abstraction layer includes a hardware compositor (e.g., HwComposer, HWC). The hardware compositor is used to output the images composited by the image synthesis system to the hardware (e.g., the LTPO display screen).
The kernel layer provides underlying drivers for the various hardware of the electronic device. Illustratively, as shown in fig. 12, the kernel layer includes a display driver.
The following describes an exemplary workflow of electronic device software and hardware in conjunction with a scene in which a layer is newly added or changed in an application program interface displayed by the electronic device.
In some embodiments, when the electronic device runs application 1, the electronic device controls the display screen to display interface one at a refresh rate one (e.g., 60 Hz); when the first interface is not changed within the preset time length, the electronic equipment controls the display screen to display a second interface at a second refresh rate (such as 10 Hz); then, when the electronic device determines that the second interface dynamically changes, the electronic device controls the display screen to display the third interface at a refresh rate of three (e.g., 60 Hz).
Taking the case where the dynamic change of interface two is a newly added layer as an example: application 1 notifies the AMS to create the activity of the newly added layer, and the AMS starts that activity. The AMS then notifies the WMS to create a target window; the WMS manages the window size, position, window name, and the like of the target window, and notifies the image synthesis system to create the newly added layer. Illustratively, the WMS calls a layer creation interface (e.g., the createLayer() interface) to notify the image synthesis system to create the newly added layer; through this interface, the image synthesis system receives the layer creation information of the newly added layer (such as its size, position, and layer name). The image synthesis system then notifies the frame rate decision module that the refresh rate of the LTPO display screen needs to be increased (e.g., from 10 Hz to 60 Hz). The frame rate decision module adjusts the refresh rate of the LTPO display screen to refresh rate three (e.g., 60 Hz) and sends refresh rate three to the image synthesis system. The image synthesis system can then perform layer synthesis at refresh rate three and finally display the synthesized newly added layer.
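The tail of this flow, where a layer-creation notification triggers a refresh rate increase, can be sketched as follows. This is a simplified illustration; the names `FrameRateDecisionModule`, `create_layer`, and the 10 Hz/60 Hz values follow the examples above and are not a real API.

```python
class FrameRateDecisionModule:
    """Adjusts the LTPO display refresh rate (hypothetical simplification)."""
    def __init__(self, refresh_rate_hz=10):
        self.refresh_rate_hz = refresh_rate_hz

    def request_increase(self, target_hz):
        # Raise the rate only if the panel is currently running slower.
        if self.refresh_rate_hz < target_hz:
            self.refresh_rate_hz = target_hz
        return self.refresh_rate_hz

class LayerCreationHandler:
    """Stands in for the image synthesis system's layer-creation path."""
    def __init__(self, frame_rate_module):
        self.frame_rate_module = frame_rate_module
        self.layers = {}

    def create_layer(self, name, size, position):
        # Record the layer-creation information delivered by the WMS.
        self.layers[name] = {"size": size, "position": position}
        # A newly added layer must be shown, so request a higher refresh rate.
        return self.frame_rate_module.request_increase(60)

frm = FrameRateDecisionModule(refresh_rate_hz=10)
handler = LayerCreationHandler(frm)
new_rate = handler.create_layer("volume_bar", size=(300, 48), position=(0, 0))
print(new_rate)  # panel raised from 10 Hz to 60 Hz
```

The key design point mirrored here is that the compositor does not set the rate itself; it only records the layer and delegates the rate decision to the frame rate decision module.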
Illustratively, the image synthesis system sends layer creation information to the application 1 to trigger the application 1 to draw and render the newly added layer. The application 1 draws image data through a view system, renders the image data through an image rendering system, and then sends the rendered image data to an image synthesis system. And the image synthesis system performs layer synthesis according to the rendered image data to obtain a newly added layer. Then, the image synthesis system sends the synthesized newly added layer to a hardware synthesizer (HWC), and the hardware synthesizer sends the synthesized newly added layer to a display driver; the display driver drives the LTPO display to display the new layer.
Taking the case where the dynamic change of interface two is a layer change as an example: when the electronic device determines that a layer in the display image of interface two changes (for example, part of a window in interface two changes), application 1 writes image change information into the image buffer queue. The image change information includes one or more of a layer displacement, the size of the window where the layer is located, a layer color, or a layer movement effect (such as becoming larger or smaller). The image buffer queue then notifies the image synthesis system that a layer change has occurred in the current interface, and the image synthesis system generates a processing transaction. The processing transaction is used to judge whether the changed layer needs to be displayed. The image synthesis system identifies through the processing transaction that the changed layer needs to be displayed. The image synthesis system then notifies the frame rate decision module that the refresh rate of the LTPO display screen needs to be increased (e.g., from 10 Hz to 60 Hz). The frame rate decision module adjusts the refresh rate of the LTPO display screen to refresh rate three (e.g., 60 Hz) and sends refresh rate three to the image synthesis system. The image synthesis system can then perform layer synthesis at refresh rate three and finally display the changed layer. In some embodiments, the image synthesis system determines that the layer change has occurred in a partial window of the current interface through a layer listener interface (SurfaceFlinger::setTransactionState()), and then notifies the frame rate decision module that the refresh rate of the LTPO display screen needs to be increased (e.g., from 10 Hz to 60 Hz).
The frame rate decision module adjusts the refresh rate of the LTPO display screen to refresh rate three (e.g., 60 Hz) and sends refresh rate three to the image synthesis system. The image synthesis system can then perform layer synthesis at refresh rate three and display the synthesized layer.
Illustratively, the image composition system sends the changed layer information (such as layer displacement, size of a window where the layer is located, layer color, and layer animation effect) to the application 1, so as to trigger the application to render the changed layer. The application 1 draws image data through a view system, renders the image data through an image rendering system, and then sends the rendered image data to an image synthesis system. And the image synthesis system performs layer synthesis according to the rendered image data to obtain a changed layer. Then, the image synthesis system sends the synthesized layer to a hardware synthesizer (HWC), and the hardware synthesizer sends the synthesized layer to a display driver; the display driver may drive the LTPO display to display the changed layer.
With reference to the foregoing embodiment, when the electronic device displays the new layer or the changed layer, the electronic device needs to synthesize the new layer or the changed layer (such as the second layer) through the image synthesis system and display the new layer or the changed layer.
The following describes in detail a specific processing flow of the image synthesis system synthesizing the second image layer. For ease of understanding, the following description of some technical terms related to layer composition is given by way of example for reference.
1. Frame: refers to a single picture of the smallest unit in the interface display. A frame can be understood as a still picture and displaying a number of consecutive frames in rapid succession can create the illusion of object motion. The frame rate is the number of frames of a picture refreshed in 1 second, and can also be understood as the number of times of refreshing the picture per second by an image processor in the electronic device. A high frame rate may result in a smoother and more realistic animation. The greater the number of frames per second, the more fluid the displayed motion will be.
It should be noted that the interface usually needs to go through drawing, rendering, composition, and other processes before displaying the frame.
2. Frame drawing: drawing refers to drawing the pictures displayed on the interface. A display interface may consist of one or more views, each of which may be drawn by a visual control of the view system. Each view consists of sub-views, and a sub-view corresponds to a widget in the view; for example, one sub-view corresponds to a symbol in a picture view.
3. Frame rendering: rendering refers to performing a rendering operation on the drawn view, adding 3D effects, and the like. For example, a 3D effect may be a light effect, a shadow effect, or a texture effect.
4. Frame synthesis: synthesis is the process of compositing one or more rendered views into a display interface.
In order to improve display smoothness and reduce display stuttering, electronic devices generally perform display based on Vsync signals to synchronize the drawing, rendering, synthesis, and screen refresh of images. Those skilled in the art will appreciate that the Vsync signal is a periodic signal, and the period of the Vsync signal may be set according to the refresh rate of the display screen. For example, when the refresh rate of the display screen is 60 Hz, the Vsync signal period may be 16.6 ms; that is, the electronic device generates a control signal every 16.6 ms to trigger the Vsync signal.
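The relationship between refresh rate and Vsync period stated above is a simple reciprocal, which the 60 Hz to 16.6 ms example can illustrate directly:

```python
def vsync_period_ms(refresh_rate_hz):
    """Return the Vsync signal period in milliseconds for a refresh rate in Hz."""
    return 1000.0 / refresh_rate_hz

# At 60 Hz the period is ~16.6 ms; at the low 10 Hz rate it stretches to 100 ms.
print(round(vsync_period_ms(60), 1), vsync_period_ms(10))
```

This also shows why lowering the refresh rate saves power: at 10 Hz the display pipeline wakes up only once every 100 ms instead of every 16.6 ms.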
In some embodiments, as shown in fig. 13, a schematic diagram of an image layer synthesis processing flow provided in an embodiment of the present application is shown. The content displayed by the electronic device corresponds to frame 1, frame 2, and frame 3 in chronological order.
Illustratively, taking the display of frame 1 as an example, the application of the electronic device draws frame 1 through the view system and renders frame 1 through the image rendering system. After frame 1 is drawn and rendered, the application sends the rendered frame 1 to the image synthesis system, which composites it. After frame 1 is synthesized, the electronic device may display the content corresponding to frame 1 on the screen (e.g., the LTPO display screen) by calling the kernel-layer display driver. It should be noted that frames 2 and 3 are synthesized and displayed through a process similar to that of frame 1, and details are not repeated here. Each frame in fig. 13 lags from drawing to display by two periods of the Vsync signal; that is, the display of the electronic device has a latency.
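The two-period lag from drawing to display noted above can be sketched as an illustrative model (not actual scheduler code): a frame drawn during one Vsync period is composited in the next and shown in the one after that.

```python
def pipeline_schedule(frame_drawn_at):
    """A frame drawn during Vsync period N is composited during period N+1
    and shown on the display during period N+2 (a two-period lag)."""
    return {
        "draw": frame_drawn_at,
        "compose": frame_drawn_at + 1,
        "display": frame_drawn_at + 2,
    }

# Frame 1, drawn in period 0, reaches the screen in period 2.
schedule = pipeline_schedule(0)
print(schedule)
```

At a 60 Hz refresh rate this lag corresponds to roughly 33 ms; at 10 Hz it would be 200 ms, which is one reason the refresh rate is raised before new content is shown.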
In the embodiment of this application, after the electronic device starts the first application, the electronic device refreshes the LTPO display screen at a preset refresh rate (e.g., 60 Hz) and displays the first interface; when the first interface is unchanged for a preset time period, the electronic device refreshes the LTPO display screen at a first refresh rate (e.g., 10 Hz). It can be seen that, when the user's hand has left the screen and the user is not operating the first application, the first interface may be in a static state for a long time; at this time, the electronic device may reduce the refresh rate of the LTPO display screen, thereby reducing device power consumption. In this case, if a newly added layer on the first interface needs to be displayed, or a layer of the first interface changes and the changed layer needs to be displayed, the electronic device refreshes the LTPO display screen at a second refresh rate (e.g., 60 Hz) while displaying the newly added or changed layer.
For convenience of understanding, in the embodiment of the present application, a process of interaction between modules involved in a display method for refreshing an LTPO display screen by using a refresh rate three is described below with reference to fig. 12 in a process of displaying a newly added layer or displaying a changed layer.
For example, fig. 14 and fig. 15 are schematic diagrams illustrating a process of interaction between modules in a display method provided in an embodiment of the present application. As shown in fig. 14 and 15, the system may include: application 1, Activity Management Service (AMS), Window Management Service (WMS), image composition System (SF), image cache queue, frame rate decision Module, hardware compositor (HWC), display driver, and LTPO display.
In the embodiment of this application, when the electronic device runs application 1, the electronic device refreshes the LTPO display screen at refresh rate one (e.g., 60 Hz) and displays interface one; when interface one is unchanged within the preset time, the electronic device refreshes the LTPO display screen at refresh rate two (e.g., 10 Hz) and displays interface two. It can be seen that, when the user's hand has left the screen and the user is not operating application 1, the interface is in a static state for a long time; at this time, the electronic device may reduce the refresh rate of the LTPO display screen, thereby reducing device power consumption. In this case, if interface two changes dynamically, the electronic device refreshes the LTPO display screen at refresh rate three (e.g., 60 Hz) and displays interface three. The dynamic change of interface two means that interface two has a newly added layer (also called a first layer), or that a layer of interface two changes (for example, the third layer changes into the second layer).
In some embodiments, taking an example that an additional layer is added to the second interface, as shown in fig. 12 and fig. 14, the display method may include: step a 1-step a 14.
Step a1, application 1 receives the target instruction.
The target instruction (or the first instruction) may be, for example, a touch operation input by a user, or a push message of another application. The embodiment of the present application does not limit the target instruction. In addition, for the illustration of the touch operation, reference may be made to the above embodiments, and details are not repeated here.
Illustratively, the target instruction is used to trigger the electronic device to display the newly added layer. The newly added layer may arise, for example, in a scene of adjusting the volume of the electronic device, a scene of taking a screenshot, a scene of displaying a notification message, or a scene of displaying a sidebar (or a pull-down notification bar).
Step a2, application 1 sends a first message to the AMS.
The first message is used for notifying the AMS of creating the activity of the newly added layer. The first message carries attribute information of the newly added layer. Illustratively, the attribute information includes one or more of an activity name, a window size, a window position, or a window name.
Step A3, AMS starts the activity of the newly added layer.
Step a4, the AMS sends a second message to the WMS.
And the second message is used for informing the WMS of managing the window of the newly added layer. The second message carries attribute information of the newly added layer.
Illustratively, the WMS manages a window size, a window position, a window name, and the like of the newly added layer.
Step A5, the WMS informs the image synthesis system to create the new layer.
Illustratively, the WMS calls a layer creation interface (e.g., the createLayer() interface) to notify the image synthesis system to create the newly added layer. The image synthesis system receives the layer creation information through the createLayer() interface. The layer creation information may be, for example, the size, position, and layer name of the newly added layer.
Step a6, the image composition system determines that the current refresh rate (e.g., the first refresh rate) is below a threshold (e.g., the first threshold).
Illustratively, the threshold may be, for example, 10 Hz.
In some embodiments, when the image synthesis system determines that the current refresh rate is above a threshold, the image synthesis system notifies the frame rate decision module to adjust the refresh rate. Wherein the threshold value in this embodiment may be, for example, 60 Hz.
Step A7, the image synthesis system informs the frame rate decision module to adjust the refresh rate.
Step A8, the frame rate decision module determines that the current refresh rate is lower than the threshold, and adjusts the current refresh rate from refresh rate two (i.e., the first refresh rate) to refresh rate three (i.e., the second refresh rate).
Illustratively, the threshold may be, for example, 10 Hz. The second refresh rate may be, for example, 10Hz, and the third refresh rate may be, for example, 60 Hz.
In the case where the image synthesis system determines that the current refresh rate is above the threshold (e.g., 60 Hz), the frame rate decision module likewise determines that the current refresh rate is above the threshold, and adjusts the current refresh rate from a high refresh rate (e.g., 60 Hz) to a low refresh rate (e.g., 10 Hz).
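The threshold logic of steps A6 to A8, together with its inverse of lowering the rate when the interface is static, can be sketched as follows. This is an assumed simplification using the 10 Hz/60 Hz example values from the embodiment; the function and constant names are hypothetical.

```python
LOW_THRESHOLD_HZ = 10    # threshold below/at which a display event forces a raise
HIGH_THRESHOLD_HZ = 60   # threshold at/above which a static screen allows a drop
LOW_RATE_HZ = 10
HIGH_RATE_HZ = 60

def decide_refresh_rate(current_hz, must_display_new_content):
    """Raise the rate when new or changed content must be shown while the
    panel runs at or below the low threshold; lower it again when the
    interface is static at or above the high threshold."""
    if must_display_new_content and current_hz <= LOW_THRESHOLD_HZ:
        return HIGH_RATE_HZ   # steps A6-A8: 10 Hz -> 60 Hz
    if not must_display_new_content and current_hz >= HIGH_THRESHOLD_HZ:
        return LOW_RATE_HZ    # static interface: 60 Hz -> 10 Hz
    return current_hz

print(decide_refresh_rate(10, True), decide_refresh_rate(60, False))
```

The two branches capture the power/smoothness trade-off the embodiment describes: the high rate is held only while there is content to show, and the panel falls back to the low rate otherwise.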
And step A9, the frame rate decision module sends the refresh rate III to the image synthesis system.
And step A10, the frame rate decision module sends the refresh rate III to the display driver.
And step A11, the image synthesis system carries out layer synthesis according to the refresh rate III.
Illustratively, the image synthesis system sends a Vsync signal to application 1 according to refresh rate three; application 1 draws and renders the layer at the cadence of refresh rate three, and then sends the drawn and rendered image data to the image buffer queue. The image synthesis system performs layer synthesis on the rendered image data in the image buffer queue at the cadence of refresh rate three to obtain a synthesized layer (such as the newly added layer). For example, when refresh rate three is 60 Hz, application 1 draws and renders the layer at the cadence of refresh rate three, i.e., the output frame rate of application 1 is 60 FPS.
And step A12, the image synthesis system sends the synthesized image layer to a hardware synthesizer.
And step A13, the hardware synthesizer sends the layer to the display driver.
Step A14, the display driver drives the LTPO display screen to display the layer (e.g., the first layer) at refresh rate three.
Illustratively, the display driver drives the LTPO display screen to display interface three, which includes the synthesized layer (such as the newly added layer). For example, the display driver drives the LTPO display screen to display the interfaces shown in figs. 2 to 4, 9 and 10, which include newly added layers (such as layer 204, layer 207, layer 209, layer 219, and layer 221).
In this embodiment, when the electronic device is refreshing the LTPO display screen at a lower refresh rate (e.g., refresh rate two of 10 Hz in the above embodiment), if the electronic device receives the target instruction, the electronic device refreshes the LTPO display screen at refresh rate three (e.g., 60 Hz) and displays interface three. Because the target instruction instructs the electronic device to display the newly added layer, and because refresh rate three is greater than refresh rate two, the electronic device raises the refresh rate of the display screen while displaying the newly added layer; this can avoid stuttering during the display of the newly added layer and improves user experience.
In other embodiments, taking the change of the layer of the second interface as an example, as shown in fig. 12 and fig. 15, the display method may include: step S1-step S13.
Step S1, application 1 receives the image change information.
The image change information is used to indicate that layer change occurs in the interface (for example, interface two) currently displayed by the application 1. For example, the image change information is used to trigger the electronic device to display the second layer (e.g., change from the third layer to the second layer). The second layer is hidden under the third layer, and the second layer and the third layer are two display layers in at least one display layer. Illustratively, the image variation information includes one or more of layer displacement, size of a window where the layer is located, layer color, or layer movement effect (e.g., becoming larger or smaller).
Step S2, application 1 writes image change information to the image buffer queue.
Accordingly, the image buffer queue buffers the image change information.
Step S3, the image buffer queue notifies the image composition system of the occurrence of layer change.
Illustratively, the image synthesis system registers a listener with the image buffer queue through a layer listener interface (SurfaceFlinger::setTransactionState()); when a layer change occurs, the image buffer queue calls the layer listener interface to notify the image synthesis system that a layer change has occurred in a partial window of the current interface.
Step S4, the image synthesis system determines whether the changed layer needs to be displayed.
Illustratively, the image synthesis system generates a processing transaction; the processing transaction is used to determine whether the changed layer needs to be displayed. For example, within the processing transaction, the image synthesis system may compare the attribute information (including at least one of layer size, layer position, or layer transparency) of the layer before the change (e.g., the third layer) with the attribute information of the layer after the change (e.g., the second layer) to determine whether the changed layer needs to be displayed.
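The comparison in step S4 reduces to checking whether any watched attribute differs between the old and new layer. A minimal sketch, with dictionaries standing in for the layers' attribute information (the key names are assumptions):

```python
def needs_display(before: dict, after: dict,
                  keys=("size", "position", "transparency")) -> bool:
    """Step S4 sketch: the changed layer needs to be displayed if any of
    its attributes (layer size, position, or transparency) differs from
    the layer it replaces."""
    return any(before.get(k) != after.get(k) for k in keys)

# Third layer vs. second layer: only the transparency differs here,
# which is enough to require a display update.
layer3 = {"size": (100, 40), "position": (0, 0), "transparency": 1.0}
layer2 = {"size": (100, 40), "position": (0, 0), "transparency": 0.5}
assert needs_display(layer3, layer2)
assert not needs_display(layer3, dict(layer3))  # identical attributes: no update
```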
Step S5, when the image synthesis system determines that the changed image layer needs to be displayed, the image synthesis system determines that the current refresh rate is lower than the threshold.
Illustratively, the threshold may be, for example, 10 Hz.
Step S6, the image synthesis system notifies the frame rate decision module to adjust the refresh rate.
In some embodiments, when the image synthesis system determines that the current refresh rate is above a threshold, the image synthesis system notifies the frame rate decision module to adjust the refresh rate. The threshold in this embodiment may be, for example, 60 Hz.
Step S7, the frame rate decision module determines that the current refresh rate is lower than the threshold, and the frame rate decision module adjusts the current refresh rate from refresh rate two to refresh rate three.
Illustratively, the threshold may be, for example, 10 Hz. The second refresh rate may be, for example, 10Hz, and the third refresh rate may be, for example, 60 Hz.
In the event that the image synthesis system determines that the current refresh rate is above a threshold (e.g., 60Hz), the frame rate decision module determines that the current refresh rate is above a threshold (e.g., 60Hz), and the frame rate decision module adjusts the current refresh rate from a high refresh rate (e.g., 60Hz) to a low refresh rate (e.g., 10 Hz).
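Steps S5 to S7 (and the converse case just described) amount to a two-threshold decision, sketched below with the example values from the embodiment. The function name and signature are illustrative; the patent does not specify the module's interface, and the "at or below" reading of the low threshold is an assumption made so the 10 Hz example triggers the upgrade.

```python
LOW_RATE, HIGH_RATE = 10, 60  # refresh rate two / refresh rate three (Hz)

def decide_refresh_rate(current_hz: int, layer_changed: bool,
                        low_threshold: int = 10, high_threshold: int = 60) -> int:
    """Frame rate decision sketch: raise the rate when a changed layer must
    be shown at a low rate (steps S5-S7); fall back to the low rate when
    nothing changes at a high rate (the converse case above)."""
    if layer_changed and current_hz <= low_threshold:
        return HIGH_RATE   # step S7: refresh rate two -> refresh rate three
    if not layer_changed and current_hz >= high_threshold:
        return LOW_RATE    # static content: high refresh rate -> low refresh rate
    return current_hz      # otherwise keep the current rate

assert decide_refresh_rate(10, layer_changed=True) == 60
assert decide_refresh_rate(60, layer_changed=False) == 10
```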
Step S8, the frame rate decision module sends refresh rate three to the image synthesis system.
Step S9, the frame rate decision module sends refresh rate three to the display driver.
Step S10, the image synthesis system performs layer composition at refresh rate three.
Illustratively, the image synthesis system sends a Vsync signal to application 1 at the cadence of refresh rate three; application 1 draws and renders the layer at the cadence of refresh rate three, and then application 1 sends the drawn and rendered image data to the image buffer queue. The image synthesis system composites the drawn and rendered image data in the image buffer queue at the cadence of refresh rate three to obtain a composited layer (e.g., the changed layer). For example, when refresh rate three is 60 Hz, application 1 draws and renders the layer at the cadence of refresh rate three; that is, the output frame rate of application 1 is 60 FPS.
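The cadence relationship in step S10 can be made concrete: one Vsync period is the reciprocal of the refresh rate, and with one drawn-and-composited frame per Vsync the output frame rate equals the refresh rate, matching the 60 Hz / 60 FPS example. A small sketch (helper names are illustrative):

```python
def vsync_period_ms(refresh_rate_hz: int) -> float:
    """Length of one Vsync period at the given refresh rate.
    At 60 Hz this is about 16.7 ms; at 10 Hz it is 100 ms."""
    return 1000.0 / refresh_rate_hz

def output_fps(refresh_rate_hz: int, frames_per_vsync: int = 1) -> int:
    """With application 1 producing one frame per Vsync signal, the
    output frame rate equals the refresh rate."""
    return refresh_rate_hz * frames_per_vsync

assert output_fps(60) == 60          # refresh rate three -> 60 FPS output
assert vsync_period_ms(10) == 100.0  # refresh rate two -> 100 ms between frames
```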
Step S11, the image synthesis system sends the composited layer to the hardware composer.
Step S12, the hardware composer sends the layer to the display driver.
Step S13, the display driver drives the LTPO display screen to display the layer at refresh rate three.
Illustratively, the display driver drives the LTPO display screen to display interface three, where interface three includes the composited layer (e.g., the changed layer). For example, the display driver drives the LTPO display screen to display the interfaces shown in fig. 5, fig. 6, and fig. 8. The interfaces shown in fig. 5, fig. 6, and fig. 8 include changed layers (e.g., layer 211, layer 214, and layer 109).
In this embodiment, when the electronic device refreshes the LTPO display screen at a lower refresh rate (e.g., 10 Hz, as in the above embodiment), if the electronic device receives the image change information, the electronic device refreshes the LTPO display screen at refresh rate three (e.g., 60 Hz) and displays interface three. Because the image change information indicates that a layer change has occurred in the interface currently displayed by the electronic device, and because refresh rate three is greater than refresh rate two, the electronic device raises the refresh rate of the display screen while it displays the changed layer. This avoids stuttering during the display of the changed layer and improves the user experience.
An embodiment of the present application provides an electronic device, which may include: a display screen (e.g., a touch screen), a memory, and one or more processors. The display screen, the memory, and the processor are coupled. The memory is used to store computer program code, and the computer program code includes computer instructions. When the processor executes the computer instructions, the electronic device may perform the functions or steps performed by the mobile phone in the above method embodiments. For the structure of the electronic device, reference may be made to the structure of the electronic device 100 shown in fig. 11.
An embodiment of the present application further provides a chip system, as shown in fig. 16, the chip system 1800 includes at least one processor 1801 and at least one interface circuit 1802. The processor 1801 may be the processor 110 shown in fig. 11 in the foregoing embodiment. The interface circuit 1802 may be, for example, an interface circuit between the processor 110 and the external memory 120; or an interface circuit between the processor 110 and the internal memory 121.
The processor 1801 and the interface circuit 1802 may be interconnected by wires. For example, the interface circuit 1802 may be used to receive signals from other devices (e.g., a memory of the electronic device). As another example, the interface circuit 1802 may be used to send signals to other devices (e.g., the processor 1801). Illustratively, the interface circuit 1802 may read instructions stored in the memory and send the instructions to the processor 1801. When executed by the processor 1801, the instructions may cause the electronic device to perform the steps performed by the mobile phone 180 in the above embodiments. Of course, the chip system may further include other discrete devices, which is not specifically limited in this embodiment of the present application.
Embodiments of the present application further provide a computer-readable storage medium, where the computer-readable storage medium includes computer instructions, and when the computer instructions are executed on an electronic device, the electronic device is caused to perform various functions or steps performed by a mobile phone in the foregoing method embodiments.
An embodiment of the present application further provides a computer program product which, when run on a computer, causes the computer to execute the functions or steps executed by the mobile phone in the above method embodiments.
Through the description of the above embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description is only an embodiment of the present application, but the scope of protection of the present application is not limited thereto. Any change or substitution within the technical scope disclosed in the present application shall be covered by the scope of protection of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A display method, applied to an electronic device, wherein the electronic device supports a first refresh rate and a second refresh rate; the method comprises the following steps:
the electronic equipment controls a display screen to display a first picture at a first refresh rate; the first refresh rate is less than a first threshold; the first picture corresponds to at least one display layer;
when the electronic equipment detects that the display layer of the first picture is updated, the electronic equipment controls the display screen to display a second picture at a second refresh rate; the second picture comprises an updated display layer; the second refresh rate is greater than the first refresh rate.
2. The method of claim 1, wherein the electronic device detecting the display layer update of the first screen comprises:
the electronic equipment detects a first instruction; the first instruction is used for triggering the electronic equipment to display a first layer, and the first layer is different from the at least one display layer;
and the electronic equipment responds to the first instruction and detects that the display layer of the first picture is updated.
3. The method of claim 2, wherein the electronic device controls the display screen to display a second picture at a second refresh rate, comprising:
the electronic equipment creates the first layer according to the attribute information of the first layer and controls the display screen to display a second picture at a second refresh rate; the attribute information of the first image layer includes at least one of a window size, a window position, or a window name of the first image layer.
4. The method of claim 1, wherein the electronic device detecting the display layer update of the first screen comprises:
the electronic equipment acquires image change information; the image change information is used for triggering the electronic equipment to display a second image layer; the second layer is hidden under the third layer; the second layer and the third layer are two display layers in the at least one display layer;
and the electronic equipment detects that the display layer of the first picture is updated according to the image change information.
5. The method of claim 4, wherein the electronic device controls the display screen to display a second picture at a second refresh rate, comprising:
and when the electronic equipment determines that the second layer needs to be displayed, the electronic equipment controls the display screen to display a second picture at a second refresh rate.
6. The method according to claim 5, wherein the electronic device determines that the second layer needs to be displayed, including:
the electronic equipment compares the attribute information of the second layer with the attribute information of the third layer to determine that the second layer needs to be displayed; the attribute information includes at least one of layer size, layer position, or layer transparency.
7. The method of claim 3, wherein the electronic device comprises: an image synthesis system; the electronic device creates the first layer according to the attribute information of the first layer, including:
and the image synthesis system creates the first image layer according to the attribute information of the first image layer.
8. The method of claim 4, wherein the electronic device comprises: an image buffer queue;
the electronic equipment acquires image change information, and comprises the following steps:
the image buffer queue acquires the image change information;
the electronic device detects that the display layer of the first picture is updated according to the image change information, and the method includes:
and the image cache queue detects that the display layer of the first picture is updated according to the image change information.
9. The method of claim 5 or 6, wherein the electronic device comprises an image composition system; the electronic device determines that the second image layer needs to be displayed, including:
and the image synthesis system compares the attribute information of the second layer with the attribute information of the third layer to determine that the second layer needs to be displayed.
10. The method of any one of claims 2-3, or 7,
the first layer is used for displaying a volume adjustment control; or,
the first layer is used for displaying a screenshot; or,
the first layer is used for displaying a popup window; or,
the first layer is used for displaying a message unread prompt; or,
the first image layer is used for displaying an application red packet.
11. The method of any one of claims 4-6, or 8-9,
the first and second frames comprise a target window; and the target window circularly displays the third layer and the second layer in a carousel mode.
12. The method of any of claims 1-11, wherein the electronic device controls a display screen to display a first picture at a first refresh rate, comprising:
the electronic equipment controls the display screen to display a third picture at a third refresh rate;
and the electronic equipment determines that the third picture does not change within a preset time length, and controls the display screen to display the first picture at the first refresh rate.
13. The method of any of claims 2-3, or 7, wherein the electronic device detects the first instruction comprising:
the electronic equipment responds to the operation of a user on a target control, and detects the first instruction; the target control comprises a volume key or a screen locking key; or,
and after the electronic equipment receives the message of the first application, the first instruction is detected.
14. An electronic device, wherein the electronic device supports a first refresh rate and a second refresh rate; the electronic device comprises a display screen, a memory and one or more processors; the display screen, the memory and the processor are coupled;
the memory for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any of claims 1-13.
15. A computer-readable storage medium comprising computer instructions; the computer instructions, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-13.
CN202210181930.6A 2022-02-25 2022-02-25 Display method, electronic equipment and storage medium Active CN114661263B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210181930.6A CN114661263B (en) 2022-02-25 2022-02-25 Display method, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114661263A true CN114661263A (en) 2022-06-24
CN114661263B CN114661263B (en) 2023-06-20

Family

ID=82027023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210181930.6A Active CN114661263B (en) 2022-02-25 2022-02-25 Display method, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114661263B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115690269A (en) * 2022-10-31 2023-02-03 荣耀终端有限公司 View object processing method and electronic equipment
CN116684677A (en) * 2022-09-20 2023-09-01 荣耀终端有限公司 Electronic equipment dynamic effect playing method, electronic equipment and storage medium
CN117698554A (en) * 2024-02-06 2024-03-15 深圳市欧冶半导体有限公司 Projection control method, device, equipment and system of intelligent car lamp system
CN117711355A (en) * 2022-08-24 2024-03-15 荣耀终端有限公司 Screen refresh rate switching method and electronic equipment
WO2024066834A1 (en) * 2022-09-30 2024-04-04 荣耀终端有限公司 Vsync signal control method, electronic device, storage medium and chip

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090132942A1 (en) * 1999-10-29 2009-05-21 Surfcast, Inc. System and Method for Simultaneous Display of Multiple Information Sources
CN103473051A (en) * 2013-09-02 2013-12-25 小米科技有限责任公司 Method and device for saving terminal power consumption
US20180261143A1 (en) * 2017-03-10 2018-09-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method, device and non-transitory computer-readable storage medium for controlling frame rate of mobile terminal
CN109242943A (en) * 2018-08-21 2019-01-18 腾讯科技(深圳)有限公司 A kind of image rendering method, device and image processing equipment, storage medium
CN109388461A (en) * 2018-09-27 2019-02-26 青岛海信电器股份有限公司 Display methods, device and the display terminal of object are identified in screen-picture screenshot
CN111767013A (en) * 2020-06-01 2020-10-13 Oppo(重庆)智能科技有限公司 Control method, control device, electronic device, computer-readable storage medium
CN112652263A (en) * 2020-12-25 2021-04-13 深圳传音控股股份有限公司 Refreshing method, terminal and storage medium

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117711355A (en) * 2022-08-24 2024-03-15 荣耀终端有限公司 Screen refresh rate switching method and electronic equipment
CN116684677A (en) * 2022-09-20 2023-09-01 荣耀终端有限公司 Electronic equipment dynamic effect playing method, electronic equipment and storage medium
CN116684677B (en) * 2022-09-20 2024-06-11 荣耀终端有限公司 Electronic equipment dynamic effect playing method, electronic equipment and storage medium
WO2024066834A1 (en) * 2022-09-30 2024-04-04 荣耀终端有限公司 Vsync signal control method, electronic device, storage medium and chip
CN115690269A (en) * 2022-10-31 2023-02-03 荣耀终端有限公司 View object processing method and electronic equipment
CN115690269B (en) * 2022-10-31 2023-11-07 荣耀终端有限公司 View object processing method and electronic equipment
CN117698554A (en) * 2024-02-06 2024-03-15 深圳市欧冶半导体有限公司 Projection control method, device, equipment and system of intelligent car lamp system
CN117698554B (en) * 2024-02-06 2024-04-16 深圳市欧冶半导体有限公司 Projection control method, device, equipment and system of intelligent car lamp system

Also Published As

Publication number Publication date
CN114661263B (en) 2023-06-20

Similar Documents

Publication Publication Date Title
CN114661263B (en) Display method, electronic equipment and storage medium
CN114518817B (en) Display method, electronic device and storage medium
US20200192500A1 (en) Electronic Devices With Adaptive Frame Rate Displays
WO2021027725A1 (en) Method for displaying page elements and electronic device
WO2023160194A1 (en) Method for controlling dynamic change in screen refresh rate, and electronic device
WO2022257451A1 (en) Display method, electronic device and computer storage medium
CN112767231B (en) Layer composition method and device
CN114461051B (en) Frame rate switching method and device and storage medium
EP3159876A1 (en) Method and apparatus for displaying content
CN114697446B (en) Refresh rate switching method, electronic device and storage medium
WO2023050722A1 (en) Information display method and electronic device
WO2024041047A1 (en) Screen refresh rate switching method and electronic device
US10134326B2 (en) Device for and method of saving power when refreshing a display screen when displayed content does not change
CN114995693A (en) Display screen window switching method and electronic equipment
WO2023001163A1 (en) Screen refreshing method and device capable of improving dynamic effect performance
CN113805772B (en) Dynamic response method, electronic equipment and storage medium
CN115695699A (en) Display frame rate adjusting method and device, terminal and storage medium
CN116348949A (en) Dynamic frame rate optimization
CN114495864B (en) Screen refresh rate adjusting method, electronic device and readable storage medium
WO2024016798A1 (en) Image display method and related apparatus
CN117724781A (en) Playing method for starting animation by application program and electronic equipment
WO2023124227A9 (en) Frame rate switching method and device
CN117707406A (en) Bright screen display method, electronic equipment and storage medium
CN117724779A (en) Method for generating interface image and electronic equipment
CN117711354A (en) Display method, readable storage medium, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant