CN113986002A - Frame processing method, device and storage medium - Google Patents

Frame processing method, device and storage medium

Info

Publication number
CN113986002A
CN113986002A
Authority
CN
China
Prior art keywords
performance mode
terminal device
layer
terminal equipment
resource scheduling
Prior art date
Legal status
Granted
Application number
CN202111647157.XA
Other languages
Chinese (zh)
Other versions
CN113986002B (en)
Inventor
董礼
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202111647157.XA
Publication of CN113986002A
Application granted
Publication of CN113986002B
Current legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234 Power saving characterised by the action undertaken
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 Digitisers, e.g. for touch screens or touch pads, characterised by capacitive transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present application provides a frame processing method, a frame processing apparatus, and a storage medium, relating to the field of terminal technologies. The method includes: a terminal device identifies the identifier of a layer of an application; the terminal device performs layer processing on a layer without a preset identifier in a first performance mode; and the terminal device performs layer processing on a layer with the preset identifier in a second performance mode, where the hardware operating frequency of the terminal device in the first performance mode is lower than that in the second performance mode. In this way, the terminal device processes layers that carry the preset identifier, and therefore have a higher requirement on fluency, in the higher performance mode, and processes layers without the preset identifier, which have a lower requirement on fluency, in the lower performance mode, thereby reducing the power consumption of the terminal device while preserving fluency.

Description

Frame processing method, device and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a frame processing method and apparatus, and a storage medium.
Background
With the development of electronic technology, the performance of terminal devices (such as mobile phones) has steadily improved, and consumers correspondingly demand better human-computer interaction performance from them. Fluency is one important aspect of human-computer interaction performance.
Fluency can be characterized by the length of the delay from the moment a user inputs an operation to a terminal device to the moment the terminal device displays the image corresponding to that operation. For example, the user operation may be input through a mouse or a key, or may be a touch operation on a touch screen. This delay may be referred to as the response delay of the terminal device; when the user operation is a touch operation, it may be referred to as the touch response delay. The longer the delay, the poorer the fluency (for example, the hand-following performance of a stylus); the shorter the delay, the better the fluency.
At present, when a terminal device is in a usage scenario with a high requirement on fluency, problems such as rapid power drain or heat generation may occur. For example, such problems may arise when the user uses the stylus function of the terminal device.
Disclosure of Invention
The embodiments of the present application provide a frame processing method, apparatus, and storage medium applied to a terminal device. The terminal device can use different performance modes to perform frame processing on different layers of the same application, which helps guarantee fluency while reducing the power consumption of the terminal device.
In a first aspect, an embodiment of the present application provides a frame processing method applied to a terminal device. The method includes: the terminal device identifies the identifier of a layer of an application; the terminal device performs layer processing on a layer without a preset identifier in a first performance mode, where the layer processing includes at least one of measure and layout, drawing, rendering, or compositing; and the terminal device performs layer processing on a layer with the preset identifier in a second performance mode. The hardware operating frequency of the terminal device in the first performance mode is lower than that in the second performance mode, where the hardware operating frequency includes at least one of the frequency of a graphics processor, the frequency of a central processing unit, or the frequency of double data rate (DDR) memory.
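The per-layer dispatch of the first aspect can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the mode names, the `PRESET_ID` tag value, and the `Layer` structure are assumptions made for the example.

```python
from dataclasses import dataclass
from enum import Enum

class PerfMode(Enum):
    FIRST = 1   # lower hardware operating frequency
    SECOND = 2  # higher hardware operating frequency

# Hypothetical preset identifier marking a fluency-sensitive layer
# (for example, the handwriting canvas of a note-taking application).
PRESET_ID = "fluency-critical"

@dataclass
class Layer:
    name: str
    identifier: str = ""

def select_perf_mode(layer: Layer) -> PerfMode:
    """Choose the performance mode in which to run the layer processing
    (measure/layout, drawing, rendering, compositing) for this layer."""
    if layer.identifier == PRESET_ID:
        return PerfMode.SECOND  # fluency-sensitive: raise GPU/CPU/DDR frequency
    return PerfMode.FIRST       # ordinary layer: keep frequencies low
```

A toolbar layer of the same application would thus be processed in the first mode while the stylus canvas is processed in the second, which is the power-saving point of the claim.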
In the embodiments of the present application, the terminal device can identify the identifier of an application's layer and use different performance modes when processing different layers of the same application. Layers with a higher requirement on fluency are processed in a higher performance mode, and layers with a lower requirement on fluency are processed in a lower performance mode, thereby reducing the power consumption of the terminal device while preserving fluency.
In a possible implementation, before the terminal device identifies the identifier of the application's layer, the method further includes: the terminal device receives a first user operation, where the first user operation is to open an application or to switch to an application, and the terminal device runs in the first performance mode when receiving the first user operation; and, in response to the first user operation, in a case where the terminal device identifies that the window identifier corresponding to the application includes the preset identifier, the terminal device starts the application in a third performance mode. The hardware operating frequency of the terminal device in the third performance mode is lower than that in the second performance mode and higher than that in the first performance mode. In this way, when starting an application that includes a layer with a higher requirement on fluency, the terminal device switches to a higher performance mode, which helps it prepare to process that layer.
In another possible implementation, the terminal device includes a resource scheduling module, and a display composition process runs in the terminal device. Before the terminal device starts the application in the third performance mode, the method further includes: the resource scheduling module obtains a first touch instruction corresponding to the first user operation; in a case where the resource scheduling module determines that the first touch instruction is used for opening an application or for switching to an application, the resource scheduling module sends a first indication message to the display composition process, where the first indication message instructs the display composition process to perform frame composition once rendering of the corresponding layer is finished; and the resource scheduling module schedules resources according to a first resource scheduling policy, where the first resource scheduling policy is used to switch the terminal device to the third performance mode.
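The resource scheduling module's handling of the first touch instruction can be sketched as below. The field names (`action`, `composite_on`) and the policy string are invented for illustration; the patent only describes the behavior, not a data format.

```python
def handle_first_touch_instruction(instruction: dict, window_ids: set):
    """Sketch: on an open-app or switch-to-app instruction whose window
    identifiers include the preset identifier, (1) tell the display
    composition process to composite as soon as rendering finishes, and
    (2) apply the first resource scheduling policy (third performance mode).

    Returns (indication message for the composition process, policy name),
    or (None, None) when nothing needs to change."""
    opens_app = instruction.get("action") in ("open_app", "switch_to_app")
    if opens_app and "fluency-critical" in window_ids:
        # First indication message: composite on render completion,
        # not on the next vertical synchronization tick.
        indication = {"composite_on": "render_complete"}
        # First resource scheduling policy: switch to the third
        # (intermediate-frequency) performance mode.
        policy = "third_performance_mode"
        return indication, policy
    return None, None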
In another possible implementation, the method further includes: the terminal device receives a second user operation while running the application in the third performance mode; and the terminal device switches to the second performance mode in a case where the second user operation acts on a first interface and the first interface includes a layer with the preset identifier. Performing layer processing on the layer with the preset identifier in the second performance mode includes: the terminal device performs measure and layout, drawing, and rendering on the layer with the preset identifier in the second performance mode, and composites the layer with the preset identifier in response to rendering completion.
In this way, when a stylus is in contact with the touch screen, the terminal device switches to the higher-performance second mode and processes the layer with the preset identifier in that mode, guaranteeing the fluency of the terminal device's display.
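The difference between the two compositing triggers described above, composition on render completion in the second mode versus composition on the vertical synchronization (VSync) tick otherwise, can be modeled with a toy scheduler. The timing values and mode strings below are illustrative assumptions, not figures from the patent.

```python
class CompositionScheduler:
    """Toy model of the two compositing triggers: in the second performance
    mode, composite as soon as the fluency-critical layer reports render
    completion; otherwise, wait for the VSync tick."""

    def __init__(self, mode: str):
        self.mode = mode
        self.composited_at = None  # milliseconds, logical time

    def on_render_complete(self, t_ms: float):
        if self.mode == "second":
            self.composited_at = t_ms  # composite immediately

    def on_vsync(self, t_ms: float):
        if self.composited_at is None:
            self.composited_at = t_ms  # default: composite on VSync
```

If rendering finishes at 4 ms and the next VSync arrives at 8 ms, the second mode composites at 4 ms while the default path composites at 8 ms, which is exactly the latency the second mode saves.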
In another possible implementation, the second user operation is the latest user operation obtained by the terminal device, and the method further includes: when the terminal device determines that the time difference between the timestamp of the second user operation and the system clock of the terminal device is greater than or equal to a first time threshold and less than a second time threshold, the terminal device switches from the second performance mode to the third performance mode.
In this way, when the stylus has left the touch screen for a duration greater than or equal to the first time threshold but less than the second time threshold, the terminal device switches from the higher-power second performance mode to the lower-power third performance mode, reducing its power consumption.
In another possible implementation, the method further includes: when the terminal device determines that the time difference between the timestamp of the second user operation and the system clock of the terminal device is greater than the second time threshold, the terminal device switches from the third performance mode to a fourth performance mode, where the hardware operating frequency of the terminal device in the fourth performance mode is lower than that in the third performance mode.
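The two-threshold downgrade logic of the preceding implementations amounts to mapping the idle time since the latest user operation to a performance mode. The concrete threshold values below are invented; the patent does not specify any numbers.

```python
FIRST_THRESHOLD_MS = 100    # illustrative values only: the patent does not
SECOND_THRESHOLD_MS = 2000  # fix concrete thresholds

def mode_for_idle_time(idle_ms: float) -> str:
    """Map the time since the latest user operation (e.g. the last stylus
    contact, computed as system clock minus operation timestamp) to a
    performance mode, as in the downgrade logic above."""
    if idle_ms < FIRST_THRESHOLD_MS:
        return "second"  # stylus recently on screen: stay in the high mode
    if idle_ms < SECOND_THRESHOLD_MS:
        return "third"   # stylus briefly lifted: intermediate mode
    return "fourth"      # long idle: lowest of these modes
```

The effect is a stepped ramp-down of hardware frequency the longer the stylus stays off the screen.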
In another possible implementation, the method further includes: the terminal device receives a third user operation, where the third user operation is used for displaying a second interface; and in a case where the layers corresponding to the second interface do not include a layer with the preset identifier, the terminal device switches to the first performance mode in response to the third user operation. Performing layer processing on a layer without the preset identifier in the first performance mode includes: the terminal device performs layer processing on the layers corresponding to the second interface in the first performance mode. In this way, the terminal device switches to the lower-power first performance mode to process layers without the preset identifier, reducing its power consumption.
In another possible implementation, the terminal device includes a resource scheduling module, and a main thread, a render thread, and a display composition process run in the terminal device. Switching the terminal device to the first performance mode in response to the third user operation includes: the resource scheduling module obtains a second touch instruction corresponding to the third user operation; in a case where the resource scheduling module determines that the second touch instruction is used for displaying the second interface, the resource scheduling module sends a second indication message to the display composition process, where the second indication message instructs the display composition process to perform layer composition according to the vertical synchronization signal; and the resource scheduling module schedules resources according to a second resource scheduling policy, where the second resource scheduling policy is used to switch the terminal device to the first performance mode. Performing layer processing on the layers corresponding to the second interface in the first performance mode includes: the main thread performs measure and layout on the layers corresponding to the second interface when a vertical synchronization signal arrives; the render thread performs drawing and rendering on those layers when the next vertical synchronization signal arrives; and the display composition process composites those layers when a further vertical synchronization signal arrives.
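The VSync-paced pipeline of the first performance mode spreads the three stages over three consecutive VSync ticks, which can be laid out as a schedule. The 120 Hz period (about 8.33 ms) is an illustrative assumption; the patent does not fix a refresh rate.

```python
def vsync_pipeline_schedule(vsync_period_ms: float = 8.33):
    """Sketch of the first-performance-mode pipeline: measure/layout on one
    VSync tick, draw/render on the next, composition on the one after.
    Returns a list of (start time in ms, stage description) pairs."""
    stages = [
        "measure/layout (main thread)",
        "draw/render (render thread)",
        "composite (display composition process)",
    ]
    # Each stage begins on its own VSync tick.
    return [(round(i * vsync_period_ms, 2), s) for i, s in enumerate(stages)]
```

Under this schedule a frame takes roughly three VSync periods from layout to composition, which is why the render-completion trigger of the second mode (composite without waiting for the next tick) improves hand-following latency.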
In another possible implementation, the terminal device further includes a hardware scheduling module. Scheduling resources according to the first resource scheduling policy includes: the resource scheduling module obtains, according to a target correspondence, the target hardware parameters corresponding to the first resource scheduling policy, where the target correspondence is a correspondence between resource scheduling policies and hardware parameters; the resource scheduling module sends each target hardware parameter to the hardware scheduling module corresponding to that parameter; and the hardware scheduling module sets the corresponding hardware parameter to the target hardware parameter.
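The policy-to-parameter lookup and per-subsystem dispatch described here can be sketched with a table. The policy names, parameter keys, and frequency values below are all invented placeholders; the patent only states that such a correspondence exists.

```python
# Hypothetical target correspondence between resource scheduling policies
# and hardware parameters (frequencies in kHz, values invented).
TARGET_CORRESPONDENCE = {
    "first_policy":  {"cpu_khz": 1_800_000, "gpu_khz": 400_000, "ddr_khz": 1_600_000},
    "second_policy": {"cpu_khz": 1_200_000, "gpu_khz": 250_000, "ddr_khz": 1_066_000},
}

class HardwareScheduler:
    """Per-subsystem hardware scheduling module: applies whatever
    parameter the resource scheduling module sends it."""
    def __init__(self):
        self.current = None

    def set_param(self, value):
        self.current = value

def apply_policy(policy: str, schedulers: dict):
    """Resource scheduling module: look up the target hardware parameters
    for the policy and send each one to its matching hardware scheduler."""
    for name, value in TARGET_CORRESPONDENCE[policy].items():
        schedulers[name].set_param(value)
```

Keeping the correspondence in one table means switching performance modes is a single lookup followed by a fan-out to the CPU, GPU, and DDR schedulers.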
In another possible implementation, the target hardware parameters further include a preset duration, and the method further includes: in a case where the hardware scheduling module receives no hardware parameter within the preset duration, the hardware scheduling module sets the corresponding hardware parameter to the hardware parameter of the first performance mode. In this way, the hardware scheduling module falls back to the parameters of the lower-power first performance mode when no hardware parameter has been received for the preset duration, which helps reduce the power consumption of the terminal device.
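This timeout fallback behaves like a watchdog on the hardware scheduling module. The sketch below uses logical time passed in by the caller rather than a real timer, and the frequency value is an invented placeholder.

```python
class FallbackScheduler:
    """Hardware scheduling module with the timeout fallback described above:
    if no new hardware parameter arrives within `preset_ms`, revert to the
    first performance mode's parameter."""

    FIRST_MODE_PARAM = 1_200_000  # assumed first-mode frequency (kHz)

    def __init__(self, preset_ms: float):
        self.preset_ms = preset_ms
        self.param = self.FIRST_MODE_PARAM
        self.last_update_ms = 0.0

    def set_param(self, value, now_ms: float):
        self.param = value
        self.last_update_ms = now_ms

    def tick(self, now_ms: float):
        """Called periodically; applies the fallback when the last
        received parameter is older than the preset duration."""
        if now_ms - self.last_update_ms > self.preset_ms:
            self.param = self.FIRST_MODE_PARAM
```

The fallback guarantees the hardware cannot be left pinned at a high frequency if the resource scheduling module stops sending updates.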
In a second aspect, an embodiment of the present application provides a terminal device, which includes a processor and a memory, where the processor is configured to call a computer program in the memory to execute the method according to the first aspect.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, which stores computer instructions that, when executed on a terminal device, cause the terminal device to perform the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a chip comprising a processor, the processor being configured to call a computer program in a memory to perform the method according to the first aspect.
It should be understood that the second to fourth aspects of the present application correspond to the technical solution of the first aspect; the beneficial effects achieved by these aspects and their corresponding possible implementations are similar and are not described again.
Drawings
Fig. 1 is a schematic structural diagram of a hardware system of a terminal device according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a terminal device software system according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a frame processing method, in a possible implementation, used during interface display by a terminal device in a usage scenario with a high requirement on fluency;
fig. 4 is a schematic process diagram of interaction among modules in the frame processing method according to the embodiment of the present application;
fig. 5 is a schematic view of an interaction flow of modules in a terminal device according to an embodiment of the present application;
fig. 6 is a schematic process diagram for triggering and starting a drawing application according to an embodiment of the present application;
fig. 7 is a schematic flowchart of a frame processing method according to an embodiment of the present application;
FIG. 8 is a diagram illustrating a trigger drawing application displaying the result of a drawing operation according to an embodiment of the present application;
fig. 9 is a timing chart of frame processing performed by a terminal device according to the fourth performance mode in the embodiment of the present application.
Detailed Description
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same or similar items having substantially the same function and effect. For example, a first chip and a second chip are only distinguished as different chips, with no limitation on their order. Those skilled in the art will appreciate that the terms "first", "second", and the like do not limit quantity or order of execution, and do not indicate relative importance.
It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
The frame processing method provided by the embodiments of the present application can be applied to a terminal device with a display function. A terminal device may also be referred to as a terminal, user equipment (UE), a mobile station (MS), a mobile terminal (MT), and so on. The terminal device may be a mobile phone, a smart TV, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote surgery (remote medical), a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, and so on. The embodiments of the present application do not limit the specific technology or specific device form adopted by the terminal device.
In order to better understand the embodiments of the present application, the following describes the structure of the terminal device according to the embodiments of the present application:
fig. 1 shows a schematic configuration diagram of a terminal device 100. The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the terminal device 100. In other embodiments of the present application, terminal device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The cache may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs that instruction or data again, it can be fetched directly from the cache, avoiding repeated accesses, reducing the waiting time of the processor 110 and thereby improving the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the terminal device 100.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be bidirectional and converts the data to be transmitted between serial and parallel forms. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example, the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through a UART interface to implement the Bluetooth function.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a display screen serial interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture function of terminal device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the terminal device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 100, to transmit data between the terminal device 100 and a peripheral device, or to connect earphones and play audio through them. The interface may also be used to connect other terminal devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is an illustrative description, and does not limit the structure of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the terminal device 100. The charging management module 140 may also supply power to the terminal device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The antennas in terminal device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor displays images or video via the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the terminal device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the terminal device 100 can communicate with the network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. GNSS may include Global Positioning System (GPS), global navigation satellite system (GLONASS), beidou satellite navigation system (BDS), quasi-zenith satellite system (QZSS), and/or Satellite Based Augmentation System (SBAS).
The terminal device 100 implements a display function by the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used for displaying images, displaying videos, receiving slide operations, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The terminal device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, and the optical image is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the terminal device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used for processing digital signals; in addition to digital image signals, it can process other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created and used during use of the terminal device 100 (such as a preset identifier, a target correspondence, and the like). In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like. The processor 110 executes various functional applications and data processing of the terminal device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The pressure sensor 180A is used for sensing a pressure signal and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The terminal device 100 determines the intensity of the pressure from the change in the capacitance. When a touch operation is applied to the display screen 194, the terminal device 100 detects the intensity of the touch operation from the pressure sensor 180A. The terminal device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions.
The gyro sensor 180B may be used to determine the motion attitude of the terminal device 100. In some embodiments, the angular velocity of the terminal device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the terminal device 100, calculates the distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the terminal device 100 through a reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used for navigation and motion-sensing gaming scenarios.
The fingerprint sensor 180H is used to collect a fingerprint. The terminal device 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the terminal device 100 executes a temperature processing policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the terminal device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the terminal device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown of the terminal device 100 due to low temperature. In still other embodiments, when the temperature is below a further threshold, the terminal device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown due to low temperature.
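The three-threshold temperature processing policy above can be sketched as follows. This is an illustrative sketch only; the threshold values and action names are assumptions, not taken from the embodiment.

```python
# Hypothetical temperature processing policy: throttle above a high threshold,
# heat the battery below a low threshold, and additionally boost the battery
# output voltage below a critical low threshold. All values are invented.
HIGH_TEMP_C = 45.0       # above this: reduce nearby processor performance
LOW_TEMP_C = 0.0         # below this: heat the battery
CRITICAL_LOW_C = -10.0   # below this: also boost the battery output voltage

def temperature_policy(temp_c: float) -> list:
    """Return the list of actions taken for one temperature reading."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("reduce_processor_performance")
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")
    if temp_c < CRITICAL_LOW_C:
        actions.append("boost_battery_voltage")
    return actions
```

Note that the two low-temperature actions are cumulative here: a reading below the critical threshold triggers both battery heating and voltage boosting.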
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the terminal device 100, different from the position of the display screen 194.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The terminal device 100 may receive key input and generate key signal input related to user settings and function control of the terminal device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, and the like) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the terminal device 100 by being inserted into or pulled out of the SIM card interface 195. The terminal device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The terminal device 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the terminal device 100 employs an eSIM, that is, an embedded SIM card. The eSIM card may be embedded in the terminal device 100 and cannot be separated from it.
The software system of the terminal device 100 may adopt a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, a cloud architecture, or the like. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the terminal device 100.
Fig. 2 is a block diagram of a hierarchical structure of a terminal device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, taking a terminal device using the Android system as an example, the layers are, from top to bottom, an application layer, an application framework layer, an Android runtime and system library layer, a kernel layer, and a hardware layer.
The application layer may include a series of application packages. As shown in FIG. 2, the application packages may include camera, calendar, notepad, game, drawing, navigation, launcher, and video applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, an activity manager, a location manager, a package manager, a touch message module, a resource manager, a resource scheduling module, and a view system, among others.
A Window Manager (WMS) is used to manage the window program. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The activity manager is used for managing the life cycle of each application program and the navigation back function. It is responsible for creating the Android main thread and maintaining the life cycle of each application program.
The location manager is used to provide location services for applications, including querying the last known location, registering and deregistering for periodic location updates, and the like.
The package manager is used for program management within the system, for example: application installation, uninstallation, upgrade, and the like.
The touch message module is used for determining a touch instruction and forwarding the touch instruction to an application of an application layer and the resource scheduling module.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The resource scheduling module is configured to determine a resource scheduling policy, and to adjust the central processing unit (CPU) frequency, the GPU frequency, the read/write rate of the double data rate (DDR) memory, and the like according to the resource scheduling policy.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the function libraries that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection. A display composition process (e.g., a SurfaceFlinger, SF) also runs in the virtual machine. The display composition process is used to control the composition of images.
The system library may include a plurality of functional modules, for example: the image drawing module, the image rendering module, the image synthesis module, and the like.
The image drawing module is used for drawing two-dimensional or three-dimensional images. The image rendering module is used for rendering two-dimensional or three-dimensional images. The image synthesis module is used for synthesizing two-dimensional or three-dimensional images.
The kernel layer is a layer between hardware and software. The kernel layer is used for driving the hardware so that the hardware works. The kernel layer at least includes a touch driver, a display driver (e.g., an LCD/LED driver), a CPU scheduling module, a GPU scheduling module, a capacitance processing module, a double data rate (DDR) scheduling module, a sensor driver, and the like.
The capacitance value processing module is configured to determine a touch instruction according to the magnitude of the capacitance value change.
The DDR scheduling module is configured to receive configuration parameters of the double data rate memory and configure the DDR according to the configuration parameters.
The CPU scheduling module is configured to receive configuration parameters of the CPU and configure the CPU according to the configuration parameters.
The GPU scheduling module is configured to receive configuration parameters of the GPU and configure the GPU according to the configuration parameters.
The hardware may be a touch panel, CPU, GPU, memory, DDR, bluetooth device, camera device, sensor device, etc.
For ease of understanding, concepts related to the embodiments of the present application and the display flow of the terminal device are described below with reference to examples.
1. Frame: refers to a single picture, the smallest unit in interface display. A frame can be understood as a still picture; displaying a number of consecutive frames in rapid succession can create the illusion of motion.
2. Frame rate: the number of frames refreshed in 1 second, which can also be understood as the number of times per second the graphics processor in the terminal device refreshes the picture. A high frame rate produces smoother and more realistic animation; the greater the number of frames per second, the more fluid the displayed motion.
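The frame rate directly determines the time budget per frame: at a given frame rate, measurement/layout, drawing, rendering, and composition must all complete within one frame interval. A minimal sketch of this relationship (not part of the embodiment):

```python
def frame_interval_ms(frame_rate_hz: float) -> float:
    """Per-frame time budget in milliseconds for a given frame rate.
    At 60 Hz this is about 16.67 ms; at 120 Hz about 8.33 ms."""
    return 1000.0 / frame_rate_hz
```

The higher the frame rate, the tighter the budget, which is why fluency-critical scenes may require a higher performance mode.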
It should be noted that, before the interface displays the frame, processes such as drawing, rendering, and composition are usually required.
3. Frame drawing: refers to drawing pictures on the display interface. The display interface may be composed of one or more views, and each view may be drawn by a visual control of the view system. Each view is composed of sub-views, and each sub-view corresponds to a widget in the view; for example, one sub-view corresponds to a symbol in a picture view.
4. Frame rendering: performing coloring operations on the drawn view, adding 3D effects, and the like. For example, the 3D effect may be a light effect, a shadow effect, or a texture effect.
5. Frame composition: the process of compositing one or more rendered views into a display interface.
With the development of terminal technology, the performance of various terminal devices (such as mobile phones) is better and better. The demands of consumers on the man-machine interaction performance of the terminal equipment are also increasing. Among them, fluency is an important human-computer interaction performance.
Fluency may be embodied as the length of the delay from the moment a user inputs an operation to the terminal device to the moment the terminal device displays the image corresponding to that operation. For example, the user operation may be an operation input through a mouse or a key; alternatively, it may be a touch operation on the touch screen. The delay time may be referred to as the response delay of the terminal device. For example, when the user operation is a touch operation, the delay time may be referred to as the touch response delay. The longer the delay time, the poorer the fluency (e.g., hand-following performance); the shorter the delay time, the better the fluency.
Currently, in a use scenario where the requirement on smoothness is high, for example, in an application running process where the requirement on smoothness is high, such as list sliding, writing with a stylus pen, or drawing with a drawing board, the terminal device may consume power quickly or generate heat.
Fig. 3 is a schematic flow chart of a frame processing method in an interface display process in a terminal device in a usage scenario with a high requirement on smoothness in a possible implementation manner.
In response to a touch event (for example, a user's finger touching the touch panel (TP), as shown in fig. 3), the process of the application program may call the main thread (corresponding to the UI thread in fig. 3) to perform measurement and layout of one or more layers corresponding to the touch event at the moment the vertical synchronization signal 1 (corresponding to Vsync1 in fig. 3) arrives, and then call the rendering thread to render the one or more layers. Then, in response to the completion of rendering, the hardware composer (HWC) calls the display composition process (corresponding to the SurfaceFlinger in fig. 3) to perform layer composition on the one or more rendered layers to obtain a frame. Finally, the display screen may refresh and display the frame (corresponding to frame 1 in fig. 3) at the moment the vertical synchronization signal 2 (corresponding to Vsync2 in fig. 3) arrives.
The touch panel in fig. 3 may be integrated in the display screen 194 of the terminal device 100 shown in fig. 1. The TP is also referred to as a touch sensor, such as the touch sensor 180K described above. The TP may periodically detect a touch operation of the user. After the TP detects the touch operation, the vertical synchronization signal 1 may be awakened to trigger the application program process to perform layer drawing and rendering based on the vertical synchronization signal 1, and the completion of rendering triggers the display composition process to perform layer composition.
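The Vsync-driven timing above can be sketched as follows. The stage durations are invented numbers; only the ordering (draw and render starting at Vsync1, composition after rendering, display at the next Vsync boundary) follows the description of fig. 3.

```python
import math

VSYNC_PERIOD_MS = 16.67  # assumed 60 Hz display

def display_latency_ms(draw_ms: float, render_ms: float, compose_ms: float) -> float:
    """Time from Vsync1 (which started drawing) to the Vsync at which the
    composed frame can be displayed. The frame is shown at the first Vsync
    boundary after drawing, rendering, and composition have all finished."""
    ready = draw_ms + render_ms + compose_ms
    vsyncs = math.ceil(ready / VSYNC_PERIOD_MS)
    return vsyncs * VSYNC_PERIOD_MS
```

If all three stages finish within one synchronization period, the frame is displayed one Vsync after it started; spilling past the period boundary costs a whole additional period, which is why completing the pipeline within one period matters for fluency.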
It can be understood that, in the frame processing method shown in fig. 3, when the terminal device determines that the focus application has a high requirement on fluency, the resource scheduling module of the terminal device may switch the terminal device to a high performance mode through a Boost technique; in the high performance mode, the terminal device processes touch instructions more efficiently. The focus application is the application corresponding to the touch operation received by the terminal device, that is, the application on which the actual focus rests. In this way, the terminal device can complete the measurement, layout, drawing, rendering, and composition of the layers within one synchronization period. The time for the terminal device to process a touch instruction is shortened, thereby improving the display fluency of the terminal device. However, when the terminal device uses the high performance mode for frame processing throughout the running of the focus application, the power consumption of the terminal device is relatively high. Further, problems such as rapid power drain and heat generation may occur.
In view of this, an embodiment of the present application provides a frame processing method in which, when the terminal device performs processing such as measurement, layout, drawing, rendering, and composition of a layer, the terminal device may identify the layer and/or the usage scene. Different performance modes may be used when processing different layers of the same application, and different performance modes may be used when processing the same layer in different usage scenes. It can be understood that the higher the performance of a performance mode, the higher the power consumption of the terminal device; therefore, a higher performance mode is used for layers and scenes with higher fluency requirements, and a lower performance mode for those with lower requirements, so that the power consumption of the terminal device is reduced on the premise of maintaining fluency.
For convenience of understanding, the following describes a process of interaction between the modules involved in the frame processing method provided in the embodiment of the present application with reference to fig. 4.
Fig. 4 is a schematic diagram illustrating a process of interaction between modules in a frame processing method according to an embodiment of the present application.
As shown in fig. 4, the system of the terminal device may include a hardware layer, a kernel layer, a system library, an application framework layer, and an application layer. The hardware layer includes: memory, touch panel, CPU, GPU, and DDR. The kernel layer includes: the capacitance value processing module, the touch driving module, the CPU scheduling module, the GPU scheduling module, and the DDR scheduling module. The system library includes the SF. The application framework layer includes the touch message module, the resource scheduling module, and the WMS. The application layer includes applications. The applications may include system applications inherent to the operating system of the terminal device, such as a system desktop application and a system interface (system UI) application, as well as third-party applications installed on the operating system, such as a drawing application, a game application, a chat application, and the like.
In fig. 4, after the terminal device is powered on, the touch driving module may scan the capacitance sampling values of each area on the touch panel in real time.
When the touch driving module, while scanning, finds that the capacitance sampling value of a partial area on the touch panel changes and the change amplitude of the capacitance value is greater than or equal to a preset reporting threshold, the touch driving module sends a capacitance value change message to the capacitance value processing module. The capacitance value change message carries the position of the area where the capacitance sampling value changes, the capacitance value change amplitude, the device type, the touch type, and a timestamp. The device type is used to indicate whether the touch object is a hand or a stylus pen; the touch type includes click (down), lift (up), and move (move); and the timestamp refers to the time at which the touch driving module scanned the change in the capacitance sampling value on the touch panel. For example, the reporting threshold may be set to 99%.
When the capacitance processing module receives a capacitance change message, it first determines whether the capacitance change amplitude carried by the message is greater than a touch threshold (for example, 99% or 98%). If so, the capacitance processing module sends a touch instruction (touch down) to the touch message module, so that the touch instruction is forwarded through the touch message module to an application of the application layer; if not, the capacitance processing module does not process it.
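The two-stage filtering described above, the driver reporting only changes at or above the reporting threshold, and the processing module forwarding only changes above the touch threshold, can be sketched as follows. The threshold values follow the 99% examples in the text; the message fields are simplified away.

```python
REPORT_THRESHOLD = 0.99  # touch driver reports changes at or above this
TOUCH_THRESHOLD = 0.99   # processing module forwards changes strictly above this

def driver_reports(change_amplitude: float) -> bool:
    """The touch driver sends a capacitance value change message only when
    the change amplitude reaches the reporting threshold."""
    return change_amplitude >= REPORT_THRESHOLD

def processing_forwards(change_amplitude: float) -> bool:
    """The capacitance processing module forwards a touch instruction only
    when the amplitude exceeds the touch threshold; otherwise it is dropped."""
    return driver_reports(change_amplitude) and change_amplitude > TOUCH_THRESHOLD
```

Note the asymmetry: the driver reports at or above its threshold, while the processing module requires the amplitude to be strictly greater than the touch threshold, so a change exactly at 99% is reported but not forwarded.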
The touch instruction carries the following information: the coordinates and time of the contact area, as well as the device type, touch type, and timestamp. The contact area refers to the area on the touch panel that is in contact with the touch object, and the time of the contact area refers to the time at which the touch object contacts the touch panel. When the touch object contacts multiple areas on the touch panel, the touch instruction may be a list instruction that includes the coordinates and time of each contact area.
Optionally, when the touch panel is integrated with a pressure sensor, the touch instruction may further carry a pressure value, that is, the pressure of the contact between the touch object and the touch panel.
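One possible data layout for the touch instruction fields listed above is sketched below. The field names and types are assumptions for illustration; the embodiment names only the information carried, not its encoding.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ContactArea:
    """One area of contact between the touch object and the touch panel."""
    x: float
    y: float
    time_ms: int                    # time the touch object contacted this area
    pressure: Optional[float] = None  # only when a pressure sensor is integrated

@dataclass
class TouchInstruction:
    """A touch instruction; with multiple contact areas it acts as the
    list instruction described above."""
    device_type: str                # "hand" or "stylus"
    touch_type: str                 # "down", "up" or "move"
    timestamp_ms: int
    areas: List[ContactArea] = field(default_factory=list)
```

A multi-finger touch would simply populate `areas` with one entry per contact area.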
Optionally, the touch message module also sends the touch instruction to the resource scheduling module.
After receiving the touch instruction, the focus application processes it according to its own processing logic and sends an application layer layout to the window manager. The application layer layout includes the window identifiers corresponding to the application. A window identifier may include the package name and the window name of the application. The window identifiers correspond one-to-one to the layers of the application.
The window manager sends each window identifier corresponding to the application to the resource scheduling module.
The resource scheduling module determines a resource scheduling policy according to the touch instruction, the window identifiers corresponding to the application, and the preset identifier. The touch instruction includes the time and the area corresponding to the touch operation.
The resource scheduling module sends a resource switching instruction to the display composition process. The resource switching instruction is used to instruct the display composition process to compose the frame to be displayed corresponding to the focus application once it determines that rendering of the layers corresponding to the focus application is complete. The resource scheduling module also determines the scheduling parameters of the processor and the DDR according to the resource scheduling policy, and sends the corresponding scheduling parameters to the processor scheduling module and the DDR scheduling module to schedule the resources. The processor may be a CPU and/or a GPU.
The SF monitors the rendering threads of the focus application. When the layer rendering corresponding to the focus application is complete, the SF composes one or more layers to obtain the frame to be displayed, which can then be shown on the display screen of the terminal device when the vertical synchronization signal arrives.
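The compose-on-render-completion behavior can be sketched as a schematic event sequence (this is an illustration of the idea only, not the real SurfaceFlinger interface):

```python
def compose_when_rendered(events):
    """events: ordered markers such as ["render_done", "vsync"].
    Composition happens as soon as rendering completes; the composed frame
    is then presented on the next vertical synchronization signal."""
    composed = False
    for ev in events:
        if ev == "render_done":
            composed = True      # compose immediately, without waiting for Vsync
        elif ev == "vsync" and composed:
            return "displayed"   # present the composed frame on this Vsync
    return "pending"
```

Because composition is triggered by render completion rather than by Vsync, a frame whose rendering finishes just before a Vsync can still be presented on that Vsync.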
The following describes in detail the frame processing method provided in the embodiments of the present application with specific embodiments. The following embodiments may be combined with each other and may not be described in detail in some embodiments for the same or similar concepts or processes.
Next, an interaction flow of a module in a terminal device is described with reference to fig. 5, where fig. 5 is a schematic view of an interaction flow of a module in a terminal device according to an embodiment of the present application. As shown in fig. 5, the method may include:
s500: the touch driving module scans the capacitance sampling value of the touch panel.
In a possible implementation manner, the touch driving module scans a capacitance sampling value of the touch panel after the terminal device is started.
S501: the touch driving module sends a capacitance value change message to the capacitance value processing module.
In this embodiment of the application, the capacitance value change message includes: the position of the area where the capacitance sampling value changed, the capacitance value change amplitude, the device type, the touch type, the timestamp, and the like. The device type indicates whether the touch object is a hand or a stylus, the touch type includes press (down), lift (up), and move (move), and the timestamp is the time at which the touch driving module scanned the change in the capacitance sampling value of the touch panel.
In a possible implementation manner, in the scanning process of the touch driving module, the touch object approaches the touch panel to cause a capacitance sampling value of a corresponding area of the touch panel to change, and after the touch driving module scans that the capacitance sampling value of any area on the touch panel changes, the touch driving module sends a capacitance value change message to the capacitance value processing module.
Illustratively, the capacitance value change message carries a capacitance value change amplitude of 100%. It can be understood that the larger the change amplitude of the capacitance value, the higher the probability that the touch object has touched the touch screen.
S502: the capacitance processing module judges whether the capacitance variation amplitude in the capacitance variation message is larger than the touch threshold. If yes, go to S503. If not, the process is ended.
In a possible implementation manner, after receiving the capacitance value change message, the capacitance value processing module executes S502.
Based on the example in S501, the capacitance value change amplitude carried in the capacitance value change message is 100%, which is greater than the 95% touch threshold. Therefore, the capacitance value processing module determines that the change amplitude is greater than the touch threshold and executes S503.
S503: and the capacitance value processing module sends a touch instruction to the touch message module.
S504: the touch message module sends a first touch instruction to the focus application.
S505: and the touch message module sends a second touch instruction to the resource scheduling module.
In this embodiment of the application, when the touch message module receives the touch instruction, it executes S504 to send the first touch instruction to the focus application and executes S505 to send the second touch instruction to the resource scheduling module. The touch instructions (the first touch instruction and the second touch instruction) include: the coordinates of the contact area, the device type, the timestamp, and the like. Optionally, when the touch panel is integrated with a pressure sensor, the touch instruction may further include a pressure value, that is, the contact pressure when the touch object is in contact with the touch panel. The device type indicates whether the touch object is a hand or a stylus.
S506: the focus application executes processing logic corresponding to the first touch instruction.
In a possible implementation manner, after receiving the first touch instruction, the focus application executes processing logic corresponding to the first touch instruction.
For example, the first touch instruction is used to open a focus application, or the first touch instruction is used to switch to a page of the focus application.
S507: and the focus application sends the window identification corresponding to the focus application to the WMS.
In this embodiment of the present application, the window identifier may include a package name and a window name of the focus application.
Based on the example of S506, the main thread corresponding to the focus application sends the package name of the focus application to the WMS, and the window name of the first interface of the focus application is com.
S508: and the WMS sends the window identifier corresponding to the focus application to the resource scheduling module.
S509: and under the condition that the window identifier corresponding to the focus application comprises a preset identifier, the resource scheduling module judges whether the first touch instruction is used for opening the focus application or switching a page of the focus application according to the window identifier corresponding to the focus application, and if so, executes S510. If not, go to S512.
In the embodiment of the application, the preset identifier includes an application package name and a window name. The resource scheduling module may determine whether the window identifier corresponding to the focus application includes a preset identifier according to the preset identifier.
In one example, the preset flags are as shown in table 1 below:
TABLE 1
(The contents of Table 1 are provided as an image in the original publication.)
Based on the example of S507, in the case that the package name of the focus application is com.
It is to be understood that, in the case that the window name between the applications is unique, the window identifier may also include only the window name of the focus application, which is not limited in this embodiment of the present application.
In a possible implementation manner, when the window identifier corresponding to the focus application includes a preset identifier, the resource scheduling module determines that the first touch instruction is used for opening the focus application or switching to a page of the focus application when the received window identifier corresponding to the focus application is different from the historical window identifier. The historical window identifier is the window identifier received by the resource scheduling module last time.
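Under assumed data shapes (hypothetical package and window names standing in for the preset identifiers of Table 1), the decision in S509 can be sketched as:

```python
# Hypothetical (package name, window name) pairs standing in for the preset
# identifiers of Table 1; the real values are device configuration.
PRESET_IDS = {
    ("com.example.draw", "com.example.draw.MainWindow"),
    ("com.example.note", "com.example.note.CanvasWindow"),
}

def is_open_or_switch(window_id, history_window_id):
    """S509: decide whether the first touch instruction opened the focus
    application or switched its page."""
    if window_id not in PRESET_IDS:
        return False  # no preset identifier: handled by the default path
    # A window identifier that differs from the last one received indicates
    # the application was just opened or switched to a new page.
    return window_id != history_window_id
```

The "historical window identifier" here is simply the identifier received on the previous call, as the text describes.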
S510: and the resource scheduling module sends a first indication message to the display composition process under the condition that the device type included in the second touch instruction is the stylus. The first indication message is used for indicating the display composition process to perform frame composition under the condition that the rendering of the layer corresponding to the focus application is completed.
In one example, as shown in fig. 6, the terminal device receives a click operation of a user on an icon of the drawing application by using a stylus pen, and in the process of displaying the first interface of the drawing application in response to the click operation, the resource scheduling module sends a first indication message to the display composition process.
S511: and the resource scheduling module schedules the resources according to the first resource scheduling strategy under the condition that the device type included in the second touch instruction is the stylus pen.
In the embodiment of the present application, the layer corresponding to the focus application is the layer represented by the window identifier corresponding to the focus application. The first resource scheduling policy is used to switch the performance mode of the terminal device to the first performance mode. The performance modes of the terminal device may include a first performance mode, a second performance mode, and a third performance mode; at least one of the processor frequency or the DDR frequency differs between the performance modes. Illustratively, the processor frequency in the second performance mode is less than or equal to the processor frequency in the first performance mode, and the DDR frequency in the second performance mode is less than or equal to the DDR frequency in the first performance mode. The processor frequency in the first performance mode is less than or equal to the processor frequency in the third performance mode, and the DDR frequency in the first performance mode is less than or equal to the DDR frequency in the third performance mode.
It should be noted that the performance modes of the terminal device can be divided into more than three levels. For example, the performance modes of the terminal device may further include a fourth performance mode. In the fourth performance mode, the display composition process performs layer composition based on the vertical synchronization signal (Vsync), whereas in the first to third performance modes, the display composition process performs layer composition when layer rendering completes. The hardware operating frequency in the fourth performance mode is less than the hardware operating frequency in the third performance mode and less than the hardware operating frequency in the first performance mode, where the hardware operating frequency includes at least one of the frequency of the central processing unit, the frequency of the graphics processor, or the frequency of the double data rate memory.
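The mode ordering can be summarized in a small table; the frequency values below are invented placeholders that merely respect the stated inequalities, and only the ordering and the composition trigger come from the text:

```python
# mode -> (processor frequency in MHz, DDR frequency in MHz, composition trigger)
PERF_MODES = {
    "fourth": (1000,  600, "vsync"),        # lowest; composes on Vsync
    "second": (1200,  800, "render_done"),
    "first":  (1800, 1600, "render_done"),
    "third":  (2400, 2133, "render_done"),  # highest
}

def check_ordering(modes):
    """Verify second <= first <= third for both frequencies, and that the
    fourth mode sits below the first and third modes."""
    s, f, t, fo = modes["second"], modes["first"], modes["third"], modes["fourth"]
    return (s[0] <= f[0] <= t[0] and s[1] <= f[1] <= t[1]
            and fo[0] < f[0] and fo[0] < t[0])
```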
In a possible implementation manner, the resource scheduling module sends a parameter instruction to the hardware resource according to the first resource scheduling policy to schedule the hardware resource when the device type included in the second touch instruction is the stylus pen. The hardware resources comprise at least one of a CPU, a GPU or a DDR.
S512: and the resource scheduling module schedules resources according to the second touch instruction, the preset identifier and the window identifier corresponding to the focus application.
In this embodiment of the application, the second touch instruction is the second touch instruction most recently received by the resource scheduling module.
In a possible implementation manner, when the device type included in the second touch instruction is the stylus and the time difference between the timestamp included in the second touch instruction and the system clock in the terminal device is less than the first time threshold, the resource scheduling module schedules resources according to the second resource scheduling policy. The second resource scheduling policy is used to switch the performance mode of the terminal device to the third performance mode.
Based on the example of fig. 6, the terminal device receives a drawing operation of a user on a first interface of a drawing application by using a stylus pen, and in the process of responding to the drawing operation to display a next frame of the first interface of the drawing application, the resource scheduling module schedules resources according to a second resource scheduling policy.
In another possible implementation manner, when the device type included in the second touch instruction is the stylus, and the time difference between the timestamp included in the second touch instruction and the system clock in the terminal device is greater than or equal to the first time threshold and less than the second time threshold, the resource scheduling module schedules resources according to the first resource scheduling policy. The first resource scheduling policy is used to switch the performance mode of the terminal device to the first performance mode.
In another possible implementation manner, when the device type included in the second touch instruction is the stylus and the time difference between the timestamp included in the second touch instruction and the system clock in the terminal device is greater than or equal to the second time threshold, the resource scheduling module schedules resources according to the third resource scheduling policy. The third resource scheduling policy is used to switch the performance mode of the terminal device to the second performance mode.
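The three timestamp branches above can be collected into one selection function; the threshold values and return labels are assumptions, since the text fixes only the ordering of the branches:

```python
FIRST_TIME_THRESHOLD = 0.2   # seconds; hypothetical value
SECOND_TIME_THRESHOLD = 1.0  # seconds; hypothetical value

def select_policy(device_type, event_timestamp, system_clock):
    """Map the time since the last stylus touch to a resource scheduling policy."""
    if device_type != "stylus":
        return None  # this embodiment only distinguishes stylus input here
    idle = system_clock - event_timestamp
    if idle < FIRST_TIME_THRESHOLD:
        return "second_policy"  # switches to the third (highest) performance mode
    if idle < SECOND_TIME_THRESHOLD:
        return "first_policy"   # switches to the first (intermediate) performance mode
    return "third_policy"       # switches to the second (lowest of the three) mode
```

The shorter the idle time, the more likely the user is still interacting, so the higher the performance mode chosen.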
It is understood that each instance of the resource scheduling module scheduling resources according to a resource scheduling policy may be referred to as a mode switching operation. A possible implementation of switching performance modes is as follows:
the configuration file of the terminal device stores working parameters of a CPU, a GPU and a DDR under each performance mode. And after determining the resource scheduling strategy, the resource scheduling module reads the configuration file from the memory of the terminal equipment, and searches the working parameters of the CPU, the GPU and the DDR corresponding to the target resource scheduling strategy which is stored in advance in the configuration file. And then, the resource scheduling module sends parameter instructions carrying working parameters of corresponding hardware to the CPU scheduling module, the GPU scheduling module and the DDR scheduling module respectively, so that the CPU scheduling module, the GPU scheduling module and the DDR scheduling module configure the corresponding hardware according to the parameter instructions.
Further, each resource scheduling policy in the configuration file may be configured with a corresponding duration. When executing S511 and S512, the resource scheduling module may read the operating parameters and the duration of the target resource scheduling policy from the configuration file and send them to the scheduling module of the corresponding hardware through the parameter instruction.
After the CPU scheduling module, the GPU scheduling module, and the DDR scheduling module configure the parameters of the corresponding hardware according to the parameter instructions, if no new instruction is received once the duration elapses, each module automatically sets the operating parameters of its hardware back to those of the second performance mode. The advantage of this is that the time the terminal device spends in the higher performance mode is reduced, thereby reducing the power consumption of the terminal device.
Optionally, the configuration file may instead preset a maximum duration (for example, 250 ms). In this case, the CPU scheduling module, the GPU scheduling module, and the DDR scheduling module automatically adjust the operating parameters of the corresponding hardware back to those of the second performance mode when the time spent in the current performance mode reaches this maximum duration.
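The duration-based fallback can be sketched as follows, using an explicit clock value in place of a real timer; the class and method names are illustrative:

```python
class HardwareScheduler:
    """Applies a performance mode for a limited duration, then falls back."""

    def __init__(self):
        self.mode = "second"     # baseline: the second performance mode
        self.expires_at = None   # ms timestamp after which the mode reverts

    def apply(self, mode, duration_ms, now_ms):
        """Configure a mode with a duration, as carried by a parameter instruction."""
        self.mode = mode
        self.expires_at = now_ms + duration_ms

    def tick(self, now_ms):
        """Revert to the second performance mode if the duration elapsed
        without a new parameter instruction arriving."""
        if self.expires_at is not None and now_ms >= self.expires_at:
            self.mode = "second"
            self.expires_at = None
```

A new `apply` call before expiry simply restarts the window, which matches the "if no new instruction is received" wording above.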
The following describes the process of switching the performance mode by the resource scheduling module, taking scheduling resources according to the second resource scheduling policy as an example:
the working parameters corresponding to the second resource scheduling policy stored in the configuration file may include: duration: 50 milliseconds, the lowest frequency gear of the small kernel: 5, shifting; the lowest frequency gear of the large kernel is as follows: 4, shifting; the lowest frequency gear of the super-large kernel: 5, shifting; the lowest gear of the GPU: 6, shifting; DDR lowest gear: and 5, gear. Wherein, the small kernel, the large kernel and the super-large kernel are three kinds of cores in the CPU. The duration is 50 milliseconds, which is the duration of the preset high performance mode.
The higher the lowest gear corresponding to the CPU, the GPU and the DDR is, the higher the minimum frequency of the corresponding component during working is, the higher the driving voltage is, the higher the speed is, and the higher the power consumption is. In the above example, the minimum frequency for the small core in the 5 th rank is 1625000Hz, the minimum frequency for the large core in the 4 th rank is 2200000Hz, and the minimum frequency for the very large core in the 5 th rank is 1632000 Hz.
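The quoted gear-to-minimum-frequency values can be collected into a lookup (only the three frequencies stated above are reproduced; all other gears are omitted):

```python
# Minimum operating frequency (Hz) per (core type, gear), from the example above.
MIN_FREQ_HZ = {
    ("small_core", 5): 1_625_000,
    ("large_core", 4): 2_200_000,
    ("super_large_core", 5): 1_632_000,
}

def min_freq(core, gear):
    """Return the minimum operating frequency for a core at a gear, if known."""
    return MIN_FREQ_HZ.get((core, gear))
```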
The resource scheduling module sends a parameter instruction to the CPU scheduling module: lowest frequency gear of the small core: gear 5; lowest frequency gear of the large core: gear 4; lowest frequency gear of the super-large core: gear 5; duration: 50 milliseconds.
After receiving the parameter instruction, the CPU scheduling module sets the minimum operating frequency of the CPU's small core to the frequency corresponding to gear 5, sets the minimum operating frequency of the large core to the frequency corresponding to gear 4, and sets the minimum operating frequency of the super-large core to the frequency corresponding to gear 5. If the CPU scheduling module receives no new instruction within the 50 ms duration after receiving the parameter instruction, it automatically sets the CPU's operating parameters back to those of the terminal device in the second performance mode.
The resource scheduling module also sends the following parameter instruction to the GPU scheduling module: lowest GPU gear: gear 6; duration: 50 milliseconds.
After receiving the instruction, the GPU scheduling module calls the GPU driver to set the minimum operating frequency of the GPU to the frequency corresponding to gear 6 (for example, 385 MHz). The frequencies corresponding to the different GPU gears may be stored in the GPU scheduling module. If the GPU scheduling module receives no new instruction within the 50 ms duration after receiving the parameter instruction, it automatically sets the GPU's operating parameters back to those of the terminal device in the second performance mode.
The resource scheduling module also sends the following parameter instruction to the DDR scheduling module: lowest DDR gear: gear 5; duration: 50 milliseconds.
After receiving the instruction, the DDR scheduling module sets the minimum operating frequency of the DDR to the frequency corresponding to gear 5. The frequencies corresponding to the different DDR gears may be stored in the DDR scheduling module. If the DDR scheduling module receives no new instruction within the 50 ms duration after receiving the parameter instruction, it automatically sets the DDR's operating parameters back to those of the terminal device in the second performance mode.
After the CPU scheduling module, the GPU scheduling module, and the DDR scheduling module set these operating parameters, the effect is equivalent to switching the terminal device to the third performance mode.
S513: the display composition process listens to the rendering thread of the focused application.
S514: and the display composition process performs composition of one or more layers under the condition that rendering of the rendering thread of the focus application is completed, so as to obtain the frame to be displayed. The frame to be displayed may be a specific interface of the focus application, or the frame to be displayed is an image drawn by the focus application using a drawing and rendering process.
In this way, the display screen of the terminal device can display the frame to be displayed in the case where the vertical synchronization signal arrives.
It can be understood that, in the embodiment shown in fig. 5, the terminal device uses different performance modes for different layers and/or different usage scenarios on the premise that the device type is a stylus. Alternatively, the terminal device may not restrict the device type, in which case any device type may use different performance modes for different layers and/or different usage scenarios. This is not limited in the embodiments of the present application.
It should be noted that each of S509 to S514 above is executed when the window identifier corresponding to the focus application includes a preset identifier. When the window identifier corresponding to the focus application does not include a preset identifier, the terminal device processes the frame corresponding to the focus application using the existing scheme. Illustratively, in that case the terminal device performs frame processing in the fourth performance mode.
In the embodiment of the application, when the terminal device performs layer processing such as measurement layout, drawing, rendering, and composition, it can identify the layer and/or the usage scenario. Different performance modes can be used when processing different layers of the same application, and different performance modes can be used for the same layer in different usage scenarios. It can be understood that the higher the performance of a mode, the higher the power consumption of the terminal device. Therefore a higher performance mode is used where the smoothness requirement is high and a lower performance mode where the smoothness requirement is low, reducing the power consumption of the terminal device while maintaining smoothness.
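A minimal sketch of per-layer, per-scenario mode selection; the mapping entries are hypothetical, and the text specifies only that such a mapping exists:

```python
# (layer, usage scenario) -> performance mode; entries are hypothetical.
MODE_TABLE = {
    ("canvas_layer", "drawing"): "third",   # high smoothness requirement
    ("canvas_layer", "idle"):    "second",  # low smoothness requirement
    ("toolbar_layer", "drawing"): "first",
}

def mode_for(layer, scenario, default="fourth"):
    """Different layers of one application, and the same layer in different
    scenarios, may map to different performance modes."""
    return MODE_TABLE.get((layer, scenario), default)
```

The fallback to the fourth performance mode mirrors the behavior described for windows without a preset identifier.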
A frame processing method during an application operation process including a preset mark is described below with reference to the drawings.
Fig. 7 is a schematic flowchart of a frame processing method according to an embodiment of the present application, where the frame processing method shown in fig. 7 includes the following steps:
s700: the terminal equipment receives a first user operation on an icon of the application and starts the application in response to the first user operation.
In the embodiment of the application, a Launcher logical thread, an incubator process and system services run in a system after the terminal device is powered on. The system service comprises an activity manager, a window manager and the like.
In a possible implementation, the Launcher logical thread receives a start instruction for the application and, in response, sends a process creation instruction for the application to the system service; the system service then instructs the incubator process to create a process for the application.
Illustratively, the terminal device receives a click operation on an icon of the drawing application, and the drawing application is started in response to the click operation.
S701: the terminal device determines whether the window identifier corresponding to the application includes a preset identifier, if so, executes S702, and if not, executes S710.
In a possible implementation manner, a window manager in the terminal device obtains a window identifier corresponding to the application, and determines whether the obtained window identifier includes a preset identifier.
Based on the example in S700, the window identifier corresponding to the first interface of the drawing application is a preset identifier. As shown in fig. 8, the first interface includes a first layer 801 and a second layer 802, where the first layer 801 includes a fixed display toolbar and the second layer 802 is a layer for displaying a contact trace.
S702: the terminal device switches to the first performance mode.
In a possible implementation manner, a resource scheduling module in the terminal device sends a first indication message to the display composition process. The first indication message is used to instruct the display composition process to disable the mechanism for calibrating the software vertical synchronization signal according to the hardware vertical synchronization signal, and also to instruct the display composition process to monitor the rendering thread of the focus application. The resource scheduling module in the terminal device then schedules resources according to the first resource scheduling policy, which is used to switch the terminal device to the first performance mode.
In this embodiment of the application, for the specific implementation that the resource scheduling module in the terminal device schedules resources according to the first resource scheduling policy, reference is made to the description of the resource scheduling module in S512 scheduling resources according to the resource scheduling policy, which is not described again.
It can be understood that, in the case that the terminal device determines that the first user operation is for opening an application including the preset identifier or for switching to an application including the preset identifier, the terminal device may switch to the first performance mode.
S703: and the terminal equipment captures whether the handwriting pen touches the white list application layer. If yes, go to step S704, otherwise go to step S705.
In the embodiment of the application, the whitelist application layer is the layer corresponding to the window identifier represented by the preset identifier.
In a possible implementation manner, while the terminal device runs the application including the preset identifier in the first performance mode, the terminal device receives a second user operation acting on a second interface. When the second interface includes a layer with the preset identifier, the terminal device determines that the stylus has touched the whitelist application layer.
Based on the example of fig. 8, the second layer 802 is the whitelist application layer. When the stylus touches the first layer 801, the terminal device determines that the stylus has not touched the whitelist application layer; when the stylus touches the second layer 802, the terminal device determines that the stylus has touched the whitelist application layer.
S704: the terminal device switches to the third performance mode.
In a possible implementation, the terminal device switches from the first performance mode to the third performance mode. The terminal device can then perform layer processing on the whitelist application layer in the third performance mode, where layer processing includes at least one of measurement layout, drawing, rendering, or composition.
For a possible implementation manner, reference is made to the description of the resource scheduling module in S512 scheduling resources according to the resource scheduling policy, which is not described again.
S705: and the terminal equipment judges whether the time length from the latest touch of the white list application layer is less than a time threshold value. If so, S706 is executed, and if not, S707 is executed.
In a possible implementation manner, if the second user operation is the latest user operation, the time difference between the timestamp of the second user operation and the system clock of the terminal device is greater than or equal to the first time threshold, and the terminal device determines that the time length from the latest touch of the white list application layer is less than the time threshold when the time difference is less than the second time threshold.
S706: the terminal device switches to the first performance mode.
Based on the assumption in S705, the terminal device switches from the third performance mode to the first performance mode.
In this way, when the duration for which the whitelist application layer has not been operated by the user is greater than or equal to the first time threshold and less than the second time threshold, the terminal device switches from the higher-power-consumption third performance mode to the lower-power-consumption first performance mode. This reduces the power consumption of the terminal device while still allowing the next operation of the user on the touch screen to be responded to quickly.
S707: the terminal device switches to the second performance mode.
In a possible implementation manner, the terminal device switches from the first performance mode to the second performance mode when determining that the time elapsed since the whitelist application layer was last touched is greater than or equal to the second time threshold.
In this way, when the duration for which the whitelist application layer has not been operated by the user is greater than or equal to the second time threshold, the terminal device switches from the relatively higher-power-consumption first performance mode to the lower-power-consumption second performance mode, further reducing the power consumption of the terminal device.
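The S703 to S707 loop can be sketched as a small decision function; the threshold values are hypothetical, and the branch for idle time below the first threshold is an assumption, since the flow simply re-enters S703 in that case:

```python
FIRST_THRESHOLD = 0.2   # seconds; hypothetical value
SECOND_THRESHOLD = 1.0  # seconds; hypothetical value

def next_mode(touching_whitelist_layer, idle_seconds):
    """Pick the performance mode for the current pass through S703-S707."""
    if touching_whitelist_layer:
        return "third"   # S704: active stylus input on a whitelisted layer
    if idle_seconds < FIRST_THRESHOLD:
        return "third"   # assumed: still within the active window
    if idle_seconds < SECOND_THRESHOLD:
        return "first"   # S706: recently idle, keep a responsive mode
    return "second"      # S707: long idle, lowest of the three modes
```

Longer idle times step the device down through progressively lower-power modes, which is the power-saving behavior the two preceding paragraphs describe.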
S708: the terminal equipment judges whether to quit the application. If yes, go to step S709. If not, S703 is executed.
In a possible implementation manner, the terminal device receives a third user operation, and determines to exit the application when the third user operation is used for displaying a third interface and a layer corresponding to the third interface does not include a layer with a preset identifier.
It should be noted that exiting the application in this embodiment means that the terminal device no longer displays the interface corresponding to the application. Exiting the application includes the application's process being cleaned up, or the application's process being switched to run in the background.
S709: the terminal device switches to the fourth performance mode.
In a possible implementation manner, the resource scheduling module in the terminal device sends a second indication message to the display composition process. The second indication message is used to indicate the display composition process to enable the mechanism for calibrating the software vertical synchronization signal according to the hardware vertical synchronization signal, and is also used to indicate the display composition process to perform frame composition according to the software vertical synchronization signal. The resource scheduling module in the terminal device schedules resources according to the fourth resource scheduling policy, where the fourth resource scheduling policy is used for switching the terminal device to the fourth performance mode.
In this embodiment of the application, for the specific implementation of the resource scheduling module in the terminal device scheduling resources according to the fourth resource scheduling policy, refer to the description of the resource scheduling module scheduling resources according to the resource scheduling policy in S512; details are not described again.
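The scheduling step can be pictured as a lookup from a resource scheduling policy to target hardware parameters that are then handed to the hardware scheduling module (compare claim 10). In this Python sketch the policy names follow the embodiment, while the parameter names and frequency values are invented for illustration:

```python
# Hypothetical mapping from resource scheduling policies to target hardware
# parameters; the parameter names and frequencies are illustrative.
POLICY_TO_HW_PARAMS = {
    "first":  {"cpu_khz": 2_400_000, "gpu_mhz": 700, "ddr_mhz": 2133},
    "fourth": {"cpu_khz": 1_200_000, "gpu_mhz": 350, "ddr_mhz": 1066},
}

def schedule_resources(policy, apply_param):
    """Look up the target hardware parameters for `policy` and hand each one
    to the hardware scheduling module through the `apply_param` callback."""
    params = POLICY_TO_HW_PARAMS[policy]
    for name, value in params.items():
        apply_param(name, value)
    return params
```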
S710: and the terminal equipment adopts the fourth performance mode to carry out frame processing.
Fig. 9 shows a timing diagram of frame processing performed by the terminal device in the fourth performance mode. In fig. 9, in response to a touch event (for example, a touch operation performed by a user on the touch panel), the process of the application program calls the main thread (corresponding to the UI thread in fig. 9) to measure and lay out one or more layers corresponding to the touch event at the moment when the vertical synchronization signal 1 (corresponding to Vsync1 in fig. 9) arrives, and then calls the rendering thread to draw and render the one or more layers.
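The pipeline of fig. 9 paces the processing stages by vertical synchronization signals: measurement/layout and rendering start at one Vsync, and composition follows at a later software Vsync. A simplified Python sketch of such a timeline, where the 60 Hz period and the exact stage placement are assumptions:

```python
VSYNC_PERIOD_MS = 16.6  # ~60 Hz refresh; the rate is an assumption

def frame_timeline(first_vsync_ms):
    """Place the stages of fig. 9 on the Vsync timeline: the UI thread
    measures and lays out and the render thread draws at one Vsync, and
    the display composition process composes at the next software Vsync."""
    return [
        (first_vsync_ms, "ui_thread: measure + layout"),
        (first_vsync_ms, "render_thread: draw + render"),
        (first_vsync_ms + VSYNC_PERIOD_MS, "display_composition: compose"),
    ]
```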
In theory, the power consumption of the terminal device in the fourth performance mode may be lower than that in the second performance mode, which is not limited in this embodiment of the application.
In this embodiment of the application, during application startup, the terminal device switches to the second performance mode upon determining that the window identifier corresponding to the application includes the preset identifier. During application running, the terminal device detects whether the white-list application layer is touched by the stylus: it switches to the third performance mode while the layer is being touched, switches to the first performance mode when the stylus has left the touch screen for less than the duration threshold, and switches to the second performance mode when the stylus has left the touch screen for a duration greater than or equal to the duration threshold. In this way, a layer with a higher fluency requirement is processed in a higher performance mode while the user is operating it, in a lower performance mode while the user briefly pauses the operation, and in a still lower performance mode when the pause lasts longer, so that the power consumption of the terminal device is reduced while fluency is maintained.
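The mode transitions summarized above behave like a small state machine. The Python sketch below uses the embodiment's mode names; the single duration threshold and default values are illustrative assumptions:

```python
class PerformanceModeController:
    """State machine for the mode transitions summarized above. Mode names
    follow the embodiment; the duration threshold is illustrative."""

    def __init__(self, threshold_ms=5000):
        self.mode = "first"  # assume a low-power default before startup
        self.threshold_ms = threshold_ms

    def on_app_start(self, window_id_has_preset):
        # Startup: a window identifier containing the preset identifier
        # moves the device into the second performance mode.
        if window_id_has_preset:
            self.mode = "second"
        return self.mode

    def on_stylus_touch(self):
        # The white-list application layer is being touched by the stylus.
        self.mode = "third"
        return self.mode

    def on_stylus_idle(self, idle_ms):
        # The stylus has been off the touch screen for idle_ms milliseconds.
        self.mode = "first" if idle_ms < self.threshold_ms else "second"
        return self.mode
```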
The embodiment of the application provides a terminal device, and the structure of the terminal device is shown in fig. 1. The memory of the terminal device may be configured to store at least one program instruction, and the processor is configured to execute the at least one program instruction to implement the technical solutions of the above-mentioned method embodiments. The implementation principle and technical effect are similar to those of the embodiments related to the method, and are not described herein again.
The embodiment of the application provides a chip. The chip comprises a processor for calling a computer program in a memory to execute the technical solution in the above embodiments. The principle and technical effects are similar to those of the related embodiments, and are not described herein again.
The embodiment of the present application provides a computer program product, which, when running on a terminal device, enables the terminal device to execute the technical solutions in the above embodiments. The principle and technical effects are similar to those of the related embodiments, and are not described herein again.
The embodiment of the application provides a computer-readable storage medium, on which program instructions are stored. When the program instructions are executed by the terminal device, the terminal device is caused to execute the technical solutions of the above embodiments. The implementation principle and technical effects are similar to those of the related embodiments, and are not described herein again.
The above embodiments are only for illustrating the embodiments of the present invention and are not to be construed as limiting the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made on the basis of the embodiments of the present invention shall be included in the scope of the present invention.

Claims (14)

1. A frame processing method is applied to a terminal device, and the method comprises the following steps:
the terminal equipment identifies the identifier of the applied layer;
the terminal device performs layer processing, in a first performance mode, on a layer without a preset identifier; the layer processing comprises at least one of: measurement and layout, drawing, rendering, or composition;
the terminal device performs the layer processing, in a second performance mode, on a layer with the preset identifier; a hardware operating frequency of the terminal device in the first performance mode is less than that in the second performance mode; the hardware operating frequency comprises at least one of a frequency of a graphics processor, a frequency of a central processing unit, or a frequency of a double data rate memory.
2. The frame processing method according to claim 1, wherein before the terminal device identifies the identifier of the applied layer, the method further comprises:
the terminal equipment receives a first user operation; the first user operation is to open the application, or the first user operation is to switch to the application; the terminal equipment runs in the first performance mode when receiving the first user operation;
responding to the first user operation, and starting the application in a third performance mode by the terminal equipment under the condition that the terminal equipment identifies that the window identifier corresponding to the application comprises the preset identifier; the hardware operating frequency of the terminal device in the third performance mode is less than the hardware operating frequency of the terminal device in the second performance mode, and the hardware operating frequency of the terminal device in the third performance mode is greater than the hardware operating frequency of the terminal device in the first performance mode.
3. The frame processing method according to claim 2, wherein the terminal device comprises a resource scheduling module; a display synthesis process runs in the terminal equipment; before the terminal device starts the application in the third performance mode, the method further includes:
the resource scheduling module acquires a first touch instruction corresponding to the first user operation;
the resource scheduling module sends a first indication message to the display composition process when determining that the first touch instruction is used for opening the application or that the first touch instruction is used for switching to the application; the first indication message is used for indicating the display composition process to perform frame composition under the condition that rendering of the layer corresponding to the application is completed;
the resource scheduling module schedules resources according to a first resource scheduling strategy; the first resource scheduling policy is used for switching the terminal device to the third performance mode.
4. The frame processing method according to claim 2, further comprising:
the terminal equipment receives a second user operation in the process of running the application in the third performance mode;
when the second user operation acts on a first interface and the first interface comprises a layer with the preset identifier, the terminal equipment is switched to the second performance mode;
the terminal equipment performs the layer processing on the layer with the preset identifier by adopting a second performance mode, and the method comprises the following steps:
the terminal device performs measurement and layout, drawing, and rendering on the layer with the preset identifier in the second performance mode; and the terminal device synthesizes the layer with the preset identifier in response to completion of the rendering.
5. The frame processing method according to claim 4, wherein the second user operation is the latest user operation acquired by the terminal device; the method further comprises the following steps:
and the terminal equipment determines that the time difference between the timestamp of the second user operation and the system clock of the terminal equipment is greater than or equal to a first time threshold and is less than a second time threshold, and switches the terminal equipment from the second performance mode to the third performance mode.
6. The frame processing method according to claim 5, wherein the second user operation is the latest user operation acquired by the terminal device; the method further comprises the following steps:
the terminal device switches from the third performance mode to a fourth performance mode when determining that the time difference between the timestamp of the second user operation and the system clock of the terminal device is greater than the second time threshold; the hardware operating frequency of the terminal device in the fourth performance mode is less than the hardware operating frequency of the terminal device in the third performance mode.
7. The frame processing method according to claim 3, further comprising:
the terminal equipment receives a second user operation in the process of running the application in the third performance mode;
when the second user operation acts on a first interface and the first interface comprises a layer with the preset identifier, the terminal equipment is switched to the second performance mode;
the terminal equipment performs the layer processing on the layer with the preset identifier by adopting a second performance mode, and the method comprises the following steps:
the terminal device performs measurement and layout, drawing, and rendering on the layer with the preset identifier in the second performance mode; and the terminal device synthesizes the layer with the preset identifier in response to completion of the rendering.
8. The frame processing method according to any one of claims 1 to 7, characterized in that the method further comprises:
the terminal equipment receives a third user operation, and the third user operation is used for displaying a second interface; under the condition that the layer corresponding to the second interface does not include the layer with the preset identifier, the terminal equipment responds to the third user operation, and the terminal equipment is switched to the first performance mode;
the terminal equipment carries out layer processing on the layer without the preset identifier by adopting a first performance mode, and the method comprises the following steps:
and the terminal equipment performs the layer processing on the layer corresponding to the second interface in the first performance mode.
9. The frame processing method according to claim 8, wherein the terminal device comprises a resource scheduling module; a main thread, a rendering thread and a display synthesis process run in the terminal equipment; the terminal device responds to the third user operation, and the terminal device switches to the first performance mode, and the method comprises the following steps:
the resource scheduling module acquires a second touch instruction corresponding to the third user operation;
the resource scheduling module sends a second indication message to the display composition process when determining that the second touch instruction is used for displaying a second interface, wherein the second indication message is used for indicating the display composition process to perform layer composition according to a vertical synchronization signal;
the resource scheduling module schedules resources according to a second resource scheduling strategy; the second resource scheduling policy is used for switching the terminal device to the first performance mode;
the performing, by the terminal device, the layer processing on the layer corresponding to the second interface in the first performance mode comprises: the main thread performs measurement and layout on the layer corresponding to the second interface when a vertical synchronization signal arrives; the rendering thread draws and renders the layer corresponding to the second interface when the next vertical synchronization signal arrives; and the display composition process synthesizes the layer corresponding to the second interface when another vertical synchronization signal arrives.
10. The frame processing method according to claim 3, wherein the terminal device further comprises a hardware scheduling module; the resource scheduling module schedules resources according to a first resource scheduling policy, including:
the resource scheduling module acquires a target hardware parameter corresponding to the first resource scheduling strategy according to a target corresponding relation; the target corresponding relation is the corresponding relation between a resource scheduling strategy and a hardware parameter;
the resource scheduling module sends the target hardware parameter to the hardware scheduling module corresponding to the target hardware parameter;
and the hardware scheduling module sets the hardware parameter corresponding to the target hardware parameter as the target hardware parameter.
11. The frame processing method according to claim 10, wherein the target hardware parameter further includes a preset duration; the method further comprises the following steps:
and when the hardware scheduling module receives no new hardware parameter within the preset duration, the hardware scheduling module sets the hardware parameter corresponding to the target hardware parameter to the hardware parameter in the first performance mode.
12. A terminal device, characterized in that it comprises a processor for invoking a computer program in a memory for performing the frame processing method according to any one of claims 1-11.
13. A computer-readable storage medium storing computer instructions which, when run on a terminal device, cause the terminal device to perform the frame processing method according to any one of claims 1-11.
14. A chip, characterized in that it comprises a processor for calling a computer program in a memory to perform the frame processing method according to any one of claims 1-11.
CN202111647157.XA 2021-12-31 2021-12-31 Frame processing method, device and storage medium Active CN113986002B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111647157.XA CN113986002B (en) 2021-12-31 2021-12-31 Frame processing method, device and storage medium


Publications (2)

Publication Number Publication Date
CN113986002A true CN113986002A (en) 2022-01-28
CN113986002B CN113986002B (en) 2022-06-17

Family

ID=79734952

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111647157.XA Active CN113986002B (en) 2021-12-31 2021-12-31 Frame processing method, device and storage medium

Country Status (1)

Country Link
CN (1) CN113986002B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116089057A (en) * 2022-08-26 2023-05-09 荣耀终端有限公司 Resource scheduling method, device, storage medium and program product
CN116700915A (en) * 2022-12-23 2023-09-05 荣耀终端有限公司 Resource scheduling method and device
WO2023174322A1 (en) * 2022-03-17 2023-09-21 华为技术有限公司 Layer processing method and electronic device
WO2023216146A1 (en) * 2022-05-11 2023-11-16 北京小米移动软件有限公司 Display image updating method and apparatus and storage medium
CN117130462A (en) * 2023-03-20 2023-11-28 荣耀终端有限公司 Equipment control method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104469478A (en) * 2014-12-31 2015-03-25 中科创达软件股份有限公司 Information processing method and device and electronic devices
CN104978723A (en) * 2014-04-14 2015-10-14 阿里巴巴集团控股有限公司 Image processing method and device
CN108491275A (en) * 2018-03-13 2018-09-04 广东欧珀移动通信有限公司 program optimization method, device, terminal and storage medium
CN111724293A (en) * 2019-03-22 2020-09-29 华为技术有限公司 Image rendering method and device and electronic equipment



Also Published As

Publication number Publication date
CN113986002B (en) 2022-06-17


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant