WO2021052070A1 - Frame rate identification method and electronic device - Google Patents

Frame rate identification method and electronic device

Info

Publication number
WO2021052070A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame rate
application
target frame
images
rendering
Prior art date
Application number
PCT/CN2020/108714
Other languages
English (en)
Chinese (zh)
Inventor
李宗峰
王绪
周未来
丁少文
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2021052070A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/20 Processor architectures; Processor configuration, e.g. pipelining
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44004 Processing of video elementary streams involving video buffer management, e.g. video decoder buffer or video display buffer
    • H04N21/44008 Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H04N21/4402 Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281 Processing of video elementary streams involving reformatting operations of video signals by altering the temporal resolution, e.g. by frame skipping

Definitions

  • This application relates to the technical field of electronic equipment, and in particular to a frame rate identification method and electronic equipment.
  • The frame rate is the frequency at which bitmap images, each called a frame, successively appear on the display, and can be expressed in frames per second (fps).
  • Generally, a game application allows the user to set the frame rate. For example, a game application may have a high frame rate mode and a normal frame rate mode, where the frame rate corresponding to the high frame rate mode is 60 fps and the frame rate corresponding to the normal frame rate mode is 30 fps.
  • If the user selects the high frame rate mode of the game application, the frame rate is set to 60 fps when the game application runs on the electronic device. If the user selects the normal frame rate mode, the frame rate is set to 30 fps.
  • In order to balance user experience and power consumption, a frame stabilization scheme is proposed.
  • In the frame stabilization scheme, the operating system needs to adjust the operating frequencies of the central processing unit (CPU), the graphics processing unit (GPU), and other components based on the set frame rate of the application currently running on the electronic device, so as to provide just enough performance supply (also called computing resources).
  • In view of this, the embodiments of the present application provide a frame rate identification method and an electronic device, which can quickly determine a frame rate equal to or close to the set frame rate of a third-party application without relying on the third party, and perform frame stabilization accordingly.
  • In a first aspect, an embodiment of the present application provides a frame rate identification method applied to an electronic device. The method includes: stabilizing the frames of the image drawing and rendering performed by a first application according to the current target frame rate;
  • determining the frame lengths of N consecutive images according to the receiving moments at which the drawing and rendering results of the first application are received, where the N consecutive images are images drawn and rendered by the first application;
  • determining the measured frame rate according to the frame lengths of the N consecutive images;
  • determining a new target frame rate according to the measured frame rate and the current target frame rate, so as to stabilize the frames of the image drawing and rendering performed by the first application according to the new target frame rate.
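To make the four steps concrete, here is a minimal Python sketch of the loop. It is illustrative only: the function names, the placement of the 1000 ms constant, and the simplistic update rule are assumptions rather than the patent's method (the possible implementations below refine the last two steps).

```python
# Minimal sketch of the four-step flow (hypothetical names and rules).

def frame_lengths(receive_times_ms):
    # Frame length of an image = interval between its receive time and
    # the receive time of the previous image.
    return [b - a for a, b in zip(receive_times_ms, receive_times_ms[1:])]

def measured_fps(lengths_ms):
    # 1000 ms divided by the average frame length of the N images.
    return 1000.0 / (sum(lengths_ms) / len(lengths_ms))

def identification_step(receive_times_ms, current_target_fps):
    m = measured_fps(frame_lengths(receive_times_ms))
    # Placeholder update rule: adopt the measurement when it clearly
    # disagrees with the current target; the claimed rules are richer.
    return m if abs(m - current_target_fps) > 1.0 else current_target_fps

# Five images received ~33.3 ms apart measure as ~30 fps.
print(identification_step([0.0, 33.4, 66.7, 100.1, 133.4], 60))
```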
  • In a possible implementation, determining the measured frame rate according to the frame lengths of the N consecutive images includes: determining a first interval from a plurality of preset frame rate intervals according to the average frame length of the N consecutive images, and determining the measured frame rate according to the first interval.
  • In a possible implementation, determining the measured frame rate according to the first interval includes: using the upper limit frame rate of the first interval as the measured frame rate.
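A small sketch of this interval lookup follows, under assumed interval bounds (the patent text here does not give concrete preset intervals):

```python
# Hypothetical preset frame rate intervals (low, high], in fps.
PRESET_INTERVALS = [(0, 20), (20, 25), (25, 30), (30, 40), (40, 60), (60, 90), (90, 120)]

def measured_fps_from_avg_len(avg_frame_len_ms):
    raw = 1000.0 / avg_frame_len_ms
    for low, high in PRESET_INTERVALS:
        if low < raw <= high:
            return high  # upper limit frame rate of the first interval
    return raw  # fell outside all preset intervals

print(measured_fps_from_avg_len(17.2))  # ~58.1 fps raw, snaps to 60
```

Snapping to the interval's upper limit tolerates jitter in the receive times: an application targeting 60 fps that briefly delivers 58 fps is still classified as a 60 fps application.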
  • In a possible implementation, determining the new target frame rate based on the measured frame rate and the current target frame rate includes: when the measured frame rate is equal to the current target frame rate and the number of first-type images in the N consecutive images is greater than a first threshold, using a first frame rate, obtained by adding a first preset frame rate to the current target frame rate, as the new target frame rate; wherein the frame rate corresponding to a first-type image is greater than a second frame rate, and the second frame rate is obtained by adding a second preset frame rate to the current target frame rate.
  • In a possible implementation, determining the new target frame rate based on the measured frame rate and the current target frame rate includes: when the measured frame rate is equal to the current target frame rate, the number of first-type images in the N consecutive images is greater than the first threshold, and the number of first-type images is greater than the number of second-type images in the N consecutive images, using the first frame rate, obtained by adding the first preset frame rate to the current target frame rate, as the new target frame rate; wherein the frame rate corresponding to a first-type image is greater than the second frame rate, the second frame rate is obtained by adding the second preset frame rate to the current target frame rate, the frame rate corresponding to a second-type image is less than a third frame rate, and the third frame rate is obtained by subtracting a third preset frame rate from the current target frame rate.
  • In a possible implementation, determining the new target frame rate based on the measured frame rate and the current target frame rate includes: when the measured frame rate is greater than the current target frame rate, using the measured frame rate as the new target frame rate.
  • In a possible implementation, determining the new target frame rate based on the measured frame rate and the current target frame rate includes: using the larger of the measured frame rate and the first frame rate as the new target frame rate, where the first frame rate is obtained by adding the first preset frame rate to the current target frame rate.
  • In a possible implementation, determining the new target frame rate based on the measured frame rate and the current target frame rate includes: when the measured frame rate is less than the current target frame rate, using the measured frame rate as the new target frame rate.
  • In a possible implementation, determining the new target frame rate based on the measured frame rate and the current target frame rate includes: when the measured frame rate is less than the current target frame rate, and the average sleep time of the rendering thread while the first application is drawing and rendering the N consecutive images is greater than a second threshold, using the measured frame rate as the new target frame rate.
  • In a possible implementation, determining the new target frame rate based on the measured frame rate and the current target frame rate includes: when the measured frame rate is less than the current target frame rate, the average sleep time of the rendering thread while the first application is drawing and rendering the N consecutive images is greater than the second threshold, and the number of second-type images in the N consecutive images is greater than a third threshold, using the measured frame rate as the new target frame rate; wherein the frame rate corresponding to a second-type image is less than the third frame rate, and the third frame rate is obtained by subtracting the third preset frame rate from the current target frame rate.
  • In a possible implementation, the measured frame rate includes at least two measured frame rates; determining the new target frame rate includes: when the measured frame rates are less than the current target frame rate and the at least two measured frame rates are equal, using the measured frame rate as the new target frame rate.
  • In a possible implementation, the measured frame rate includes at least two measured frame rates; determining the new target frame rate includes: when the measured frame rates are less than the current target frame rate, the at least two measured frame rates are equal, and the number of second-type images in the N consecutive images is greater than a fourth threshold, using the measured frame rate as the new target frame rate; wherein the frame rate corresponding to a second-type image is less than the third frame rate, and the third frame rate is obtained by subtracting the third preset frame rate from the current target frame rate.
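The bullets above enumerate variants of one decision procedure. The following condensed sketch shows their common shape; all thresholds, offsets, and the particular combination of conditions are assumptions for illustration, since an implementation would pick one of the claimed variants rather than all of them at once.

```python
# Condensed, hypothetical combination of the claimed update rules.
FIRST_PRESET, SECOND_PRESET, THIRD_PRESET = 30, 5, 5      # fps offsets (assumed)
FIRST_THRESH, SECOND_THRESH, THIRD_THRESH = 10, 2.0, 10   # counts / ms (assumed)

def new_target_fps(measured, target, n_first_type, n_second_type, avg_sleep_ms):
    # first-type image: per-image frame rate > target + SECOND_PRESET
    # second-type image: per-image frame rate < target - THIRD_PRESET
    if (measured == target and n_first_type > FIRST_THRESH
            and n_first_type > n_second_type):
        # Many "fast" frames at a matched target: probe a higher target.
        return target + FIRST_PRESET
    if measured > target:
        return measured
    if (measured < target and avg_sleep_ms > SECOND_THRESH
            and n_second_type > THIRD_THRESH):
        # The rendering thread sleeps a lot, so the lower rate is the
        # application's own choice, not a performance shortfall.
        return measured
    return target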
  • In another aspect, an embodiment of the present application provides a frame rate identification apparatus, the apparatus including:
  • a frame stabilization unit, configured to stabilize the frames of the image drawing and rendering performed by the first application according to the current target frame rate;
  • a first determining unit, configured to determine the frame lengths of N consecutive images according to the receiving moments at which the drawing and rendering results of the first application are received, where the N consecutive images are images drawn and rendered by the first application;
  • a second determining unit, configured to determine the measured frame rate according to the frame lengths of the N consecutive images; and
  • a third determining unit, configured to determine a new target frame rate according to the measured frame rate and the current target frame rate, so as to stabilize the frames of the image drawing and rendering performed by the first application according to the new target frame rate.
  • In a possible implementation, the second determining unit is further configured to determine a first interval from a plurality of preset frame rate intervals according to the average frame length of the N consecutive images, and to determine the measured frame rate according to the first interval.
  • In a possible implementation, the second determining unit is further configured to use the upper limit frame rate of the first interval as the measured frame rate.
  • In a possible implementation, the third determining unit is further configured to: when the measured frame rate is equal to the current target frame rate and the number of first-type images in the N consecutive images is greater than the first threshold, use the first frame rate, obtained by adding the first preset frame rate to the current target frame rate, as the new target frame rate; wherein the frame rate corresponding to a first-type image is greater than the second frame rate, and the second frame rate is obtained by adding the second preset frame rate to the current target frame rate.
  • In a possible implementation, the third determining unit is further configured to: when the measured frame rate is equal to the current target frame rate, the number of first-type images in the N consecutive images is greater than the first threshold, and the number of first-type images is greater than the number of second-type images in the N consecutive images, use the first frame rate, obtained by adding the first preset frame rate to the current target frame rate, as the new target frame rate; wherein the frame rate corresponding to a first-type image is greater than the second frame rate, the second frame rate is obtained by adding the second preset frame rate to the current target frame rate, the frame rate corresponding to a second-type image is less than the third frame rate, and the third frame rate is obtained by subtracting the third preset frame rate from the current target frame rate.
  • In a possible implementation, the third determining unit is further configured to: when the measured frame rate is greater than the current target frame rate, use the measured frame rate as the new target frame rate.
  • In a possible implementation, the third determining unit is further configured to use the larger of the measured frame rate and the first frame rate as the new target frame rate, where the first frame rate is obtained by adding the first preset frame rate to the current target frame rate.
  • In a possible implementation, the third determining unit is further configured to: when the measured frame rate is less than the current target frame rate, use the measured frame rate as the new target frame rate.
  • In a possible implementation, the third determining unit is further configured to: when the measured frame rate is less than the current target frame rate and the average sleep time of the rendering thread while the first application is drawing and rendering the N consecutive images is greater than the second threshold, use the measured frame rate as the new target frame rate.
  • In a possible implementation, the third determining unit is further configured to: when the measured frame rate is less than the current target frame rate, the average sleep time of the rendering thread while the first application is drawing and rendering the N consecutive images is greater than the second threshold, and the number of second-type images in the N consecutive images is greater than the third threshold, use the measured frame rate as the new target frame rate; wherein the frame rate corresponding to a second-type image is less than the third frame rate, and the third frame rate is obtained by subtracting the third preset frame rate from the current target frame rate.
  • In a possible implementation, the measured frame rate includes at least two measured frame rates; the third determining unit is further configured to: when the measured frame rates are less than the current target frame rate and the at least two measured frame rates are equal, use the measured frame rate as the new target frame rate.
  • In a possible implementation, the measured frame rate includes at least two measured frame rates; the third determining unit is further configured to: when the measured frame rates are less than the current target frame rate, the at least two measured frame rates are equal, and the number of second-type images in the N consecutive images is greater than the fourth threshold, use the measured frame rate as the new target frame rate; wherein the frame rate corresponding to a second-type image is less than the third frame rate, and the third frame rate is obtained by subtracting the third preset frame rate from the current target frame rate.
  • An embodiment of the present application further provides an electronic device, including a processor and a memory, where the memory is used to store computer-executable instructions; when the electronic device runs, the processor executes the computer-executable instructions stored in the memory, so that the electronic device performs the method described in the first aspect.
  • An embodiment of the present application further provides a computer storage medium, the computer storage medium including computer instructions that, when run on an electronic device, cause the electronic device to perform the method described in the first aspect.
  • Embodiments of the present application further provide a computer program product; when the program code included in the computer program product is executed by a processor in an electronic device, the method described in the first aspect is implemented.
  • According to the frame rate identification method provided by the embodiments of the present application, a frame rate equal to or close to the set frame rate of an application can be quickly identified even when the operating system does not know, or does not accurately know, the set frame rate of the application.
  • The identified frame rate can be used as the target frame rate for frame stabilization by the operating system, so as to improve the user experience of the application and avoid or reduce unnecessary power consumption.
  • FIG. 1 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the application.
  • FIG. 2 is a block diagram of the software structure of an electronic device provided by an embodiment of the application.
  • FIG. 3 is a schematic diagram of a module in a system library receiving an image from an application according to an embodiment of the application.
  • FIG. 4 is a schematic block diagram of a frame rate identification device provided by an embodiment of this application.
  • FIG. 5 is a block diagram of the software structure of an electronic device provided by an embodiment of the application.
  • FIG. 6 is a flowchart of a frame rate identification method provided by an embodiment of the application.
  • FIG. 7 is a flowchart of a frame rate identification method provided by an embodiment of the application.
  • FIG. 8 is a schematic block diagram of a frame rate identification device provided by an embodiment of the application.
  • FIG. 9 is a schematic block diagram of an electronic device according to an embodiment of the application.
  • The terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include one or more of such features.
  • In the embodiments of the present application, unless otherwise specifically emphasized, the terms "comprising", "including", and "having" and their variants all mean "including but not limited to".
  • the operating system of the electronic device can provide computing resources for all programs currently running on the electronic device according to the target frame rate of the application.
  • the application may be a game application, or a video playback application, etc., which will not be listed here.
  • the application can be a third-party application.
  • For example, the user can download and install the third-party application from an application market, and the user can also uninstall the third-party application.
  • Computing resources can represent computing power: the more computing resources, the stronger the computing power. It is easy to understand that the operating frequencies of the CPU, the GPU, double data rate synchronous dynamic random access memory (DDR), and the like determine the computing capability of an electronic device, and the amount of computing resources is positively related to these operating frequencies.
  • In some embodiments, the operating system of the electronic device may determine the frame length of an image according to the receiving moment at which the image drawn and rendered by application A is received.
  • The image may also be referred to as the drawing and rendering result of application A.
  • The frame length of an image may be the time interval between the receiving moment of the image and the receiving moment of the previous image, where the previous image is the drawing and rendering result of application A that the operating system received most recently before receiving the image.
  • The operating system can determine the frame rate of the image according to its frame length. Specifically, 1000 milliseconds can be divided by the frame length of the image to obtain the measured frame rate of the image.
  • The measured frame rate of an image can indicate or reflect how quickly the operating system receives the drawing and rendering results (images) from application A.
  • The operating system can adjust the computing resources according to the relationship between the measured frame rate of the image and the target frame rate. Specifically, the operating frequencies of the CPU, GPU, DDR, and the like can be adjusted so that the provided computing resources stabilize the actual frame rate of the images drawn and rendered by application A at about the target frame rate, thereby improving user experience and reducing or avoiding unnecessary power consumption overhead.
  • The above process can be referred to as stabilizing the frames of the image drawing and rendering performed by application A, where the target frame rate can be referred to as the target frame rate for frame stabilization.
  • For example, at a first moment, the operating system receives, from application A, an image C drawn and rendered by application A.
  • At a second moment, the operating system receives, from application A, an image D drawn and rendered by application A.
  • Image D is the next image after image C and is adjacent to image C; that is, after the operating system receives image C from application A, the next image it receives from application A is image D.
  • The time interval between the second moment and the first moment may be referred to as the frame length of image D.
  • Dividing 1000 milliseconds by the frame length of image D gives the measured frame rate of image D.
  • If the measured frame rate of image D is less than the target frame rate for frame stabilization, the operating system increases the computing resources it provides; if the measured frame rate of image D is greater than the target frame rate, the operating system lowers the computing resources it provides.
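Plugging hypothetical numbers into this example (the timestamps are assumptions, not from the patent):

```python
t_first, t_second = 100.0, 116.7          # receive times of C and D, in ms
frame_len_d = t_second - t_first          # frame length of image D: 16.7 ms
measured_fps_d = 1000.0 / frame_len_d     # ~59.9 fps
```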
  • In some embodiments, the operating system may determine, according to the frame length of each image, the average frame length of multiple consecutive images received from application A, where the multiple consecutive images are images drawn and rendered by application A.
  • For the frame length of each image, refer to the introduction in the previous embodiment, which will not be repeated here.
  • Dividing 1000 milliseconds by the average frame length gives the measured frame rate of the multiple consecutive images. If the measured frame rate of the multiple consecutive images is less than the target frame rate for frame stabilization, the operating system increases the computing resources it provides; if it is greater than the target frame rate, the operating system reduces the computing resources it provides.
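A sketch of this averaged variant and the resulting adjustment direction; the window size and the return convention are assumptions:

```python
WINDOW = 30  # number of consecutive images averaged (assumed)

def adjust_direction(frame_lengths_ms, target_fps):
    recent = frame_lengths_ms[-WINDOW:]
    measured = 1000.0 / (sum(recent) / len(recent))
    if measured < target_fps:
        return +1  # raise the provided computing resources
    if measured > target_fps:
        return -1  # lower the provided computing resources
    return 0

print(adjust_direction([18.0] * 40, 50))  # ~55.6 fps > 50 fps, returns -1
```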
  • In some embodiments, the operating system can divide the computing resources that the electronic device can provide into multiple levels, where the computing power corresponding to any level in the multiple levels is smaller than the computing power corresponding to the previous level.
  • For example, the operating frequencies of the CPU, GPU, DDR, and the like can be graded to obtain different levels of computing resources.
  • For example, a computing resource of level A can correspond to a CPU operating frequency of 12000 MHz, a GPU operating frequency of 500 MHz, and a DDR operating frequency of 1066 MHz; a computing resource of level B to a CPU operating frequency of 17000 MHz, a GPU operating frequency of 600 MHz, and a DDR operating frequency of 1333 MHz; a computing resource of level C to a CPU operating frequency of 22000 MHz, a GPU operating frequency of 700 MHz, and a DDR operating frequency of 1600 MHz; and so on, which will not be listed here.
  • When the operating system lowers or raises the computing resources, it can adjust one level at a time, or multiple levels at a time. For example, if the difference between the measured frame rate of the image and the target frame rate for frame stabilization is small, one level can be adjusted; if the difference is large, multiple levels can be adjusted.
  • This embodiment only exemplifies the grading and adjustment of computing resources and does not constitute a limitation; developers can grade computing resources and set specific adjustment rules based on experience or experiments.
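The level examples above can be encoded as a simple table; the step rule below (one level for a small frame rate gap, two for a large one) is an assumption consistent with the text rather than a rule the patent specifies:

```python
# Frequencies in MHz, as given in the text above.
LEVELS = [
    ("A", {"cpu": 12000, "gpu": 500, "ddr": 1066}),
    ("B", {"cpu": 17000, "gpu": 600, "ddr": 1333}),
    ("C", {"cpu": 22000, "gpu": 700, "ddr": 1600}),
]

def next_level_index(current, direction, measured, target, small_gap_fps=5):
    # Small measured/target gap: move one level; large gap: move two.
    step = 1 if abs(measured - target) <= small_gap_fps else 2
    return max(0, min(len(LEVELS) - 1, current + direction * step))
```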
  • The target frame rate for frame stabilization is an important basis on which the frame stabilization scheme adjusts computing resources. Therefore, it is very important for the frame stabilization scheme that the operating system can quickly and accurately obtain the target frame rate.
  • As mentioned above, the user can set the frame rate of an application, for example through the frame rate setting interface of the application, in the hope that the electronic device displays the application's images at the frame rate set by the user.
  • The frame rate of the application set by the user may be referred to as the set frame rate. If the target frame rate used to stabilize the frames of the application's image drawing and rendering is inconsistent with the set frame rate, it will be difficult for the electronic device to display the images of the application at the set frame rate, resulting in poor user experience.
  • In view of this, an embodiment of the present application provides a frame rate identification method.
  • With this method, the measured frame rate of multiple consecutive images drawn and rendered by a first application can be determined, and a new target frame rate equal to or close to the set frame rate can be determined according to the measured frame rate and the current target frame rate, so that the frames of the image drawing and rendering performed by the first application can be stabilized according to the new target frame rate.
  • In this way, the set frame rate of the application can be quickly and accurately identified; in particular, when the application is a third-party application, the set frame rate can be identified without relying on the third party, and frame stabilization can be performed accordingly to improve user experience.
  • The frame rate identification method of the embodiments of the present application can be applied to an electronic device.
  • The electronic device may be a portable electronic device such as a mobile phone, a tablet computer, a digital camera, a personal digital assistant (PDA), a wearable device, or a laptop computer.
  • Exemplary portable electronic devices include, but are not limited to, portable electronic devices equipped with various operating systems.
  • The aforementioned portable electronic device may also be another type of portable electronic device, such as a laptop with a touch-sensitive surface (such as a touch panel). It should also be understood that, in some other embodiments of the present application, the electronic device may not be a portable electronic device but a desktop computer with a touch-sensitive surface (such as a touch panel).
  • the embodiment of the present application does not specifically limit the type of electronic device.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • The processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • The memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
  • the I2C interface is a bidirectional synchronous serial bus, which includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may couple the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through an I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with the display screen 194, the camera 193 and other peripheral devices.
  • the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and so on.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate through a DSI interface to realize the display function of the electronic device 100.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect earphones and play audio through earphones. This interface can also be used to connect to other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is merely a schematic description, and does not constitute a structural limitation of the electronic device 100.
  • the electronic device 100 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the electronic device 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 can be used to cover one or more communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • For example, antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After the low-frequency baseband signal is processed by the baseband processor, it is passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 may also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display screen 194 includes a display panel.
  • The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194, and N is a positive integer greater than one.
  • the electronic device 100 can realize a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing and transforms it into an image visible to the naked eye.
  • The ISP can also optimize the noise, brightness, and skin color of the image, and can also optimize parameters such as the exposure and color temperature of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include one or N cameras 193, and N is a positive integer greater than one.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • The NPU is a neural-network (NN) computing processor.
  • With the NPU, applications such as intelligent cognition of the electronic device 100 can be realized, for example image recognition, face recognition, speech recognition, and text understanding.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • The storage program area can store an operating system and at least one application program required by at least one function (such as a sound playback function or an image playback function).
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • The speaker 170A, also called the "horn", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • The receiver 170B, also called the "earpiece", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 answers a call or voice message, it can receive the voice by bringing the receiver 170B close to the human ear.
  • The microphone 170C, also called the "mic", is used to convert sound signals into electrical signals.
  • When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • the earphone interface 170D may be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • OMTP open mobile terminal platform
  • CTIA cellular telecommunications industry association of the USA
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
  • A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch position but have different touch operation strengths may correspond to different operation instructions. For example, when a touch operation whose intensity of the touch operation is less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
  • the gyro sensor 180B may be used to determine the movement posture of the electronic device 100.
  • In some embodiments, the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • When the electronic device 100 is a flip phone, the electronic device 100 can detect the opening and closing of the flip according to the magnetic sensor 180D.
  • Then, according to the detected opening/closing state of the holster or of the flip, features such as automatic unlocking of the flip cover can be set.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally along three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. The sensor can also be used to identify the posture of the electronic device, and can be applied to applications such as landscape/portrait switching and pedometers.
  • the distance sensor 180F is used to measure distance; the electronic device 100 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 can determine that there is no object near the electronic device 100.
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived brightness of the ambient light.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature.
  • In still other embodiments, when the temperature is lower than yet another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
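  • The temperature processing strategy above can be sketched as a chain of threshold checks; the thresholds and method bodies below are illustrative assumptions, not values from this application:

      // Hypothetical sketch of the temperature processing strategy.
      final class ThermalPolicy {
          private static final float HIGH_TEMP_C     = 45.0f;  // assumed throttling threshold
          private static final float LOW_TEMP_C      = 0.0f;   // assumed battery-heating threshold
          private static final float VERY_LOW_TEMP_C = -10.0f; // assumed voltage-boost threshold

          void onTemperatureReported(float celsius) {
              if (celsius > HIGH_TEMP_C) {
                  reduceNearbyProcessorPerformance(); // reduce power consumption, thermal protection
              } else if (celsius < VERY_LOW_TEMP_C) {
                  boostBatteryOutputVoltage();        // avoid abnormal low-temperature shutdown
              } else if (celsius < LOW_TEMP_C) {
                  heatBattery();                      // avoid abnormal low-temperature shutdown
              }
          }

          private void reduceNearbyProcessorPerformance() { /* platform-specific */ }
          private void heatBattery()                      { /* platform-specific */ }
          private void boostBatteryOutputVoltage()        { /* platform-specific */ }
      }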
  • The touch sensor 180K is also called a “touch device”.
  • the touch sensor 180K may be disposed on the display screen 194, and the touch screen is composed of the touch sensor 180K and the display screen 194, which is also called a “touch screen”.
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, which is different from the position of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can obtain the vibration signal of the bone mass that vibrates when a person speaks.
  • the bone conduction sensor 180M can also contact the human pulse and receive blood pressure beating signals.
  • the bone conduction sensor 180M may also be provided in an earphone to form a bone conduction earphone.
  • the audio module 170 can parse the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 180M, and realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 180M, and realize the heart rate detection function.
  • the button 190 includes a power-on button, a volume button, and so on.
  • the button 190 may be a mechanical button. It can also be a touch button.
  • the electronic device 100 may receive key input, and generate key signal input related to user settings and function control of the electronic device 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations that act on different applications can correspond to different vibration feedback effects.
  • For touch operations acting on different areas of the display screen 194, the motor 191 can also produce different vibration feedback effects.
  • Different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the electronic device 100.
  • the electronic device 100 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 may also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of this application take the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
  • FIG. 2 is a block diagram of the software structure of the electronic device 100 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, music, video, short message, and games.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a window manager, a content provider, a view system, a phone manager, a resource manager, and a notification manager.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include videos, images, audios, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, and so on.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface that includes a short message notification icon may include a view that displays text and a view that displays pictures.
  • the phone manager is used to provide the communication function of the electronic device 100. For example, the management of the call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and it can disappear automatically after a short stay without user interaction.
  • For example, the notification manager is used for notifications of download completion, message reminders, and so on.
  • the notification manager can also be a notification that appears in the status bar at the top of the system in the form of a chart or a scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text messages are prompted in the status bar, prompt sounds, electronic devices vibrate, and indicator lights flash.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), frame rate recognizer, media library (Media Libraries), 3D graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • the surface manager can provide buffer queue and surfaceflinger.
  • the buffer queue, the surfaceflinger, and the applications in the application layer that need to perform dynamic picture drawing and rendering form a graphics producer-consumer model.
  • the application program that needs to perform dynamic picture drawing and rendering is the producer, and the surfaceflinger is the consumer.
  • the buffer queue may include multiple buffers, and the buffers may be used as a carrier for image transmission.
  • When the application needs to draw a rendered image, it calls a buffer in an idle state in the buffer queue and draws the rendered image in the called buffer.
  • the process of the application calling the buffer can be called the dequeue of the buffer.
  • After the application finishes drawing the rendered image, the buffer can be handed back to the buffer queue.
  • the handover process can be referred to as the enqueue of the buffer.
  • the buffer queue can receive the rendered image drawn by the application from the application.
  • the consumer, namely the surfaceflinger, can obtain a buffer containing a rendered image from the buffer queue and use the image for image merging, for example, merging the image with the status bar.
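  • The producer-consumer model formed by the application, the buffer queue, and the surfaceflinger can be sketched as follows; this is a simplified illustration under assumed names, not Android's actual BufferQueue API:

      import java.util.concurrent.ArrayBlockingQueue;
      import java.util.concurrent.BlockingQueue;

      // Simplified sketch of the dequeue/enqueue flow between producer and consumer.
      final class BufferQueueModel {
          static final class Buffer { long enqueueTimeMs; /* pixel data elided */ }

          private final BlockingQueue<Buffer> idle   = new ArrayBlockingQueue<>(3);
          private final BlockingQueue<Buffer> queued = new ArrayBlockingQueue<>(3);

          BufferQueueModel() { for (int i = 0; i < 3; i++) idle.add(new Buffer()); }

          // Producer side (the application): dequeue an idle buffer, draw into it, enqueue it.
          Buffer dequeue() throws InterruptedException { return idle.take(); }
          void enqueue(Buffer b) throws InterruptedException {
              b.enqueueTimeMs = System.currentTimeMillis(); // the receiving time recorded by the queue
              queued.put(b);
          }

          // Consumer side (the surfaceflinger): acquire a filled buffer, merge it, release it.
          Buffer acquire() throws InterruptedException { return queued.take(); }
          void release(Buffer b) throws InterruptedException { idle.put(b); }
      }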
  • the frame rate recognizer can obtain and record the receiving time of the image received by the buffer queue from the application, and calculate the time interval between the receiving time of two consecutive images received.
  • the time interval between the receiving moments of two adjacent images may reflect or represent the frame rate (rate) at which the application program draws the rendered image.
  • the frame rate recognizer can determine a frame rate equal to or close to the set frame rate of the application program according to the rate at which the application program draws the rendered image. The specific process will be described below in conjunction with FIG. 6.
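  • A minimal sketch of how the frame rate recognizer can derive a rate from consecutive receiving times follows; the class and method names are assumptions:

      // Records receiving times and derives per-image intervals (frame lengths).
      final class FrameRateRecognizerSketch {
          private long lastReceiveMs = -1;

          /** Called whenever the buffer queue receives a rendered image. Returns an
           *  instantaneous frame rate estimate in fps, or -1 for the first image. */
          int onImageReceived(long receiveMs) {
              if (lastReceiveMs < 0) { lastReceiveMs = receiveMs; return -1; }
              long frameLengthMs = receiveMs - lastReceiveMs; // interval between adjacent images
              lastReceiveMs = receiveMs;
              return (int) Math.round(1000.0 / frameLengthMs); // e.g. 20 ms -> 50 fps
          }
      }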
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the two-dimensional graphics engine is a graphics engine for two-dimensional graphics.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • When the touch sensor 180K receives a touch operation, the corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes the touch operation into the original input event (including touch coordinates, time stamp of the touch operation, etc.).
  • the original input events are stored in the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Taking an example in which the touch operation is a touch click operation and the control corresponding to the click operation is a control for the second frame rate of a game application, the application can respond to the touch operation and adjust the target frame rate from the frame rate T to the frame rate t.
  • the device may be a software device included in the operating system of the electronic device 100, that is, the device belongs to the operating system side.
  • the device may include an information collection module, a frame stabilization module, and a frame rate recognition module.
  • Application A may be an application that needs to perform dynamic picture drawing and rendering; for example, it may be a game application.
  • the application can be a third-party application.
  • the user can download and install the third-party application from an application market, etc., and the user can also uninstall the third-party application.
  • the information collection module can collect the receiving time when the operating system receives the image rendered and drawn by the application from the application, and transfer the collected receiving time to the frame stabilization module.
  • the frame stabilization module can determine the frame length of the image according to the image receiving time, and then can determine the measured frame rate of the image, and then can adjust the computing resources in combination with the current target frame rate of the frame stabilization module to achieve frame stabilization.
  • the specific process of frame stabilization and the determination method of the frame length can be referred to the above introduction, which will not be repeated here.
  • the developer of the frame rate recognition device or the developer of the operating system may preset at least one initial target frame rate.
  • the frame stabilization module can determine an initial target frame rate from at least one initial target frame rate as its current target frame rate.
  • At least one initial target frame rate may be preset according to the type of application A, for example, at least one initial target frame rate may be preset for game applications, such as 30fps, 60fps, and so on.
  • the developer of the frame rate recognition apparatus or the developer of the operating system may use the screen refresh rate of the display screen of the electronic device as the preset initial target frame rate.
  • the screen refresh rate of an electronic device is generally fixed and rarely changes; therefore, the screen refresh rate can be used as the preset initial target frame rate. If the screen refresh rate of the electronic device changes, for example, because the user changes the screen refresh rate through the device management program in the operating system, the frame stabilization device can learn the changed screen refresh rate and use the changed screen refresh rate as the preset initial target frame rate.
  • the frame stabilizing device may use the preset initial target frame rate as its current target frame rate.
  • the frame stabilization module can pass the frame length of the image to the frame rate recognition module.
  • the frame rate recognition module judges whether the frame length of N consecutive images has been received every time the frame length of an image is received.
  • the specific value of N can be preset by the developer of the frame rate recognition device or the developer of the operating system, for example, it can be 50, 100, 200, and so on.
  • the frame rate identification module may determine the measured frame rate E corresponding to the N continuous images according to the frame lengths of the N images. Then, the measured frame rate E can be compared with the current target frame rate to determine the new target frame rate.
  • the new target frame rate may be equal to the set frame rate of application A, or closer to the set frame rate of application A than the current target frame rate.
  • the frame rate recognition module may pass the new target frame rate to the frame stabilization module, so that the frame stabilization module uses the new target frame rate as the current target frame rate at the next moment to stabilize the frame.
  • The above takes as an example the case where the information collection module collects the receiving moments and the frame stabilization module calculates the frame length and sends it to the frame rate recognition module, to introduce the frame length calculation and transmission process.
  • In other embodiments, the information collection module can collect the receiving time of the image, calculate the frame length of the image according to the receiving time, and pass the frame length of the image to the frame stabilization module; the frame stabilization module then passes the frame length of the image to the frame rate recognition module.
  • the information collection module may collect the receiving time of the image, calculate the frame length of the image according to the receiving time of the image, and pass the frame length of the image to the frame stabilization module and the frame rate recognition module respectively.
  • the information collection module may transmit the receiving moment of the image collected to the frame stabilization module and the frame rate identification module respectively.
  • the frame stabilization module and the frame rate recognition module respectively calculate the frame length of the image according to the receiving moment of the image.
  • the frame rate identification device provided by this embodiment of the application can quickly identify a new target frame rate equal to or close to the set frame rate of the application when the operating system does not know, or does not accurately know, the set frame rate of the application.
  • the new target frame rate can be passed to the frame stabilization module, so that the frame stabilization module adjusts computing resources according to the new target frame rate, so as to improve the user’s experience of the application and avoid or reduce unnecessary power consumption.
  • the software framework of the electronic device may include an application and an operating system.
  • the application can be a third-party application.
  • the application may include a frame rate setting module and a drawing rendering module.
  • the frame rate setting module can set the frame rate in response to an operation initiated by the user.
  • the frame rate set by the application can be referred to as the set frame rate of the application.
  • the drawing and rendering module can occupy or call most or all of the computing resources provided by the operating system according to the set frame rate to perform image drawing and rendering. Therefore, the rate or frame rate at which the drawing and rendering module draws rendered images is jointly restricted by the set frame rate and the computing resources provided by the operating system.
  • the operating system may include an image receiving module, a frame rate recognition device, and a computing resource providing module.
  • the image receiving module may receive the image rendered by the application.
  • the image receiving module may be the surface manager shown in FIG. 2.
  • the image receiving module may be the buffer queue described above.
  • the frame rate identification device may include an information collection module, a frame stabilization module, and a frame rate identification module.
  • the information collection module can collect the receiving time of each image received by the image receiving module.
  • the frame stabilization module can determine the frame length of the image according to the interval of the receiving time of the image, and execute the frame stabilization scheme according to the frame length of the image and the current target frame rate.
  • the current target frame rate may be the preset initial target frame rate, or the frame rate recently obtained by the frame stabilization module from the frame rate identification module.
  • the frame stabilization module can pass the frame length of the image to the frame rate recognition module so that the frame rate recognition module can determine the new target frame rate.
  • the frame rate recognition module can pass the determined new target frame rate to the frame stabilization module.
  • the computing resource providing module provides computing resources for the operation of the electronic device according to the frame stabilization scheme. Most or all of the resources provided by the computing resource providing module are occupied or called by the drawing and rendering module to draw and render images.
  • In this way, a frame rate that is equal to or close to the set frame rate of the application can be quickly determined.
  • the determined frame rate can be passed to the frame stabilization module, so that the frame stabilization module adjusts computing resources according to the determined frame rate, so as to improve the user’s experience of the application and avoid or reduce unnecessary power consumption.
  • With reference to FIG. 6, the following describes an example of a frame rate identification method provided by an embodiment of the present application.
  • the method can be applied to the electronic device 100 shown in FIG. 1, and can be specifically executed by the operating system of the electronic device 100.
  • the method may include steps 601-613a (or 613b). Details are as follows.
  • In step 601, the operating system may stabilize frames for the image drawing and rendering performed by application A according to the current frame rate F.
  • Application A may be an application program that needs to perform dynamic image rendering. For details, please refer to the above introduction, and will not be repeated here.
  • the operating system can use the current frame rate F as the target frame rate for frame stabilization, and adjust the computing resources it provides to achieve frame stabilization.
  • For details of frame stabilization, please refer to the above introduction, which will not be repeated here.
  • steps 601-613a may be executed by the electronic device cyclically, where each cycle may be referred to as a round of identification process or a cycle of identification.
  • If the current round of the recognition process (or this recognition cycle) is not the first round (not the first recognition cycle) after application A starts running, the current frame rate F in this round of the recognition process is the new target frame rate determined in the previous round of the recognition process (or the previous recognition cycle).
  • the previous round of identification process (or the previous identification cycle) is the previous round of identification process adjacent to the current round of identification process (or this identification cycle).
  • If the current round of the recognition process is the first round, the current frame rate F can be the preset initial target frame rate.
  • For the preset initial target frame rate, reference may be made to the introduction of the embodiment shown in FIG. 4 above; details are not described herein again.
  • step 601 of the first round of the recognition process may be executed after the operating system receives at least one image from the application A.
  • Step 603 Record the frame length of the nth image received by the operating system from the application A.
  • the frame length of the nth image is determined by the receiving moment at which the operating system receives the nth image from application A and the receiving moment at which the operating system receives the (n-1)th image.
  • the nth image and the n-1th image are images rendered by application A.
  • the nth image and the (n-1)th image are two images successively received by the operating system from application A; that is, the (n-1)th image is adjacent to the nth image and is the image immediately preceding it.
  • the receiving time of the n-1th image is adjacent to the receiving time of the nth image, and is before the receiving time of the nth image.
  • the operating system may obtain the receiving time when it receives the n-1th image from the application A and the receiving time when it receives the nth image.
  • the operating system can obtain the entry time of the buffer corresponding to the n-1th image and the entry time of the buffer corresponding to the nth image.
  • the entry time can be used as the receiving time. Reference may be made to the above description of the embodiment shown in FIG. 2, which will not be repeated here.
  • the operating system can calculate the time interval between the receiving moment of the nth image and the receiving moment of the n-1th image, and use the calculated time interval as the frame length of the nth image.
  • For example, if the receiving time of the nth image is the 80th ms after application A is started, and the receiving time of the (n-1)th image is the 60th ms after application A is started, the frame length of the nth image is 20 ms.
  • step 605 can be executed to determine whether the frame length of N images has been recorded.
  • N can be a preset integer, such as 50, 100, 200, etc., which will not be listed here.
  • If the frame lengths of N images have not been recorded, step 603 is executed again.
  • If the frame lengths of N images have been recorded, step 607 may be executed to remove abnormal frame lengths and smooth frame length fluctuations.
  • the abnormal frame length refers to the apparently abnormal frame length caused by application bugs, network abnormalities and other reasons.
  • developers can preset an abnormality removal threshold based on experience or experiment, and remove frame lengths that exceed the threshold.
  • Smoothing frame length fluctuations refers to replacing the original frame lengths of m consecutive images with the average of the frame lengths of those m images, where m is an integer greater than 1.
  • For example, if the frame lengths of the 9th, 10th, and 11th images are 40 ms, 60 ms, and 50 ms respectively, the average of 40 ms, 60 ms, and 50 ms, that is, 50 ms, is used to represent the frame lengths of the 9th, 10th, and 11th images; that is, after smoothing, the frame lengths of the 9th, 10th, and 11th images are 50 ms, 50 ms, and 50 ms, respectively.
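  • Step 607 can be sketched as follows, assuming an illustrative outlier threshold (the application leaves the concrete value to the developer) and a smoothing window of m consecutive images:

      import java.util.ArrayList;
      import java.util.List;

      final class FrameLengthFilter {
          // Assumed abnormality removal threshold; frame lengths above it are dropped.
          static final long ABNORMAL_MS = 1000;

          /** Removes abnormal frame lengths, then replaces each window of m consecutive
           *  frame lengths with their average to smooth fluctuations. */
          static List<Long> filter(List<Long> frameLengthsMs, int m) {
              List<Long> kept = new ArrayList<>();
              for (long len : frameLengthsMs) if (len <= ABNORMAL_MS) kept.add(len);
              List<Long> smoothed = new ArrayList<>(kept);
              for (int i = 0; i + m <= kept.size(); i += m) {
                  long sum = 0;
                  for (int j = i; j < i + m; j++) sum += kept.get(j);
                  long avg = sum / m;                 // e.g. (40 + 60 + 50) / 3 = 50 ms
                  for (int j = i; j < i + m; j++) smoothed.set(j, avg);
              }
              return smoothed;
          }
      }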
  • After step 607, step 609 may be performed.
  • Alternatively, in step 605, if the frame lengths of N images have been recorded, step 609 can be performed directly without step 607.
  • In step 609, the average frame length and the measured frame rate FD1 corresponding to the average frame length can be determined.
  • Specifically, the average of the frame lengths of the N images, whether or not they were processed in step 607, can be calculated, and the measured frame rate FD1 corresponding to the N images can be determined according to the average frame length.
  • For example, by dividing a duration of 1000 milliseconds by the average frame length, the measured frame rate FD1 corresponding to the N images can be obtained.
  • If the result obtained by dividing 1000 milliseconds by the average frame length is not an integer, the result can be rounded.
  • the rounding rule can be rounding up, rounding down, or rounding to the nearest integer.
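  • Step 609 then reduces to one division plus rounding; a minimal sketch:

      final class MeasuredRate {
          /** Measured frame rate FD1 = round(1000 ms / average frame length of N images). */
          static int fd1(long[] frameLengthsMs) {
              double sum = 0;
              for (long len : frameLengthsMs) sum += len;
              double averageMs = sum / frameLengthsMs.length;
              return (int) Math.round(1000.0 / averageMs); // could also round up or down
          }
      }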
  • the frame rate range from 0 fps to the highest settable frame rate may be divided in advance into a plurality of preset frame rate intervals, where the range from 0 to the lowest settable frame rate forms one of the intervals.
  • the lowest settable frame rate can also be called the floor frame rate.
  • If the set frame rate of application A falls below this lowest settable frame rate, the user experience of application A's dynamic picture will be poor or very poor.
  • the minimum frame rate that can be set is generally 20fps.
  • the highest settable frame rate can be equal to the refresh rate of the display screen of the electronic device, for example, 60 fps.
  • the settable frame rate refers to a frame rate that can be adjusted in response to an operation initiated by the user. Taking game applications as an example, 20 fps and 25 fps are settable frame rates, while 21 fps and the like are non-settable frame rates.
  • the settable frame rate of various application types can be counted, so as to divide the frame rate interval according to the statistical result.
  • the settable frame rate of different application types can be preset based on experience or research and analysis of multiple applications.
  • the settable frame rate of the game application can be preset to be 20fps, 25fps, 30fps, 35fps, 40fps, 45fps, 50fps, 55fps, 60fps.
  • any one of the plurality of frame rate intervals divided between the lowest settable frame rate and the highest settable frame rate may include one settable frame rate.
  • In some embodiments, the settable frame rate included in an interval is the upper limit of that frame rate interval.
  • For example, the lowest settable frame rate of a game application is 20 fps, the highest settable frame rate is 60 fps, and the adjustment step of the frame rate is generally 5 fps, allowing users to adjust the target frame rate in integer multiples of 5 fps.
  • In this case, the multiple preset frame rate intervals for game applications may specifically be (0, 20], (20, 25], (25, 30], (30, 35], (35, 40], (40, 45], (45, 50], (50, 55], (55, 60].
  • In other embodiments, the settable frame rate included in an interval is close to the upper limit of that frame rate interval.
  • In this case, the multiple preset frame rate intervals can be (0, 21], (21, 26], (26, 31], (31, 36], (36, 41], (41, 46], (46, 51], (51, 56], (56, 61].
  • When the settable frame rate is the upper limit of each frame rate interval, the upper limit of the interval into which the measured frame rate falls is taken as the measured frame rate FD1.
  • When the settable frame rate is close to the upper limit of each frame rate interval, the settable frame rate close to the upper limit of that interval is used as the measured frame rate FD1.
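  • Snapping the raw rate to a frame rate interval can be sketched as follows, assuming the game-application intervals (0, 20], (20, 25], ..., (55, 60] whose settable frame rate is the interval's upper limit:

      final class IntervalSnapper {
          // Upper limits of the assumed preset frame rate intervals for a game application.
          private static final int[] UPPER_LIMITS = {20, 25, 30, 35, 40, 45, 50, 55, 60};

          /** Returns the upper limit of the interval into which rawFps falls. */
          static int snap(double rawFps) {
              for (int upper : UPPER_LIMITS) {
                  if (rawFps <= upper) return upper; // interval (previous upper, upper] matched
              }
              return UPPER_LIMITS[UPPER_LIMITS.length - 1]; // clamp to the highest settable rate
          }
      }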
  • The number N1 of images, among the N images, whose measured frame rate is higher than the threshold Y1 can also be determined.
  • Specifically, the measured frame rate of each image can be compared with the threshold Y1 to obtain the number N1 of images whose measured frame rate is higher than Y1.
  • The measured frame rate of any image can be obtained by dividing a duration of 1000 milliseconds by the frame length of that image and rounding the result; rounding can be rounding up, rounding down, or rounding to the nearest integer.
  • the threshold Y1 may be the sum of the current frame rate F plus the preset frame rate H, and the value of the preset frame rate H may be an integer such as 2, 3, etc.
  • the preset frame rate H may specifically be a frame rate smaller than the adjustment step of the target frame rate. Taking application A as a game application as an example, the preset frame rate H is a frame rate less than 5 fps, and its value can be an integer.
  • the preset frame rate H may be associated with the frame rate F, that is, different frame rates F may correspond to different preset frame rates H.
  • For example, when the frame rate F is 60 fps, the preset frame rate H may be 4 fps; when the frame rate F is 40 fps, the preset frame rate H may be 3 fps; and so on, the combinations are not listed here one by one.
  • the number N2 of images whose measured frame rate is lower than the threshold Y2 can also be determined.
  • Specifically, the measured frame rate of each image can be compared with the threshold Y2 to obtain the number N2 of images, among the N images, whose measured frame rate is lower than Y2.
  • the threshold Y2 may be the difference between the current frame rate F minus the preset frame rate K, and the value of the preset frame rate K may be an integer such as 2, 3, etc.
  • the preset frame rate K may be equal to the preset frame rate H.
  • the preset frame rate K may specifically be a frame rate smaller than the adjustment step of the target frame rate. Taking application A as a game application as an example, the preset frame rate K is a frame rate less than 5 fps, and its value is an integer.
  • the preset frame rate K may be associated with the frame rate F; that is, different frame rates F may correspond to different preset frame rates K.
  • For example, when the frame rate F is 60 fps, the preset frame rate K may be 4 fps; when the frame rate F is 40 fps, the preset frame rate K may be 3 fps; and so on, the combinations are not listed here one by one.
  • the average value FL of the sleep time of the rendering thread may also be determined.
  • Specifically, the operating system can obtain the sleep duration of the rendering thread of application A while application A is drawing and rendering images. Taking the nth image as an example, the operating system obtains the activity duration of the rendering thread of application A detected after receiving the (n-1)th image and before receiving the nth image, and calculates the difference between the frame length of the nth image and the activity duration to obtain the sleep duration of the rendering thread corresponding to the nth image.
  • the kernel layer can obtain the running status of each thread running in the CPU, including the rendering thread of application A, so that the activity duration of the rendering thread of application A can be detected.
  • applications such as games can have multiple rendering threads, and the active duration of the rendering thread here specifically refers to the active duration of the main rendering thread.
  • the main rendering thread is the rendering thread with the longest active duration among the multiple rendering threads.
  • the average sleep duration FL can be calculated from the sleep durations of the rendering thread corresponding to each of the N images.
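  • The per-round statistics N1, N2, and FL described above can be gathered in one pass, with the thresholds Y1 = F + H and Y2 = F - K; a sketch under assumed names:

      final class RoundStats {
          int n1;       // number of images whose measured frame rate is above Y1
          int n2;       // number of images whose measured frame rate is below Y2
          double flMs;  // average sleep duration FL of the rendering thread, in ms

          /** frameLengthsMs and sleepMs are per-image arrays of equal length N. */
          static RoundStats collect(long[] frameLengthsMs, long[] sleepMs, int f, int h, int k) {
              RoundStats s = new RoundStats();
              long sleepSum = 0;
              for (int i = 0; i < frameLengthsMs.length; i++) {
                  int fps = (int) Math.round(1000.0 / frameLengthsMs[i]);
                  if (fps > f + h) s.n1++;  // above threshold Y1
                  if (fps < f - k) s.n2++;  // below threshold Y2
                  sleepSum += sleepMs[i];
              }
              s.flMs = (double) sleepSum / frameLengthsMs.length;
              return s;
          }
      }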
  • In step 611, it can be determined whether condition 1 is satisfied.
  • Condition 1 is: FD1 < F, and FL > P1, and N2 > P2.
  • If condition 1 is satisfied, step 613a is executed: the measured frame rate FD1 is set as the new frame rate F.
  • When computing resources are sufficient, application A can quickly complete the drawing and rendering of an image; and when the set frame rate of application A is small, the time interval between the starting drawing moments of adjacent images is longer. The foregoing two factors may cause the sleep duration of the rendering thread to become longer.
  • When application A performs image drawing and rendering at or close to its set frame rate, even if computing resources are sufficient, it is limited by the lower set frame rate (that is, the set frame rate of application A is the factor that limits the speed of drawing rendered images), so the measured frame rate of the images will be relatively low, resulting in an increase in the number of images whose measured frame rate is lower than the threshold Y2.
  • P1 and P2 can be preset based on experience or experiment; for example, the value of P1 can be 10 ms and the value of P2 can be 20. If the values of P1, P2, etc. are preset according to experiment, the following experimental scheme can be used.
  • the values of P1 and P2 are determined by comparing the differences, between the two sets of experiments, in characteristic values such as the measured frame rate FD1, the average sleep duration FL of the rendering thread, and the number N2 of images whose measured frame rate is less than the threshold Y2.
  • In step 611, it can also be determined whether condition 2 is satisfied. Condition 2 involves FD2 and FD3, the measured frame rates corresponding to the average frame lengths of the images determined in the two rounds of the recognition process (recognition cycles) immediately preceding the current round (this recognition cycle).
  • the previous two rounds and the current round are three consecutive rounds of the identification process (identification cycles).
  • the target frame rate used for frame stabilization in the previous two rounds is equal to the target frame rate used for frame stabilization in the current round; that is, the frame rate F in the previous two rounds of the recognition process is equal to the frame rate F in the current round.
  • If condition 2 is satisfied, step 613a is executed: the measured frame rate FD1 is set as the new frame rate F.
  • The measured frame rates corresponding to the average frame lengths of the images in three consecutive rounds of recognition being all equal indicates that the measured frame rate FD1 is likely to be equal to or close to the set frame rate of application A.
  • As analyzed above, when application A performs image drawing and rendering at or close to its set frame rate, even if computing resources are sufficient, it is limited by the lower set frame rate (that is, the set frame rate of application A is the factor that limits the speed of drawing rendered images), so the frame rate of drawing rendered images will be relatively low, resulting in an increase in the number of images whose measured frame rate is lower than the threshold Y2.
  • When the number of images whose measured frame rate is lower than the threshold Y2 is greater than P3, it further reflects that the set frame rate of application A is lower than the frame rate F.
  • P3 can be preset according to experience or experiment, and the value of P3 can be 10. When setting the value of P3 in advance according to experiment, the process of the experiment can refer to the introduction of condition 1 above.
  • In step 611, it can also be determined whether condition 3 is satisfied. Condition 3 is: FD1 > F.
  • If condition 3 is satisfied, in some embodiments the measured frame rate FD1 may be used as the new frame rate F.
  • In other embodiments, the first frame rate, obtained by adding the current frame rate F and a preset frame rate S, may be used as the new frame rate F.
  • the preset frame rate S may be equal to the difference between adjacent settable frame rates.
  • Alternatively, the preset frame rate S may be equal to the interval length of the above-mentioned preset frame rate intervals, where the interval is any interval other than the one formed from 0 fps to the lowest settable frame rate. Taking application A as a game application as an example, the interval length of any such interval is 5 fps; that is, the preset frame rate S may be 5 fps.
  • In still other embodiments, the larger of the measured frame rate FD1 and the first frame rate may be used as the new frame rate F.
  • the first frame rate can refer to the introduction of the previous example, which will not be repeated here.
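  • Putting the conditions together, one round's target-rate update can be sketched as below; condition 2 and condition 4 (the latter is described further below) are reconstructed from the surrounding description, and the order of the checks is an assumption:

      final class TargetRateUpdater {
          /** Decides the new frame rate F for one identification round. fd2 and fd3 are
           *  the measured frame rates of the two preceding rounds; flMs, n1 and n2 are
           *  this round's statistics; p1Ms, p2, p3, p4 and s are the preset values. */
          static int update(int f, int fd1, int fd2, int fd3,
                            double flMs, int n1, int n2,
                            double p1Ms, int p2, int p3, int p4, int s) {
              // Condition 1: FD1 < F, long rendering-thread sleeps, many frames below Y2.
              if (fd1 < f && flMs > p1Ms && n2 > p2) return fd1;
              // Condition 2: the same measured rate in three consecutive rounds, below F.
              if (fd1 < f && fd1 == fd2 && fd2 == fd3 && n2 > p3) return fd1;
              // Condition 3: measured rate above F; step up, but never below FD1.
              if (fd1 > f) return Math.max(fd1, f + s);
              // Condition 4: FD1 equals F yet many frames exceed Y1; step up by S.
              if (fd1 == f && n1 > p4) return f + s;
              return f; // step 613b: keep the current target frame rate
          }
      }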
  • When condition 3 is satisfied, insufficient computing resources limit application A, so the frame rate at which application A draws rendered images is lower than its set frame rate; therefore, there may still be a big gap between the measured frame rate FD1 (or the first frame rate) and the set frame rate of application A.
  • In this case, the frame rate F can be continuously updated through subsequent rounds of the frame rate recognition process (recognition cycles), so that the updated frame rate F is closer to or equal to the set frame rate of application A. Details are as follows.
  • the new frame rate F determined by 613a in the current round of recognition process can be used as the current frame rate F used as a stable frame in the next round of recognition process (recognition cycle).
  • That is, the current frame rate F of the next round of the recognition process is the new frame rate F determined in the current round.
  • Then, the measured frame rate of the next round can be determined and compared in step 611 of the next round, and the new frame rate F determined in the next round is closer to the frame rate set by application A than the frame rate F determined in this round.
  • a frame rate F equal to or close to the set frame rate of application A can be determined in a limited round of recognition process.
  • For example, the (m+1)th round is the next round after the mth round and is the recognition process adjacent to the mth round. Suppose the new frame rate F determined in the mth round is the frame rate Z.
  • In the (m+1)th round, the frame rate Z is used as the current frame rate F.
  • the relationship between the measured frame rate FD1 determined in the (m+1)th round and the frame rate Z can be compared, and then a new frame rate F can be determined in the (m+1)th round.
  • the new frame rate F determined in the (m+1)th round is closer to the set frame rate of application A than the current frame rate F, or may be equal to the set frame rate of application A. In this way, a frame rate F that is equal to or close to the frame rate set by application A can be determined within a limited number of rounds of the identification process.
  • It should be noted that when the operating system detects that the measured frame rate of the images drawn by application A is higher than the target frame rate used in the frame stabilization scheme (that is, the current frame rate F), the operating system will lower the computing resources. It is easy to understand that when application A was drawing the N images, computing resources were insufficient, and the frame rate at which it drew rendered images was lower than its set frame rate; that is, insufficient resources were the factor limiting the speed at which application A drew rendered images. If the operating system continuously detects that the measured frame rate of the images drawn by application A is higher than the frame rate F, the operating system will continuously reduce the computing resources so that the measured frame rate of the images drawn by application A approaches or equals the current frame rate F.
  • Conversely, if the operating system detects that the measured frame rate of the images drawn by application A is lower than the frame rate F, the operating system will increase the computing resources so that the measured frame rate of the images drawn by application A approaches or equals the current frame rate F.
  • the measured frame rate of the image rendered by the application A can be stabilized at the frame rate F, that is, the measured frame rate FD1 is equal to the current frame rate F.
  • However, if the number N1 of images whose measured frame rate is higher than the threshold Y1 is greater than P4 (condition 4), it reflects that the set frame rate of application A is greater than the current frame rate F.
  • the value of P4 can be preset according to experience or experiment.
  • the value of P4 can be 15.
  • If the value of P4 is preset according to experiment, the following scheme can be adopted.
  • Take a game application as an example: the operating system can be set to stabilize frames at a lower target frame rate (for example, 30 fps) while the set frame rate of the application is set to a higher frame rate (for example, 40 fps), and during the running of the game application, characteristic values such as the number of images whose measured frame rate is higher than the threshold Y1 and the number of images whose measured frame rate is lower than the threshold Y2 can be observed. The target frame rate used by the operating system for frame stabilization can also be set equal to the set frame rate of the application, and then the same characteristic values can be observed during the running of the game application.
  • the P4 value is determined by comparing the difference in characteristic values observed in the two sets of experiments.
  • When condition 4 is met, the measured frame rate FD1 can hardly express or reflect the set frame rate of application A; although the new frame rate F is closer to the set frame rate of application A than the current frame rate F, there may still be a large gap between the new frame rate F and the set frame rate of application A.
  • the frame rate F can be continuously updated through the subsequent round of frame rate recognition process (recognition period), so that the updated frame rate F is closer to or equal to the set frame rate of the application A.
  • For the specific process, reference may be made to the introduction of step 613a when condition 3 is satisfied, which will not be repeated here.
  • If none of the foregoing conditions is satisfied, step 613b may be executed, and the current frame rate F is not updated.
  • the current frame rate F can be used as the current frame rate F for the next round of recognition process.
  • step 601 can be executed again to start the next round of identification process.
  • the new frame rate F determined in step 613a in this round, or the frame rate F left unchanged in step 613b, is used as the target frame rate of the next round, that is, the current frame rate F of the next round.
  • For example, round B is the identification process in the round following round A.
  • Suppose the new frame rate determined in step 613a in round A is the frame rate F'.
  • In round B, the operating system uses the frame rate F' as the current frame rate to execute the frame stabilization scheme.
  • the frame rate identification method provided by the embodiments of the present application can quickly identify a frame rate equal to or close to the set frame rate of the application when the operating system does not know or does not accurately know the set frame rate of the application, and
  • the identified frame rate can be used as the target frame rate used by the operating system for frame stabilization, so as to improve the user experience of the application and avoid or reduce unnecessary power consumption.
  • the value of N can be set to 100, the value of P1 is 10ms, the value of P2 is 20, the value of P3 is 10, and the value of P4 is 15.
  • the preset multiple frame rate intervals are (0, 20], (21, 25], (26, 30], (31, 35], (36, 40], (41, 45], (46, 50], (51, 55], (56, 60].
  • the user can adjust the frame rate from the original 20fps to 25fps through the frame rate setting interface of the game application.
  • This 25fps can be referred to as the set frame rate of the game application.
  • the test results of 999 out of these 1,000 tests show that, using the frame rate identification method provided by this embodiment of the application, the set frame rate of the game application can be recognized within 5 seconds.
  • Afterwards, the operating system uses 25 fps as the target frame rate to stabilize the image drawing and rendering of the game application.
  • the user can adjust the frame rate from the original 30fps to 25fps through the frame rate setting interface of the game application.
  • the test results of 999 out of these 1,000 tests show that, using the frame rate identification method provided by this embodiment of the application, the set frame rate of the game application can be recognized within 5 seconds.
  • Afterwards, the operating system uses 25 fps as the target frame rate to stabilize the image drawing and rendering of the game application.
  • the user can adjust the frame rate from the original 40fps to 60fps through the frame rate setting interface of the game application.
  • the game application does not inform the operating system that its set frame rate has been adjusted to 60fps
  • the test results of 1,000 out of these 1,000 tests show that, using the frame rate identification method provided by this embodiment of the application, the set frame rate of the game application can be recognized within 5 seconds.
  • Afterwards, the operating system uses 60 fps as the target frame rate to stabilize the image drawing and rendering of the game application.
  • the user can adjust the frame rate from the original 40fps to 30fps through the frame rate setting interface of the game application.
  • the game application does not inform the operating system that its set frame rate has been adjusted to 30fps
  • the test results of 1,000 out of these 1,000 tests show that, using the frame rate identification method provided by this embodiment of the application, the set frame rate of the game application can be recognized within 5 seconds.
  • Afterwards, the operating system uses 30 fps as the target frame rate to stabilize the image drawing and rendering of the game application.
  • the method shown in FIG. 6 can quickly and accurately identify a frame rate equal to or close to the application's set frame rate: the response time is less than 5 seconds, the accuracy rate is greater than 99.9%, the false negative rate is less than 0.1%, and the recognition granularity can be as fine as 5 fps. In this way, the set frame rates of various games can be identified without relying on the third-party games themselves and, combined with the frame stabilization scheme, the user's gaming experience is effectively improved.
  • the embodiment of the present application provides a frame rate identification method, which can be applied to electronic devices.
  • the method includes the following steps.
  • Step 701 Perform frame stabilization on the image drawing and rendering performed by the first application according to the current target frame rate. For details, refer to the above description of step 601 in FIG. 6 for implementation.
  • Step 703 Determine the frame lengths of N continuous images according to the receiving moments of receiving the drawing and rendering results of the first application, where the N continuous images are images drawn and rendered by the first application. For details, refer to the above description of steps 603 and 605 in FIG. 6 for implementation.
  • Step 705 Determine the measured frame rate according to the frame lengths of the N continuous images. For details, refer to the above description of step 609 in FIG. 6 for implementation.
  • Step 707 Determine a new target frame rate according to the measured frame rate and the current target frame rate, so as to stabilize the frames of the image drawing and rendering performed by the first application according to the new target frame rate.
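  • As a structural sketch only, the four steps can be strung together as a per-round loop; the decision in step 707 is collapsed here to its simplest case, and all names are assumptions:

      final class FrameRateIdentifier {
          private int targetFps; // current target frame rate used for frame stabilization

          FrameRateIdentifier(int initialTargetFps) { this.targetFps = initialTargetFps; }

          /** One identification round (steps 701-707) over the frame lengths of N images. */
          int runRound(long[] frameLengthsMs) {
              // Step 701: frames are stabilized against targetFps (elided here).
              // Step 703: frameLengthsMs is assumed to have been derived from receiving moments.
              // Step 705: measured frame rate from the average frame length.
              double sum = 0;
              for (long len : frameLengthsMs) sum += len;
              int fd1 = (int) Math.round(1000.0 / (sum / frameLengthsMs.length));
              // Step 707 (simplified): adopt FD1 when it differs from the current target.
              if (fd1 != targetFps) targetFps = fd1;
              return targetFps; // used as the current target frame rate in the next round
          }
      }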
  • the determining the measured frame rate according to the frame lengths of the N continuous images includes: determining a first interval from a plurality of preset frame rate intervals according to the average frame length of the N continuous images; and determining the measured frame rate according to the first interval. For details, refer to the above description of step 609 in FIG. 6 for implementation.
  • the determining the actual measured frame rate according to the first interval includes: using the upper limit frame rate of the first interval as the actual measured frame rate.
  • the determining a new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is equal to the current target frame rate and the number of images of the first type in the N continuous images is greater than a first threshold, using the first frame rate, obtained by adding the current target frame rate and a first preset frame rate, as the new target frame rate; wherein the frame rate corresponding to the first type of image is greater than a second frame rate, and the second frame rate is obtained by adding a second preset frame rate to the current target frame rate.
  • the determining a new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is equal to the current target frame rate, the number of images of the first type in the N continuous images is greater than the first threshold, and the number of images of the first type in the N continuous images is greater than the number of images of the second type in the N continuous images, using the first frame rate, obtained by adding the current target frame rate and the first preset frame rate, as the new target frame rate; wherein the frame rate corresponding to the first type of image is greater than the second frame rate, the second frame rate is obtained by adding the second preset frame rate to the current target frame rate, the frame rate corresponding to the second type of image is less than a third frame rate, and the third frame rate is obtained by subtracting a third preset frame rate from the current target frame rate.
  • the determining a new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is greater than the current target frame rate, setting the The measured frame rate is used as the new target frame rate.
  • the determining a new target frame rate according to the measured frame rate and the current target frame rate includes: using the larger frame rate of the measured frame rate and the first frame rate as In the new target frame rate, the first frame rate is obtained by adding the current target frame rate and a first preset frame rate.
  • the determining a new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is less than the current target frame rate, setting the The measured frame rate is used as the new target frame rate.
  • the determining a new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is less than the current target frame rate, and the average sleep duration of the rendering thread when the first application performs the drawing and rendering of the N continuous images is greater than a second threshold, using the measured frame rate as the new target frame rate.
  • the determining a new target frame rate according to the measured frame rate and the current target frame rate includes: when the measured frame rate is less than the current target frame rate, the average sleep duration of the rendering thread when the first application performs the drawing and rendering of the N continuous images is greater than the second threshold, and the number of images of the second type in the N continuous images is greater than a third threshold, using the measured frame rate as the new target frame rate; wherein the frame rate corresponding to the second type of image is less than a third frame rate, and the third frame rate is obtained by subtracting a third preset frame rate from the current target frame rate.
  • the measured frame rate includes at least two measured frame rates; the determining the new target frame rate according to the measured frame rate includes: when the at least two measured frame rates are each less than the current target frame rate and are equal to each other, using the measured frame rate as the new target frame rate.
  • the measured frame rate includes at least two measured frame rates; the determining the new target frame rate according to the measured frame rate includes: when the at least two measured frame rates are each less than the current target frame rate and are equal to each other, and the number of images of the second type among the N consecutive images is greater than a fourth threshold, using the measured frame rate as the new target frame rate; wherein the frame rate corresponding to an image of the second type is less than the third frame rate, and the third frame rate is obtained by subtracting the third preset frame rate from the current target frame rate.
  • the frame rate identification method provided by the embodiments of the present application can quickly identify a frame rate equal to or close to the frame rate set by an application even when the operating system does not know, or does not accurately know, that set frame rate, and
  • the identified frame rate can be used as the target frame rate the operating system uses for frame stabilization, improving the user experience of the application and avoiding or reducing unnecessary power consumption; a hedged sketch of the decision logic described above follows this list.
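To make the decision rules above concrete, the following is a minimal sketch in Java of how a new target frame rate might be chosen from the measured frame rate, the current target frame rate, the counts of first-type and second-type images among the N consecutive images, and the average sleep time of the rendering thread. The class name FrameRateGovernor, the method newTargetFrameRate, and every threshold and preset value are assumptions chosen for illustration, not details taken from the patent.

```java
// Hypothetical sketch of the new-target-frame-rate decision described above.
// All names and constants are illustrative assumptions, not the patent's code.
public class FrameRateGovernor {
    // Preset frame-rate steps and thresholds (assumed values).
    private static final int FIRST_PRESET = 5;        // step added to form the first frame rate
    private static final int SECOND_PRESET = 3;       // step added to form the second frame rate
    private static final int THIRD_PRESET = 3;        // step subtracted to form the third frame rate
    private static final int FIRST_THRESHOLD = 10;    // count of first-type images
    private static final long SECOND_THRESHOLD_US = 2000; // average rendering-thread sleep, microseconds
    private static final int THIRD_THRESHOLD = 10;    // count of second-type images

    /**
     * @param measured        measured frame rate over the N consecutive images
     * @param currentTarget   target frame rate currently used for frame stabilization
     * @param firstTypeCount  images whose frame rate exceeds currentTarget + SECOND_PRESET
     * @param secondTypeCount images whose frame rate is below currentTarget - THIRD_PRESET
     * @param avgSleepUs      average sleep time of the rendering thread over the N images
     */
    public int newTargetFrameRate(int measured, int currentTarget,
                                  int firstTypeCount, int secondTypeCount,
                                  long avgSleepUs) {
        if (measured > currentTarget) {
            // The application renders faster than the current cap: raise the target,
            // e.g. to the larger of the measured rate and currentTarget + FIRST_PRESET.
            return Math.max(measured, currentTarget + FIRST_PRESET);
        }
        if (measured == currentTarget && firstTypeCount > FIRST_THRESHOLD
                && firstTypeCount > secondTypeCount) {
            // Many individual frames arrive noticeably faster than the target:
            // probe upward by one preset step.
            return currentTarget + FIRST_PRESET;
        }
        if (measured < currentTarget && avgSleepUs > SECOND_THRESHOLD_US
                && secondTypeCount > THIRD_THRESHOLD) {
            // The rendering thread sleeps a lot yet still misses the target, so the
            // application's own set frame rate is likely lower: follow the measured rate.
            return measured;
        }
        return currentTarget; // otherwise keep stabilizing at the current target
    }
}
```

A real system would likely also clamp the result to the refresh rates the display actually supports; that bookkeeping is omitted to keep the sketch short.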
  • the embodiment of the present application provides a frame rate identification device 800.
  • the device 800 includes:
  • the frame stabilization unit 810 is configured to stabilize the frames of the image drawing and rendering performed by the first application according to the current target frame rate;
  • the first determining unit 820 is configured to determine the frame lengths of N consecutive images according to the receiving moments at which the drawing and rendering results of the first application are received, where the N consecutive images are images drawn and rendered by the first application;
  • the second determining unit 830 is configured to determine the measured frame rate according to the frame lengths of the N consecutive images;
  • the third determining unit 840 is configured to determine a new target frame rate according to the measured frame rate and the current target frame rate, so as to stabilize the frames of the image drawing and rendering performed by the first application according to the new target frame rate; a sketch of how units 820 and 830 might derive the measured frame rate follows below.
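As one illustration of what the first determining unit 820 and the second determining unit 830 could compute, here is a minimal sketch, assuming the receiving moments are nanosecond timestamps and the measured frame rate is the reciprocal of the average frame length over a window of N consecutive images. The class name FrameRateMeter, the method onFrameReceived, and the window size N = 60 are assumptions, not details from the patent.

```java
import java.util.ArrayDeque;

// Hypothetical sketch: derive frame lengths from the receiving moments of the
// first application's drawing-and-rendering results, then a measured frame
// rate over the latest N consecutive images. Names and N are illustrative.
public class FrameRateMeter {
    private static final int N = 60; // assumed window of consecutive images
    private final ArrayDeque<Long> frameLengthsNs = new ArrayDeque<>();
    private long lastReceiveNs = -1;

    /** Call each time a drawing-and-rendering result is received. */
    public void onFrameReceived(long receiveNs) {
        if (lastReceiveNs >= 0) {
            // Frame length of an image = gap between successive receiving moments.
            frameLengthsNs.addLast(receiveNs - lastReceiveNs);
            if (frameLengthsNs.size() > N) {
                frameLengthsNs.removeFirst(); // keep only the latest N images
            }
        }
        lastReceiveNs = receiveNs;
    }

    /** Measured frame rate over the window, or -1 if fewer than N images seen. */
    public double measuredFrameRate() {
        if (frameLengthsNs.size() < N) {
            return -1;
        }
        long totalNs = 0;
        for (long lengthNs : frameLengthsNs) {
            totalNs += lengthNs;
        }
        double avgFrameLengthSec = (totalNs / (double) N) / 1e9;
        return 1.0 / avgFrameLengthSec; // e.g. ~16.7 ms per frame -> ~60 fps
    }
}
```

Because only the receiving moments of the drawing-and-rendering results are needed, such a meter can run entirely in the operating system without cooperation from the application.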
  • each electronic device includes corresponding hardware structures and/or software modules for performing each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a given function is executed by hardware or by computer software driving hardware depends on the specific application and the design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of this application.
  • the embodiments of the present application may divide the electronic device and the like into functional modules according to the method embodiment shown in FIG. 7.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module; the integrated module may be implemented in the form of hardware or a software functional module. It should be noted that the division of modules in the embodiments of the present application is illustrative and is only a logical function division; other division methods are possible in actual implementation.
  • the device provided by the embodiment of the present application can quickly identify a frame rate equal to or close to the frame rate set by the application when the operating system does not know, or does not accurately know, that set frame rate, and can use the identified frame rate as the target frame rate the operating system uses for frame stabilization, improving the user experience of the application and avoiding or reducing unnecessary power consumption.
  • the electronic device may include a processor 910 and a memory 920.
  • the memory 920 is used to store computer-executable instructions; when the electronic device runs, the processor 910 executes the computer-executable instructions stored in the memory 920, so that the electronic device performs the method shown in FIG. 7.
  • the processor 910 is configured to stabilize the frames of the image drawing and rendering performed by the first application according to the current target frame rate; to determine the frame lengths of N consecutive images according to the receiving moments at which the drawing and rendering results of the first application are received, where the N consecutive images are images drawn and rendered by the first application; to determine the measured frame rate according to the frame lengths of the N consecutive images; and to determine a new target frame rate according to the measured frame rate and the current target frame rate, so as to stabilize the frames of the image drawing and rendering performed by the first application according to the new target frame rate.
  • the electronic device further includes a communication bus 930, through which the processor 910 can communicate with the memory 920 to obtain and execute the computer-executable instructions stored in the memory 920.
  • when the operating system does not know, or does not accurately know, the application's set frame rate, it can quickly identify a frame rate equal to or close to that set frame rate and use the identified frame rate as the target frame rate the operating system uses for frame stabilization, improving the user experience of the application and avoiding or reducing unnecessary power consumption; a sketch of sleep-based frame stabilization follows below.
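For context, one common way to stabilize frames is for the operating system to make the rendering thread sleep between frames so that the effective rate does not exceed the target, which is also why the decision rules above examine the rendering thread's average sleep time. The following is a minimal sketch under that assumption; the class name FrameStabilizer and the exact pacing strategy are illustrative, not a definitive description of the patented method.

```java
// Hypothetical sketch of sleep-based frame stabilization at a target frame rate.
// The pacing strategy and all names are assumptions for illustration only.
public class FrameStabilizer {
    private volatile int targetFrameRate; // frames per second
    private long nextFrameDeadlineNs = -1;

    public FrameStabilizer(int initialTargetFrameRate) {
        this.targetFrameRate = initialTargetFrameRate;
    }

    /** Update the target, e.g. with a new target frame rate decided as above. */
    public void setTargetFrameRate(int fps) {
        this.targetFrameRate = fps;
    }

    /**
     * Called on the rendering thread after each frame is submitted; sleeps
     * until the next frame slot so the effective rate stays at or below the target.
     */
    public void throttle() throws InterruptedException {
        long frameIntervalNs = 1_000_000_000L / targetFrameRate;
        long nowNs = System.nanoTime();
        if (nextFrameDeadlineNs < 0) {
            nextFrameDeadlineNs = nowNs + frameIntervalNs;
            return;
        }
        long sleepNs = nextFrameDeadlineNs - nowNs;
        if (sleepNs > 0) {
            // This sleep is what the rules above observe as the average sleep time.
            Thread.sleep(sleepNs / 1_000_000L, (int) (sleepNs % 1_000_000L));
        }
        // Advance the deadline, resetting it if we overslept past the old one.
        nextFrameDeadlineNs = Math.max(nowNs, nextFrameDeadlineNs) + frameIntervalNs;
    }
}
```

When the application falls behind the target because of load, the sleep time here shrinks toward zero; a large average sleep time together with a low measured frame rate therefore suggests the application itself caps its frame rate, which is exactly the situation in which the rules above lower the target.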
  • the method steps in the embodiments of the present application can be implemented by hardware, or can be implemented by a processor executing software instructions.
  • Software instructions can be composed of corresponding software modules, which can be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art.
  • An exemplary storage medium is coupled to the processor, so that the processor can read information from the storage medium and can write information to the storage medium.
  • the storage medium may also be an integral part of the processor.
  • the processor and the storage medium may be located in an ASIC.
  • the above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • when implemented by software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center via wired means (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or a data center that integrates one or more available media.
  • the available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid-state drive (SSD)).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention relates to the technical field of electronic devices, and in particular to a frame rate identification method and an electronic device. The method comprises: stabilizing, according to a current target frame rate, image rendering performed by a first application; determining the frame lengths of N consecutive images according to the receiving moment at which a rendering result of the first application is received, the N consecutive images being images rendered by the first application; determining a measured frame rate according to the frame lengths of the N consecutive images; and determining a new target frame rate according to the measured frame rate and the current target frame rate, so as to stabilize, according to the new target frame rate, the image rendering performed by the first application.
PCT/CN2020/108714 2019-09-19 2020-08-12 Procédé d'identification de fréquence de trames et dispositif électronique WO2021052070A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910888838.1 2019-09-19
CN201910888838.1A CN112516590A (zh) 2019-09-19 2019-09-19 一种帧率识别方法及电子设备

Publications (1)

Publication Number Publication Date
WO2021052070A1 (fr) 2021-03-25

Family

ID=74884005

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/108714 WO2021052070A1 (fr) 2019-09-19 2020-08-12 Procédé d'identification de fréquence de trames et dispositif électronique

Country Status (2)

Country Link
CN (1) CN112516590A (fr)
WO (1) WO2021052070A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114510274A (zh) * 2020-11-16 2022-05-17 深圳市万普拉斯科技有限公司 电子设备的画面帧率调整方法、电子装置及存储介质
CN115904184B (zh) * 2021-09-30 2024-03-19 荣耀终端有限公司 数据处理方法和相关装置
CN114442792A (zh) * 2022-02-09 2022-05-06 北京小米移动软件有限公司 处理器的运行频率调整方法、装置及存储介质
CN115278366B (zh) * 2022-09-28 2023-03-24 天津卓朗昆仑云软件技术有限公司 虚拟机视频流的数据处理方法、装置以及电子设备
CN115665482B (zh) * 2022-11-09 2023-06-30 腾讯科技(深圳)有限公司 视频渲染方法、装置、计算机设备和存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8291460B1 (en) * 2010-02-12 2012-10-16 Adobe Systems Incorporated Rate adaptation based on dynamic performance monitoring
US20130163953A1 (en) * 2010-06-18 2013-06-27 Adobe Systems Incorporated Media player instance throttling
CN106603543A (zh) * 2016-12-22 2017-04-26 努比亚技术有限公司 校正流媒体音视频同步的方法及装置
CN107610039A (zh) * 2016-07-12 2018-01-19 联发科技股份有限公司 图像处理方法及图像处理装置
CN109165103A (zh) * 2018-10-15 2019-01-08 Oppo广东移动通信有限公司 帧率控制方法、装置、终端及存储介质

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103366391B (zh) * 2013-06-26 2016-07-06 广州市动景计算机科技有限公司 动态图像的画面渲染方法及画面渲染装置
JP2018007210A (ja) * 2016-07-08 2018-01-11 ソニーセミコンダクタソリューションズ株式会社 信号処理装置および方法、並びに撮像装置
CN107786748B (zh) * 2017-10-31 2021-04-13 Oppo广东移动通信有限公司 图像显示方法及设备
KR102586695B1 (ko) * 2018-02-09 2023-10-11 삼성전자주식회사 디스플레이 장치 및 그 제어 방법
CN108762465B (zh) * 2018-03-27 2020-06-30 Oppo广东移动通信有限公司 帧率自适应调整方法、装置、存储介质及智能终端
CN109857559B (zh) * 2019-01-25 2021-04-06 维沃移动通信有限公司 终端控制方法及终端
CN109800141B (zh) * 2019-01-28 2020-08-18 Oppo广东移动通信有限公司 Gpu性能瓶颈的确定方法、装置、终端及存储介质

Also Published As

Publication number Publication date
CN112516590A (zh) 2021-03-19

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20866610

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20866610

Country of ref document: EP

Kind code of ref document: A1