CN111880602A - Dropped frame detection method and device - Google Patents

Dropped frame detection method and device Download PDF

Info

Publication number
CN111880602A
Authority
CN
China
Prior art keywords
frame
time
time information
reference frame
actual display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010767819.6A
Other languages
Chinese (zh)
Other versions
CN111880602B (en)
Inventor
程立
王涵
顾云建
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202010767819.6A priority Critical patent/CN111880602B/en
Publication of CN111880602A publication Critical patent/CN111880602A/en
Application granted granted Critical
Publication of CN111880602B publication Critical patent/CN111880602B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/04Generating or distributing clock signals or signals derived directly therefrom
    • G06F1/06Clock generators producing several clock signals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/04Generating or distributing clock signals or signals derived directly therefrom
    • G06F1/12Synchronisation of different clock signals provided by a plurality of clock generators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/04Generating or distributing clock signals or signals derived directly therefrom
    • G06F1/14Time supervision arrangements, e.g. real time clock
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/18Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Operations Research (AREA)
  • Probability & Statistics with Applications (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Algebra (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The disclosure relates to a dropped frame detection method and device. The method relates to screen display technology and solves the problem of erroneous results caused by detecting dropped frames at only a single link of the pipeline. The method comprises the following steps: acquiring the time information at which the application program initiates drawing during the application image composition and display process; acquiring the time information at which frames are actually displayed during the same process; and comparing the drawing-initiation time information with the actual display time information to determine the number of dropped frames. The technical scheme provided by the disclosure is applicable to the screen display of different applications at different refresh rates, and realizes accurate, efficient and low-cost dropped frame detection.

Description

Dropped frame detection method and device
Technical Field
The present disclosure relates to screen display technologies, and in particular, to a method and an apparatus for detecting dropped frames.
Background
Android is one of the mainstream systems on current mobile terminals, and its fluency has always been a key concern for all parties. The frame drop rate of an application (APP) is an important index for describing fluency, so after a system application or a third-party application is updated iteratively, the frame drop rate of the new version often needs to be evaluated. Frame information can be recorded at runtime through a frame display trace record (FrameTracker) and a dropped frame trace record (JankTracker) to describe the application's frame drop rate. Engineers calculate the application's frame drop rate from the information given by FrameTracker or JankTracker. However, with the development of the Android system, a frame drop rate calculated from FrameTracker or JankTracker information alone carries a large error.
At runtime, the Android system records through JankTracker the planned synchronization (IntendedVsync) time and the vertical synchronization (Vsync) time of each frame, which respectively represent the point in time when the application expects to draw and the point in time when the application actually starts to draw. Scheme one describes how late the previous frame was by their difference:
JankPeriod = ceil((Vsync - IntendedVsync)/RefreshPeriod)    (1)
in the formula (1), ceil () is an rounding-up function, JankPeriod is the number of frames lost in the previous frame, and RefreshPeriod is the screen refresh period.
A general image composition and display process is shown in fig. 1 and generally includes four stages: the APP initiates drawing commands to the GPU on the interface thread (UIThread) and the render thread (RenderThread); GPU hardware (HW) draws into a buffer (Buffer), which flows through a buffer queue (BufferQueue); the window compositor (SurfaceFlinger) composites and returns buffers through BufferQueue; and the result is submitted to HWComposer and the display controller (Display Controller). When formula (1) is used for calculation, only the stage in which the APP initiates drawing commands is covered; the subsequent links of the process are not considered, so frame drops in those links cannot be evaluated.
Meanwhile, the Android graphics display scheme introduced Project Butter, shown in fig. 2: a buffer is added to mitigate frame drops caused by software drawing being slower than the actual display. This causes severe errors as soon as scheme one is used to evaluate dropped frames. For example, the time for compositing buffers B and C is normally 1 RefreshPeriod; if this composition times out for some reason, scheme one would count 1 dropped frame at each of B and C. However, as shown in fig. 2, B and C are displayed continuously, and no frames are actually dropped. The above scheme therefore produces serious errors when evaluating the dropped frame situation.
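A minimal numeric sketch of this error (the timestamps and the 60 Hz period are illustrative assumptions, not values from the patent): when composition of B and C times out by one period, formula (1) charges a dropped frame to each, even though the extra buffer lets them display back-to-back:

```python
import math

REFRESH_MS = 16.67  # assumed 60 Hz refresh period

def scheme_one_jank(intended_vsync_ms, vsync_ms):
    # Formula (1) applied per frame.
    return math.ceil((vsync_ms - intended_vsync_ms) / REFRESH_MS)

# B and C each begin one period later than intended because their
# composition timed out, so scheme one reports 2 dropped frames in total.
reported = scheme_one_jank(0.0, REFRESH_MS) + scheme_one_jank(REFRESH_MS, 2 * REFRESH_MS)

# Yet fig. 2 shows B and C displayed in consecutive periods, so the real
# number of dropped frames is 0.
actual = 0
```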
To avoid the above error, scheme two calculates the application's dropped frames at runtime based on the desired display time (DesiredPresentTime) and the frame ready time (FrameReadyTime) recorded by FrameTracker. DesiredPresentTime denotes the point in time when the application draws an image, and FrameReadyTime denotes the point in time when SurfaceFlinger commits the frame for display. Namely, by the following formula (2):
JankPeriod = ceil((FrameReadyTime - DesiredPresentTime)/RefreshPeriod)    (2)
where JankPeriod describes the application's frame dropping situation. There are generally two ways of interpreting it: one uses JankPeriod directly as the number of dropped frames; the other considers a frame drop to have occurred whenever the current JankPeriod differs from the previous JankPeriod.
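Scheme two and its two interpretations, sketched under the same illustrative assumptions (function names and millisecond units are mine, not the patent's):

```python
import math

def scheme_two_jank_period(desired_present_ms, frame_ready_ms, refresh_period_ms):
    """Formula (2): JankPeriod from FrameTracker's two timestamps."""
    return math.ceil((frame_ready_ms - desired_present_ms) / refresh_period_ms)

def drops_on_change(jank_periods):
    """Second interpretation: a drop is counted whenever the current
    JankPeriod differs from the previous one."""
    return sum(1 for prev, cur in zip(jank_periods, jank_periods[1:]) if cur != prev)

# A frame ready 33 ms after its desired present time at 60 Hz spans 2 periods.
jp = scheme_two_jank_period(0.0, 33.0, 16.67)

# A steady, uniform JankPeriod never changes, so the second interpretation
# reports zero drops even while frames are being lost at a constant rate.
steady = drops_on_change([2, 2, 2, 2])
```

The `steady` case illustrates why continuous, uniform frame dropping defeats the change-based interpretation.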
When the system drops frames continuously and uniformly, the second interpretation deviates greatly; and since scheme two also does not consider the whole image composition and display pipeline, result errors always occur when calculating the application's dropped frame count.
In summary, neither the frame drop evaluation scheme for the APP submission and drawing stage nor the one for the SurfaceFlinger rendering stage considers the whole image composition and display pipeline, and their results contain errors.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a method and an apparatus for detecting dropped frames.
According to a first aspect of the embodiments of the present disclosure, a method for detecting dropped frames is provided, including:
acquiring time information of application program initiated drawing in the process of synthesizing and displaying the application image;
acquiring time information actually displayed in an application image synthesis display process;
and comparing the time information for initiating drawing with the actually displayed time information to determine the frame dropping quantity.
Preferably, the step of acquiring the time information of application program initiated drawing includes:
selecting a reference frame;
searching a frame drop tracking record, and determining the scheduled synchronization time of the reference frame and the next frame of the reference frame;
and calculating the scheduled synchronous time difference value of the reference frame and the next frame of the reference frame as the time information for the application program to initiate drawing.
Preferably, the step of acquiring the time information actually displayed in the process of image synthesis display includes:
searching a frame display tracking record, and determining the actual display time of the reference frame and the next frame of the reference frame;
and calculating the actual display time difference value of the reference frame and the next frame of the reference frame as the actual display time information in the application image synthesis display process.
Preferably, the actual display time of the reference frame and the next frame of the reference frame after the correction is used in calculating the actual display time difference,
before the step of calculating the actual display time difference as the actual display time information in the application image synthesis display process, the method further includes:
and correcting the actual display time of the reference frame and the next frame of the reference frame according to the relation between the actual display time and the hardware vertical synchronization time neighborhood.
Preferably, the step of comparing the time information for initiating the drawing with the time information for actually displaying to determine the number of dropped frames includes:
calculating the planned-time dropped frame number according to the planned synchronization time difference, and calculating the actual-display dropped frame number according to the actual display time difference;
and calculating the difference between the actual drop frame number and the planned time drop frame number to serve as a final drop frame detection result.
According to a second aspect of the embodiments of the present disclosure, there is provided a dropped frame detection apparatus, including:
an initiated drawing time analysis module, configured to acquire time information of application program initiated drawing in the application image synthesis display process;
the composite display time analysis module is used for acquiring the time information actually displayed in the composite display process of the application image;
and the frame drop judging module is used for comparing the time information for initiating the drawing with the actually displayed time information and determining the number of frame drops.
Preferably, the starting drawing time analysis module includes:
a reference frame selecting unit for selecting a reference frame;
the frame drop searching unit is used for searching a frame drop tracking record and determining the scheduled synchronization time of the reference frame and the next frame of the reference frame;
and the planned synchronization time difference calculation unit is used for calculating the planned synchronization time difference value of the reference frame and the next frame of the reference frame as the time information for the application program to initiate drawing.
Preferably, the synthesized display time analysis module includes:
the frame display searching unit is used for searching a frame display tracking record and determining the actual display time of the reference frame and the next frame of the reference frame;
and the actual display time difference calculating unit is used for calculating the actual display time difference value of the reference frame and the next frame of the reference frame as the actual display time information in the application image synthesis display process.
Preferably, the actual display time difference calculation unit uses the actual display time of the reference frame and the frame next to the reference frame after the correction in calculating the actual display time difference,
the composite display time analysis module further comprises:
and the time correction unit is used for correcting the actual display time of the reference frame and the next frame of the reference frame according to the relation between the actual display time and the hardware vertical synchronization time neighborhood.
Preferably, the dropped frame determining module includes:
the frame dropping number calculating unit is used for calculating the frame dropping number of the planned time according to the planned synchronous time difference and calculating the actual frame dropping number according to the actual display time difference;
and calculating the difference between the actual drop frame number and the planned time drop frame number to serve as a final drop frame detection result.
According to a third aspect of embodiments of the present disclosure, there is provided a computer apparatus comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring time information of application program initiated drawing in the process of synthesizing and displaying the application image;
acquiring time information actually displayed in an application image synthesis display process;
and comparing the time information for initiating drawing with the actually displayed time information to determine the frame dropping quantity.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium having instructions stored thereon which, when executed by a processor of a mobile terminal, enable the mobile terminal to perform a method of dropped frame detection, the method comprising:
acquiring time information of application program initiated drawing in the process of synthesizing and displaying the application image;
acquiring time information actually displayed in an application image synthesis display process;
and comparing the time information for initiating drawing with the actually displayed time information to determine the frame dropping quantity.
The technical scheme provided by the embodiments of the disclosure can have the following beneficial effects: the number of dropped frames is determined by comparing the time information at which the application program initiates drawing with the time information of the actual display during the application image composition and display process. The data submission stage and the display stage of the application program are compared and analyzed, the whole image composition and display process is covered, the various factors that can cause frame drops are comprehensively considered, and the problem of result errors caused by detecting frame drops at only a single link is solved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a schematic diagram of an application image composition rendering process.
Fig. 2 is a schematic diagram of a Project Butter implementation.
Fig. 3 is a flow chart illustrating a method of dropped frame detection according to an example embodiment.
Fig. 4 is a timing diagram illustrating links in an image composition display process rendering flow according to an exemplary embodiment.
Fig. 5 is a flow chart illustrating a method of dropped frame detection according to an example embodiment.
FIG. 6 is a diagram illustrating a relationship between time points of a reference frame and a next frame according to an example embodiment.
Fig. 7 is a flow chart illustrating a method of dropped frame detection according to an example embodiment.
Fig. 8 is a block diagram illustrating a dropped frame detection apparatus according to an example embodiment.
Fig. 9 is a schematic structural diagram illustrating an initiated drawing time analysis module 801 according to an exemplary embodiment.
Fig. 10 is a schematic diagram illustrating a structure of a composite display time analysis module 802 according to an exemplary embodiment.
Fig. 11 is a schematic diagram illustrating a structure of a composite display time analysis module 802 according to an exemplary embodiment.
Fig. 12 is a schematic structural diagram illustrating a dropped frame determination module 803 according to an exemplary embodiment.
Fig. 13 is a block diagram illustrating an apparatus (a general structure of a mobile terminal) according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
The frame drop evaluation schemes for the APP submission and drawing stage and the Surface rendering stage do not consider the whole image synthesis and display sending process, and result errors exist.
In order to solve the above problem, embodiments of the present disclosure provide a method and an apparatus for detecting dropped frames, which take the entire image composition and display process as the object of investigation, evaluate the overall frame dropping situation, and solve the problem of result errors. Taking the image composition and display process as a whole, the influence of buffering and of different screen refresh rates on the dropped frame calculation is fully considered, so the method is suitable for accurate dropped frame calculation in various scenarios, including continuous and irregular sliding, web browsing, video playing and the like, and for the different device types and usage scenarios found in the market.
An exemplary embodiment of the present disclosure provides a method for detecting dropped frames, where a flow of dropped frame detection in an image synthesis display process using the method is shown in fig. 3, and the method includes:
step 301, acquiring time information of application program initiated drawing in the process of application image synthesis display.
In this step, during the stage in which the application program submits drawing data in the image composition and display process, JankTracker records the drawing initiation time point (i.e., the scheduled synchronization time, IntendedVsync) of each frame. By searching JankTracker, the time information at which the application program initiates drawing can be obtained. Specifically, the drawing initiation time points of a reference frame and of the frame following the reference frame are obtained, and the difference between the two time points is used as the application program's drawing-initiation time information.
And step 302, acquiring the time information actually displayed in the process of applying image synthesis display.
In this step, during the display control stage (Display Controller) of the application image composition and display process, FrameTracker records the time point at which each frame is actually displayed (i.e., the actual display time, ActualPresentTime). The actual display time information can be obtained by searching FrameTracker. Specifically, the actual display time points of the reference frame and of the frame following it are obtained, and the difference between the two time points is used as the actual display time information.
It should be noted that steps 301 and 302 are not strictly ordered in time.
Fig. 4 shows the time relationships among the links of a normal image composition and display flow. Taking buffer frame 1 (Buffer 1) as an example, the APP submits its drawing data two synchronization periods in advance, which determines the IntendedVsync of Buffer 1; two periods later, Buffer 1 is shown in the display link, which determines its ActualPresentTime.
Step 303, comparing the time information for initiating the drawing with the actually displayed time information, and determining the number of dropped frames.
The time information for initiating the drawing relates to the initial stage of the image composition display flow, and the time information for actually displaying relates to the last stage of the image composition display flow. In this step, by comparing the time information of initiating the drawing with the time information of actually displaying, it is possible to determine the occurrence of frame dropping in the whole image synthesis display flow, and accurately determine whether frame dropping occurs.
An exemplary embodiment of the present disclosure further provides a method for detecting dropped frames, where a flow of acquiring time information of drawing initiated by an application program in an application image synthesis display process by using the method is shown in fig. 5, and includes:
step 501, selecting a reference frame.
In this step, a frame drop detection interval is determined, a first frame in the interval is taken as a reference frame, and then a next frame of the reference frame is selected. And taking the reference frame and the next frame of the reference frame as analysis objects to acquire time information for initiating drawing.
Step 502, finding a drop frame tracking record, and determining the scheduled synchronization time of the reference frame and the next frame of the reference frame.
In this step, a drop frame tracking record is searched, specifically, a JankTracker is searched to determine the scheduled synchronization time of the reference frame and the next frame of the reference frame.
FIG. 6 shows the relationship between the time points of the reference frame and the next frame. IntendedVsync1 is the scheduled synchronization time of the reference frame, and IntendedVsync2 is the scheduled synchronization time of the frame following it; ActualPresentTime1 is the actual display time of the reference frame, and ActualPresentTime2 is the actual display time of the frame following it.
Step 503, calculating the plan synchronization time difference value as the time information for the application program to initiate drawing.
In this step, the planned synchronization time difference is calculated according to the following expression:
scheduled synchronization time difference = scheduled synchronization time of the next frame of the reference frame - scheduled synchronization time of the reference frame.
Specifically, it can be calculated according to the following expression:
IV=IntendedVsync2-IntendedVsync1
IV is the scheduled synchronization time difference. Using IV as the frame drop determination baseline accounts for the scenario in which the screen is not refreshed while the device is not being operated: a normal absence of screen refresh is not counted as frame loss, which broadens the scenarios to which the scheme applies.
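A sketch of why measuring against IV tolerates idle screens (the nanosecond units and the 60 Hz period are illustrative assumptions):

```python
import math

REFRESH_NS = 16_670_000  # assumed 60 Hz refresh period in nanoseconds

def planned_sync_diff(intended_vsync1_ns, intended_vsync2_ns):
    """IV = IntendedVsync2 - IntendedVsync1."""
    return intended_vsync2_ns - intended_vsync1_ns

# Idle screen: the app neither draws nor displays for 10 periods, so the
# planned interval and the actual display interval both span 10 periods.
iv = planned_sync_diff(0, 10 * REFRESH_NS)
ac = 10 * REFRESH_NS

# Comparing actual against planned (rather than against a single refresh
# period) leaves the idle gap uncounted: 0 dropped frames.
drops = math.ceil(ac / REFRESH_NS) - math.ceil(iv / REFRESH_NS)
```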
An exemplary embodiment of the present disclosure further provides a method for detecting dropped frames, where a flow of acquiring time information actually displayed in an application image synthesis display process by using the method is shown in fig. 7, and includes:
step 701, searching a frame display tracking record, and determining the actual display time of the reference frame and the next frame of the reference frame.
In this step, a frame display tracking record is searched, specifically, FrameTracker can be searched, and the actual display time of the reference frame and the next frame of the reference frame is determined.
Also taking fig. 6 as an example, ActualPresentTime1 is the actual display time of the reference frame, and ActualPresentTime2 is the actual display time of the next frame of the reference frame.
Step 702, calculating the actual display time difference as the actual display time information in the application image synthesis display process.
In this step, the actual display time difference is calculated according to the following expression:
actual display time difference = actual display time of the next frame of the reference frame - actual display time of the reference frame.
Specifically, the actual display time difference is calculated according to the following expression:
AC=ActualPresentTime2-ActualPresentTime1
AC is the actual display time difference.
An exemplary embodiment of the present disclosure further provides a method for detecting dropped frames. Since an actual display time is not necessarily aligned exactly with the hardware vertical synchronization period (i.e., ActualPresentTime does not necessarily coincide with a hardware Vsync period), before calculating the actual display time difference, the actual display times of the reference frame and of the frame following it may be corrected according to their relationship to the neighborhood of the hardware vertical synchronization time, and the corrected actual display times of the reference frame and of the frame following it are used when calculating the actual display time difference.
The correction process is as follows:
correcting the actual display time of the reference frame and the next frame of the reference frame according to the following expression:
corrected actual display time = hardware vertical synchronization time, if the actual display time ∈ U(hardware vertical synchronization time, Δ),
where U(·, Δ) denotes the Δ-neighborhood of the hardware vertical synchronization time (e.g., HWVsync), with Δ being, for example, 1 ms.
The hardware vertical synchronization time is a vertical synchronization (e.g., Vsync) time point of the hardware.
After the correction is completed, the corrected actual display time may be used to calculate the actual display time difference, which may be calculated according to the following expression:
actual display time difference = corrected actual display time of the next frame of the reference frame - corrected actual display time of the reference frame.
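A minimal sketch of the correction step, assuming a 1 ms neighborhood; the function name and the list of HWVsync timestamps are illustrative, not from the patent:

```python
def correct_present_time(actual_present_ms, hw_vsync_times_ms, delta_ms=1.0):
    """Snap an ActualPresentTime to a hardware Vsync timestamp when it
    falls within the delta-neighborhood of that Vsync; otherwise leave
    it unchanged."""
    for hw_vsync in hw_vsync_times_ms:
        if abs(actual_present_ms - hw_vsync) <= delta_ms:
            return hw_vsync
    return actual_present_ms

hw_vsyncs = [0.0, 16.67, 33.33]  # assumed HWVsync timeline at 60 Hz

snapped = correct_present_time(16.9, hw_vsyncs)  # within 1 ms of 16.67
kept = correct_present_time(25.0, hw_vsyncs)     # not near any Vsync
```

Snapping removes sub-millisecond jitter so that the subsequent difference is an exact multiple of the refresh period.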
An exemplary embodiment of the present disclosure further provides a frame drop detection method, which calculates the planned-time dropped frame number according to the scheduled synchronization time difference and the actual-display dropped frame number according to the actual display time difference, and then calculates the difference between the two as the final dropped frame detection result. The specific process is as follows:
calculating the frame dropping number according to the plan synchronization time difference and the actual display time difference by the following expression:
frame dropping number = ceil(actual display time difference/RefreshPeriod) - ceil(scheduled synchronization time difference/RefreshPeriod)
wherein ceil () is a ceiling function.
The actual display time difference covers the drawing, composition, and display links of the whole buffered frame, so the entire image composition and display process is taken into account.
Specifically, the frame dropping number may be calculated by the following expression:
frame dropping number = ceil(AC/RefreshPeriod) - ceil(IV/RefreshPeriod)
where IV is the scheduled synchronization time difference, AC is the actual display time difference, ceil() is the rounding-up function, and RefreshPeriod is the screen refresh period.
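The final calculation described in the text can be sketched as follows (names and nanosecond units are illustrative assumptions):

```python
import math

REFRESH_NS = 16_670_000  # assumed 60 Hz refresh period in nanoseconds

def dropped_frames(iv_ns, ac_ns, refresh_period_ns):
    """Final result: actual-display dropped periods minus planned-time
    dropped periods."""
    return (math.ceil(ac_ns / refresh_period_ns)
            - math.ceil(iv_ns / refresh_period_ns))

# One planned period but an actual display gap of ~3 periods: 2 drops.
two_drops = dropped_frames(REFRESH_NS, 50_000_000, REFRESH_NS)

# Planned and actual intervals agree (e.g. an idle screen): 0 drops.
no_drops = dropped_frames(2 * REFRESH_NS, 2 * REFRESH_NS, REFRESH_NS)
```

Subtracting the planned-time term is what prevents intentional pauses (no scheduled refresh) from being misread as dropped frames.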
An exemplary embodiment of the present disclosure further provides a device for detecting dropped frames, which is shown in fig. 8 and includes:
an initiated drawing time analysis module 801, configured to obtain time information for initiating drawing by an application program in an application image synthesis display process;
a composite display time analysis module 802, configured to obtain time information actually displayed in an application image composite display process;
and a dropped frame determining module 803, configured to compare the time information for initiating the drawing with the actually displayed time information, and determine the number of dropped frames.
Preferably, the structure of the initiation drawing time analysis module 801 is shown in fig. 9, and includes:
a reference frame selecting unit 901 configured to select a reference frame;
a dropped frame searching unit 902, configured to search a dropped frame tracking record, and determine scheduled synchronization time of the reference frame and a next frame of the reference frame;
and a scheduled synchronization time difference calculating unit 903, configured to calculate a scheduled synchronization time difference between the reference frame and a frame next to the reference frame as time information when the application program initiates drawing.
Preferably, the structure of the composite display time analysis module 802 is shown in fig. 10, and includes:
a frame display search unit 1001, configured to search a frame display tracking record, and determine actual display time of the reference frame and a next frame of the reference frame;
an actual display time difference calculating unit 1002, configured to calculate an actual display time difference value between the reference frame and a frame next to the reference frame as time information actually displayed in the application image synthesis display process.
Preferably, when calculating the actual display time difference, the actual display time difference calculating unit 1002 uses the corrected actual display times of the reference frame and the frame next to the reference frame.
In this case, the structure of the composite display time analysis module 802 is shown in fig. 11, and further includes:
a time correction unit 1003, configured to correct the actual display times of the reference frame and of the next frame of the reference frame according to the relationship between each actual display time and the neighborhood of the hardware vertical synchronization time.
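The behavior of a time correction unit like 1003 can be sketched as snapping an actual display time to the nearest hardware VSYNC timestamp when it lies within a small neighborhood of it; the tolerance value and data shapes below are assumptions for illustration, not values from the patent:

```python
def correct_display_time(actual_ns, hw_vsync_times_ns, tolerance_ns=1_000_000):
    """Snap an actual display time to the nearest hardware vertical
    synchronization timestamp when it falls within the tolerance
    neighborhood; otherwise leave it unchanged."""
    nearest = min(hw_vsync_times_ns, key=lambda v: abs(v - actual_ns))
    if abs(nearest - actual_ns) <= tolerance_ns:
        return nearest
    return actual_ns

vsyncs = [0, 16_666_667, 33_333_334]  # 60 Hz hardware VSYNC timestamps (ns)
print(correct_display_time(16_700_000, vsyncs))  # → 16666667 (snapped)
print(correct_display_time(25_000_000, vsyncs))  # → 25000000 (outside neighborhood)
```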
Preferably, the structure of the dropped frame determining module 803 is shown in fig. 12, and includes:
a dropped frame number calculating unit 1201, configured to calculate a planned-time dropped frame number according to the planned synchronization time difference, and to calculate an actual dropped frame number according to the actual display time difference;
and to calculate the difference between the actual dropped frame number and the planned-time dropped frame number as the final dropped frame detection result.
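Putting the units of module 803 together, the overall flow can be sketched end to end. The frame tuples of (IntendedVsync, ActualPresentTime) nanoseconds are an assumed layout, and subtracting one expected interval per frame pair is an illustrative reading of "dropped frame number", not the patent's exact formula:

```python
import math

REFRESH_PERIOD_NS = 16_666_667  # 60 Hz screen

def detect_dropped_frames(ref_frame, next_frame):
    """ref_frame and next_frame are (intended_vsync_ns, actual_present_ns)
    pairs. Returns the final detection result: actual dropped frames
    minus planned-time dropped frames."""
    iv = next_frame[0] - ref_frame[0]  # planned synchronization time difference
    ac = next_frame[1] - ref_frame[1]  # actual display time difference
    planned_drop = math.ceil(iv / REFRESH_PERIOD_NS) - 1
    actual_drop = math.ceil(ac / REFRESH_PERIOD_NS) - 1
    return actual_drop - planned_drop

ref = (0, 16_666_667)
nxt = (16_666_667, 66_666_668)  # displayed three periods after the reference frame
print(detect_dropped_frames(ref, nxt))  # → 2
```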
The apparatus can be integrated into a device having a display module, and that device then implements the corresponding functions. With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs its operation has been described in detail in the embodiment of the method, and will not be elaborated here.
An exemplary embodiment of the present disclosure also provides a computer apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring time information of application program initiated drawing in the process of synthesizing and displaying the application image;
acquiring time information actually displayed in an application image synthesis display process;
and comparing the time information for initiating drawing with the actually displayed time information to determine the frame dropping quantity.
Fig. 13 is a block diagram illustrating an apparatus 1300 for dropped frame detection according to an example embodiment. For example, apparatus 1300 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and so forth.
Referring to fig. 13, the apparatus 1300 may include one or more of the following components: a processing component 1302, a memory 1304, a power component 1306, a multimedia component 1308, an audio component 1310, an interface for input/output (I/O) 1312, a sensor component 1314, and a communications component 1316.
The processing component 1302 generally controls overall operation of the device 1300, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1302 may include one or more processors 1320 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 1302 can include one or more modules that facilitate interaction between the processing component 1302 and other components. For example, the processing component 1302 may include a multimedia module to facilitate interaction between the multimedia component 1308 and the processing component 1302.
The memory 1304 is configured to store various types of data to support operation at the device 1300. Examples of such data include instructions for any application or method operating on device 1300, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1304 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power component 1306 provides power to the various components of device 1300. The power components 1306 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the apparatus 1300.
The multimedia component 1308 includes a screen that provides an output interface between the device 1300 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1308 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 1300 is in an operational mode, such as a capture mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 1310 is configured to output and/or input audio signals. For example, the audio component 1310 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 1300 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1304 or transmitted via the communication component 1316. In some embodiments, the audio component 1310 also includes a speaker for outputting audio signals.
The I/O interface 1312 provides an interface between the processing component 1302 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1314 includes one or more sensors for providing state assessments of various aspects of the device 1300. For example, the sensor assembly 1314 may detect the open/closed state of the device 1300 and the relative positioning of components, such as the display and keypad of the apparatus 1300. The sensor assembly 1314 may also detect a change in position of the apparatus 1300 or a component of the apparatus 1300, the presence or absence of user contact with the apparatus 1300, the orientation or acceleration/deceleration of the apparatus 1300, and a change in temperature of the apparatus 1300. The sensor assembly 1314 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 1314 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1314 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1316 is configured to facilitate communications between the apparatus 1300 and other devices in a wired or wireless manner. The apparatus 1300 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1316 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 1316 also includes a Near Field Communications (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 1304 comprising instructions, executable by the processor 1320 of the apparatus 1300 to perform the method described above is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium having instructions therein which, when executed by a processor of a mobile terminal, enable the mobile terminal to perform a method of dropped frame detection, the method comprising:
acquiring time information of application program initiated drawing in the process of synthesizing and displaying the application image;
acquiring time information actually displayed in an application image synthesis display process;
and comparing the time information for initiating drawing with the actually displayed time information to determine the frame dropping quantity.
The embodiments of the disclosure provide a dropped frame detection method and device in which the number of dropped frames is determined by comparing the time information at which an application program initiates drawing during the application image synthesis display process with the time information at which frames are actually displayed. By comparing and analyzing both the data submission stage and the display stage of the application program, the entire image synthesis and display process is covered and the various factors that can cause frame dropping are considered comprehensively, which avoids the errors that arise when frame drop detection is performed on only a single link of the pipeline.
Because the frame count is described in terms of the interval between successive IntendedVsync times and the interval between successive ActualPresentTime values, the effect of buffering on the final result is naturally avoided in the iterative computation.
The technical solution provided by the embodiments of the disclosure shows good precision in actual operation; its calculation results are fully consistent with statistics obtained by filming the screen with a high-speed camera. The method is general and is applicable to application scenarios such as sliding, web page browsing, and video playback. It can also be applied to different devices: the refresh rate of current mobile phones is mainly 60 Hz, but more and more high-end devices are equipped with 90 Hz and 120 Hz screens, so adaptability to different refresh rates is also very important.
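Adapting to different refresh rates only requires recomputing RefreshPeriod from the panel's rate, as this small sketch shows (the helper name is illustrative):

```python
def refresh_period_ns(rate_hz):
    """Screen refresh period in nanoseconds for a given refresh rate."""
    return round(1_000_000_000 / rate_hz)

# Refresh periods for the common mobile panel rates mentioned above.
for hz in (60, 90, 120):
    print(hz, 'Hz ->', refresh_period_ns(hz), 'ns')
```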
The technique provided by the embodiments of the disclosure is also easy to use: statistics, calculation, and identification are fully automatic, no additional third-party equipment or manual effort is required, implementation cost is low, and it is suitable for long-duration test cases.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (12)

1. A method for detecting dropped frames, comprising:
acquiring time information of application program initiated drawing in the process of synthesizing and displaying the application image;
acquiring time information actually displayed in an application image synthesis display process;
and comparing the time information for initiating drawing with the actually displayed time information to determine the frame dropping quantity.
2. The method according to claim 1, wherein the step of acquiring the time information of drawing initiated by the application program in the application image synthesis display process comprises:
selecting a reference frame;
searching a frame drop tracking record, and determining the scheduled synchronization time of the reference frame and the next frame of the reference frame;
and calculating the scheduled synchronous time difference value of the reference frame and the next frame of the reference frame as the time information for the application program to initiate drawing.
3. The method according to claim 2, wherein the step of acquiring the time information actually displayed in the application image synthesis display process comprises:
searching a frame display tracking record, and determining the actual display time of the reference frame and the next frame of the reference frame;
and calculating the actual display time difference value of the reference frame and the next frame of the reference frame as the actual display time information in the application image synthesis display process.
4. The dropped frame detection method according to claim 3, wherein the corrected actual display times of the reference frame and the frame next to the reference frame are used in calculating the actual display time difference,
before the step of calculating the actual display time difference as the actual display time information in the application image synthesis display process, the method further includes:
and correcting the actual display time of the reference frame and the next frame of the reference frame according to the relation between the actual display time and the hardware vertical synchronization time neighborhood.
5. The method according to claim 3, wherein the step of comparing the time information of initiating the drawing with the time information of actually displaying to determine the number of dropped frames comprises:
calculating a planned-time dropped frame number according to the planned synchronization time difference, and calculating an actual dropped frame number according to the actual display time difference;
and calculating the difference between the actual drop frame number and the planned time drop frame number to serve as a final drop frame detection result.
6. A dropped frame detection apparatus, comprising:
an initiated drawing time analysis module, configured to acquire time information of drawing initiated by an application program in the application image synthesis display process;
the composite display time analysis module is used for acquiring the time information actually displayed in the composite display process of the application image;
and the frame drop judging module is used for comparing the time information for initiating the drawing with the actually displayed time information and determining the number of frame drops.
7. The apparatus of claim 6, wherein the means for initiating drawing time analysis comprises:
a reference frame selecting unit for selecting a reference frame;
the frame drop searching unit is used for searching a frame drop tracking record and determining the scheduled synchronization time of the reference frame and the next frame of the reference frame;
and the planned synchronization time difference calculation unit is used for calculating the planned synchronization time difference value of the reference frame and the next frame of the reference frame as the time information for the application program to initiate drawing.
8. The apparatus according to claim 7, wherein the composite display time analysis module comprises:
the frame display searching unit is used for searching a frame display tracking record and determining the actual display time of the reference frame and the next frame of the reference frame;
and the actual display time difference calculating unit is used for calculating the actual display time difference value of the reference frame and the next frame of the reference frame as the actual display time information in the application image synthesis display process.
9. The frame drop detection device according to claim 8, wherein the actual display time difference calculation unit uses the corrected actual display time of the reference frame and the frame next to the reference frame in calculating the actual display time difference,
the composite display time analysis module further comprises:
and the time correction unit is used for correcting the actual display time of the reference frame and the next frame of the reference frame according to the relation between the actual display time and the hardware vertical synchronization time neighborhood.
10. The apparatus according to claim 8, wherein the dropped frame determining module comprises:
a dropped frame number calculating unit, configured to calculate a planned-time dropped frame number according to the planned synchronization time difference, and to calculate an actual dropped frame number according to the actual display time difference;
and calculating the difference between the actual drop frame number and the planned time drop frame number to serve as a final drop frame detection result.
11. A computer device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring time information of application program initiated drawing in the process of synthesizing and displaying the application image;
acquiring time information actually displayed in an application image synthesis display process;
and comparing the time information for initiating drawing with the actually displayed time information to determine the frame dropping quantity.
12. A non-transitory computer readable storage medium having instructions therein which, when executed by a processor of a mobile terminal, enable the mobile terminal to perform a method of dropped frame detection, the method comprising:
acquiring time information of application program initiated drawing in the process of synthesizing and displaying the application image;
acquiring time information actually displayed in an application image synthesis display process;
and comparing the time information for initiating drawing with the actually displayed time information to determine the frame dropping quantity.
CN202010767819.6A 2020-08-03 2020-08-03 Dropped frame detection method and device Active CN111880602B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010767819.6A CN111880602B (en) 2020-08-03 2020-08-03 Dropped frame detection method and device

Publications (2)

Publication Number Publication Date
CN111880602A true CN111880602A (en) 2020-11-03
CN111880602B CN111880602B (en) 2022-03-01

Family

ID=73205364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010767819.6A Active CN111880602B (en) 2020-08-03 2020-08-03 Dropped frame detection method and device

Country Status (1)

Country Link
CN (1) CN111880602B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112767524A (en) * 2021-01-26 2021-05-07 北京小米移动软件有限公司 Image display method and device, electronic device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101217339A (en) * 2007-12-29 2008-07-09 华为技术有限公司 A method, device and base station for frame dropping detection
US20090122879A1 (en) * 2006-05-05 2009-05-14 Mariner Partners, Inc. Transient video anomaly analysis and reporting system
US20090185072A1 (en) * 2008-01-18 2009-07-23 Kabushiki Kaisha Toshiba Information processing apparatus
JP2011019795A (en) * 2009-07-17 2011-02-03 Daito Giken:Kk Game machine
CN109522208A (en) * 2018-09-29 2019-03-26 中国平安人寿保险股份有限公司 Page fluency test method and device, computer installation and storage medium
CN109753262A (en) * 2019-01-04 2019-05-14 Oppo广东移动通信有限公司 frame display processing method, device, terminal device and storage medium
CN110806909A (en) * 2019-11-01 2020-02-18 北京金山安全软件有限公司 Method and device for determining page frame dropping information of application program and electronic equipment


Also Published As

Publication number Publication date
CN111880602B (en) 2022-03-01

Similar Documents

Publication Publication Date Title
CN107193678B (en) Method and device for determining cause of stuck and storage medium
EP3163887A1 (en) Method and apparatus for performing media synchronization
US20170064245A1 (en) Method, device, terminal device, and storage medium for video effect processing
CN107315792B (en) Page updating method and device, electronic equipment and computer readable storage medium
CN112114765A (en) Screen projection method and device and storage medium
US20180341501A1 (en) Method and device for distributing application
EP2988205A1 (en) Method and device for transmitting image
CN106775235B (en) Screen wallpaper display method and device
CN108769769B (en) Video playing method and device and computer readable storage medium
US20220256230A1 (en) Method and apparatus for video playing
CN106792024B (en) Multimedia information sharing method and device
CN111246278B (en) Video playing method and device, electronic equipment and storage medium
CN115719586A (en) Screen refresh rate adjusting method and device, electronic equipment and storage medium
CN111880602B (en) Dropped frame detection method and device
CN112333518B (en) Function configuration method and device for video and electronic equipment
CN112910592A (en) Clock synchronization method and device, terminal and storage medium
CN106354464B (en) Information display method and device
CN112333384B (en) Image preview method, image preview device and storage medium
CN110312117B (en) Data refreshing method and device
CN114554271A (en) Information pushing and displaying method and device, electronic equipment and storage medium
CN113660513A (en) Method, device and storage medium for synchronizing playing time
CN112866612A (en) Frame insertion method, device, terminal and computer readable storage medium
CN111356001A (en) Video display area acquisition method and video picture display method and device
CN106604088B (en) Method, device and equipment for processing data in buffer area
WO2023216146A1 (en) Display image updating method and apparatus and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant