CN116048933B - Fluency detection method - Google Patents


Info

Publication number
CN116048933B
Authority
CN
China
Prior art keywords
frame
layer
electronic device
instruction
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210950178.7A
Other languages
Chinese (zh)
Other versions
CN116048933A (en)
Inventor
孙丽娜
韦思臣
韩风
朱勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210950178.7A priority Critical patent/CN116048933B/en
Publication of CN116048933A publication Critical patent/CN116048933A/en
Application granted granted Critical
Publication of CN116048933B publication Critical patent/CN116048933B/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3409Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides a fluency detection method. When refreshing frames, the electronic device 100 may acquire and record the drawing time of each frame image, or of a layer in the image, and determine information such as the frame interval between two adjacent frame images, the accumulated frame loss duration, and the refresh completion rate. The electronic device 100 may then determine whether frames were dropped during the frame refresh process, and thereby whether the screen display was smooth. If the screen display is not smooth, the electronic device can save the field data of the process and report it to the cloud, so that developers can later analyze the cause of the stutter and improve the smoothness of the screen display.

Description

Fluency detection method
Technical Field
The application relates to the field of terminals, in particular to a fluency detection method.
Background
In scenarios such as sliding, playing animations, or playing video on mobile phones, tablet computers, and other terminal devices, the image content displayed on the screen usually needs to be refreshed continuously to provide a smooth display effect. When frames are dropped during continuous screen refreshing, the user can clearly perceive stutter or jitter, that is, the screen display is not smooth, which affects the user experience.
Disclosure of Invention
In a first aspect, the present application provides a smoothness detection method applied to an electronic device having a screen, the method comprising: after the first instruction is acquired, starting to acquire the drawing time of the image displayed on the screen; after the second instruction is acquired, stopping acquiring the drawing time of the image displayed by the screen; determining first information according to drawing time of an image acquired between a first instruction and a second instruction; the first information is used for reflecting the fluency of the screen display image between the first instruction and the second instruction; when the first information indicates that the screen display image is not smooth, saving the field data of the screen display image between the first instruction and the second instruction; the field data includes first information.
By implementing the method provided by the first aspect, an application installed on the electronic device can call the method provided by the present application at any time to detect the smoothness of the screen display. According to the start detection instruction and the end detection instruction issued by the application, the electronic device can acquire the drawing time of the images displayed on the screen between the two instructions. Then, according to the drawing time of each displayed frame, the electronic device can determine whether the screen display was smooth between the two instructions. The electronic device can save the field information of a detection process in which the display was not smooth, for later analysis and use by developers.
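For illustration only, the following Java sketch shows how an application might wrap the start and end detection instructions around an animation. The names FluencyDetectorApi, startDetection, and stopDetection are assumptions for this sketch, not the actual interface defined by the present application.

```java
// Hypothetical wrapper around the fluency detection interface described above.
// FluencyDetectorApi, startDetection() and stopDetection() are assumed names.
public final class ScrollSmoothnessProbe {

    public interface FluencyDetectorApi {
        void startDetection(String sceneDescription); // first instruction
        void stopDetection();                          // second instruction
    }

    private final FluencyDetectorApi api;

    public ScrollSmoothnessProbe(FluencyDetectorApi api) {
        this.api = api;
    }

    // Called when the animation or fling begins.
    public void onAnimationStart() {
        api.startDetection("desktop page switch");
    }

    // Called when the animation or fling ends.
    public void onAnimationEnd() {
        api.stopDetection();
    }
}
```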
In combination with the method provided in the first aspect, in some embodiments, the first information includes a frame interval, where the frame interval is the time interval between two frames of images displayed on the screen in succession; when a frame interval is longer than the frame interval threshold, the first information indicates that the screen display image is not smooth.
By implementing the method provided by this embodiment, the electronic device can determine the frame interval between two successive frames of images according to the acquired drawing time of each frame image. When one or more frame intervals are longer than the frame interval threshold, the first information indicates that the screen display image is not smooth. At this time, the electronic device may determine, according to the first information, that the screen display is not smooth.
In combination with the method provided in the first aspect, in some embodiments, a system refresh rate is preset in the electronic device, the system refresh rate corresponds to a system frame interval, and the frame interval threshold is greater than or equal to the system frame interval.
By implementing the method provided by this embodiment, the electronic device can set the system frame interval corresponding to the system refresh rate as the frame interval threshold. When a frame interval determined during detection is larger than the system frame interval, the first information indicates that the screen display image is not smooth. In some scenarios, an actual refresh rate slightly lower than the system refresh rate may also be acceptable to the user. At this time, the frame interval threshold set by the electronic device may be slightly higher than the system frame interval.
In combination with the method provided in the first aspect, in some embodiments, a tolerance M is further set in the electronic device, where M indicates the number of times a frame interval is allowed to be longer than the frame interval threshold; when a frame interval is longer than the frame interval threshold, the first information indicates that the screen display image is not smooth, specifically: for the plurality of frame intervals corresponding to the images displayed on the screen between the first instruction and the second instruction, when N frame intervals are longer than the frame interval threshold and N is larger than M, the first information indicates that the screen display image is not smooth, where N and M are positive integers.
In some scenarios, when detecting smoothness, the electronic device may occasionally find that a frame interval exceeds the frame interval threshold once or twice, and such occasional dropped frames have essentially no effect on the user viewing the screen. By implementing the method provided by the above embodiment, the electronic device may ignore these occasional dropped frames. Only after the number of dropped frames exceeds the preset tolerance does the electronic device determine that the screen display was not smooth during the detection. In this way, the electronic device can more accurately locate the scenes in which an unsmooth screen display actually affects the user experience.
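A minimal sketch of this tolerance-based judgment follows, assuming frame intervals measured in milliseconds; the class and method names are illustrative assumptions.

```java
// Minimal sketch of the tolerance-based smoothness judgment described above.
// Names and the millisecond units are assumptions for illustration.
public final class ToleranceJudge {

    /**
     * Returns true (not smooth) only when the number of frame intervals that
     * exceed the threshold (N) is greater than the tolerance M.
     */
    public static boolean isNotSmooth(double[] frameIntervalsMs,
                                      double thresholdMs,
                                      int toleranceM) {
        int exceedCount = 0; // N in the description above
        for (double interval : frameIntervalsMs) {
            if (interval > thresholdMs) {
                exceedCount++;
            }
        }
        return exceedCount > toleranceM;
    }

    public static void main(String[] args) {
        double[] intervals = {16.6, 16.6, 19.0, 16.6, 19.2, 16.6};
        // Two exceedances with a tolerance of 3: still considered smooth.
        System.out.println(isNotSmooth(intervals, 17.0, 3)); // false
    }
}
```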
In some embodiments, the first information further includes an average frame interval; when at least one frame interval is longer than the frame interval threshold but the average frame interval is less than the frame interval threshold, the first information indicates that the screen display image is smooth.
By implementing the method provided by this embodiment, the electronic device can also use the average frame interval to ignore occasional dropped frames, so as to more accurately locate the scenes in which an unsmooth screen display affects the user experience.
In some embodiments, in combination with the method provided in the first aspect, before saving the field data of the screen display image between the first instruction and the second instruction, the method further includes: acquiring the processor load between the first instruction and the second instruction; when the first information indicates that the screen display image is not smooth, saving the field data of the screen display image between the first instruction and the second instruction, specifically: when the first information indicates that the screen display image is not smooth and the processor load is lower than the overload threshold, saving the field data of the screen display image between the first instruction and the second instruction.
When the processor load of the electronic device is too high, the electronic device is prone to dropping frames because the processor cannot compute and render the images in time. The cause of the dropped frames and the subsequent optimization direction in this scenario are relatively clear. Therefore, after determining that frames were dropped based on the frame intervals, the electronic device may further decide whether to save the field data by checking the processor load during the detection. The electronic device saves the field data of the detection only when the processor load is below the overload threshold (that is, the frames were not dropped because of processor overload).
In combination with the method provided in the first aspect, in some embodiments, the field data further comprises one or more of: the system refresh rate, the images displayed on the screen, and the drawing time of the images displayed on the screen.
With reference to the method provided in the first aspect, in some embodiments, the method further includes: and sending the field data to a cloud for storage.
By implementing the method provided by this embodiment, the electronic device can send the field data of scenes in which the screen display was not smooth to the cloud for storage. This avoids occupying the storage and computing resources of the electronic device, and allows the field data of multiple electronic devices to be aggregated, which facilitates analysis and processing by developers.
With reference to the method provided in the first aspect, in some embodiments, a frame of image displayed on the screen is composed of one or more layers; acquiring the drawing time of the image displayed on the screen comprises: acquiring the drawing time of a first layer, where the first layer is a layer whose image content changes during the detection.
By implementing the method provided by this embodiment, the electronic device can determine the drawing time of each layer in a frame of image. Further, the electronic device can determine, at the layer level, the detailed process of drawing a frame of image displayed on the screen, and further find out whether the screen display stuttered. The electronic device may also locate the layer that caused the stutter, if any.
In combination with the method provided in the first aspect, in some embodiments, before the drawing time of the first layer is acquired, the method further includes: the first layer is determined by the window focus.
In a second aspect, the present application provides an electronic device comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories being operable to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method as described in the first aspect and any possible implementation of the first aspect.
In a third aspect, embodiments of the present application provide a chip system for application to an electronic device, the chip system comprising one or more processors for invoking computer instructions to cause the electronic device to perform a method as described in the first aspect and any possible implementation of the first aspect.
In a fourth aspect, the application provides a computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform a method as described in the first aspect and any possible implementation of the first aspect.
In a fifth aspect, the application provides a computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform a method as described in the first aspect and any possible implementation of the first aspect.
It will be appreciated that the electronic device provided in the second aspect, the chip system provided in the third aspect, the computer storage medium provided in the fourth aspect, and the computer program product provided in the fifth aspect are all configured to perform the method provided by the present application. Therefore, for the advantages that can be achieved, reference may be made to the advantages of the corresponding method, which are not described again here.
Drawings
Fig. 1 is a schematic software structure of an electronic device 100 according to an embodiment of the present application;
fig. 2 is a schematic software structure of another electronic device 100 according to an embodiment of the present application;
FIG. 3A is a desktop image of a frame of a screen display of an electronic device 100 provided by an embodiment of the present application;
FIG. 3B is a schematic diagram of a layer structure of a desktop image according to an embodiment of the present application;
Fig. 4 is a timing chart of a smoothness detection method according to an embodiment of the present application;
FIG. 5A is a diagram of a user interface of a sliding switch desktop according to an embodiment of the present application;
fig. 5B is a two-frame image displayed on a screen in a process of sliding and switching a desktop according to an embodiment of the present application;
FIG. 5C is a diagram illustrating a layer structure of another desktop image according to an embodiment of the present application;
fig. 6 is a schematic diagram of determining detection information by drawing information according to an embodiment of the present application;
FIG. 7 is a timing diagram of overScroller for smoothness detection according to an embodiment of the present application;
fig. 8 exemplarily shows a hardware configuration diagram of the electronic device 100.
Detailed Description
The terminology used in the following embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
In scenarios such as sliding, playing animations, or playing video on electronic devices such as mobile phones and tablet computers, the image content displayed on the screen usually needs to be refreshed continuously (frame refreshing) to provide a smooth display effect. When frames are dropped during frame refreshing, the user can clearly perceive stutter and jitter, that is, the screen display is not smooth, which affects the user experience. Dropping a frame means that the specified image is not displayed when the corresponding display signal arrives.
However, dropped frames in a single scene are generally sporadic and do not provide effective information for development and analysis. Therefore, in order to analyze the cause of dropped frames, reduce dropped frames, and improve display smoothness during screen refreshing, the electronic device needs to determine the scenes in which frames are dropped, that is, the frame refresh scenes in which stutter occurs, and save the field data of those scenes.
The application provides a fluency detection method. The method can be applied to electronic devices with display capability, such as mobile phones and tablet computers. In the following, such electronic devices are referred to as the electronic device 100.
By implementing the smoothness detection method provided by the embodiment of the application, the electronic device 100 can acquire and record the drawing time of each frame of image during frame refreshing, and determine the interval between the drawing times of two adjacent frame images, also called the frame interval. If a frame interval during the frame refresh process exceeds the frame interval threshold, the electronic device 100 may determine that frames were dropped during that frame refresh, and further determine that the screen display stuttered (was not smooth).
A scene in which whether the screen display is smooth is being detected may be referred to as a detection scene. A detection scene in which the screen display is determined to stutter may be referred to as a problem scene. The field data of a problem scene is the data required for subsequent development and analysis. The field data includes, but is not limited to: the image frames displayed on the screen, the drawing time of the image frames, the frame intervals, and the system refresh rate of the electronic device 100. After determining a problem scene, the electronic device 100 may save the field data of the problem scene.
After acquiring a large amount of field data of the electronic device 100 in problem scenes, developers can analyze the cause of the dropped frames, that is, the cause of the screen display stutter, so as to optimize the screen refresh process, reduce dropped frames and stutter, and improve the smoothness of the screen display.
The electronic device 100 is not limited to a mobile phone or a tablet computer; it may also be a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular telephone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device. The specific type of the electronic device is not particularly limited in the embodiments of the present application.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated. The smoothness detection method provided by the embodiment of the application can be also used for software systems of other layered architectures, which are not limited to the Android system.
Fig. 1 is a schematic software structure of an electronic device 100 according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with distinct roles and responsibilities. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages. As shown in fig. 1, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
An application framework layer (FWK) provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 1, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as notification manager is used to inform that the download is complete, message alerts, etc. The notification manager may also be a notification in the form of a chart or scroll bar text that appears on the system top status bar, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in a status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks, etc.
Android run time includes a core library and virtual machines. Android runtime is responsible for scheduling and management of the android system.
The core library consists of two parts: one part is a function which needs to be called by java language, and the other part is a core library of android. The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
Fig. 2 is a schematic software structure of another electronic device 100 according to an embodiment of the present application. Fig. 2 is a schematic diagram of a software structure for implementing the smoothness detection method provided by the application based on fig. 1.
As shown in FIG. 2, the software architecture diagram illustrates the application framework layer, the system libraries, and the kernel layer. The application framework layer, the system library and the kernel layer correspond to the application framework layer, the system library and the kernel layer in fig. 1, and are not described herein.
In fig. 2, the application framework layer may include a fluency detection interface (FluencyDetectorAPI) and a synchronization interface. The fluency detection interface can be used by an application or client in the application layer to call the fluency detection method provided by the embodiment of the application and implement the fluency detection function. The synchronization interface may be used to send field data of a problem scene to the cloud. For example, the synchronization interface may be the hiviewtunnel interface described below.
The system library may include a display module, a fluency detection module, and a synchronization module. The display module may be used to draw and display images. The fluency detection module can be used for detecting whether the problem of frame dropping occurs in the image display process. The synchronization module can be used for reporting field data in a problem scene.
Specifically, the display module may include a receiving module (SurfaceFlingerEx), a Layer management module (Layer), and a compositing module (SurfaceFlinger). The receiving module can be used for receiving and transmitting the fluency detection instruction. The smoothness detection instruction includes an instruction to start detection and an instruction to end detection. A Layer management module (Layer) may manage the window layers. In the embodiment of the application, the Layer can be used for determining the Layer with the image content changed in the screen refreshing process.
At some point, one frame of image displayed in the screen of the electronic device 100 may be composed of a plurality of image layers.
For example, fig. 3A illustrates a frame of a desktop (home screen) image of the electronic device 100. The desktop includes a status bar 201 and an application icon tray 202. A plurality of application icons are presented in the application icon tray. The desktop shown in fig. 3A may be split into multiple layers. Fig. 3B illustrates the layer structure of the desktop. As shown in fig. 3B, the desktop image shown in fig. 3A may include 3 layers: a control layer 20, a status bar layer 21, and a background layer 22. The control layer 20 is the layer where controls such as application icons are located in the desktop. The status bar layer 21 is the layer where the status bar 201 is located. The background layer 22 is the layer where the desktop wallpaper is located. It will be appreciated that the desktop wallpaper carried by the background layer 22 is not specifically shown in fig. 3A and 3B.
The composition module may be used to compose one or more layers into an image and display it. Generally, a frame of image displayed by the electronic device in the screen-on state includes a plurality of layers; refer to the layer structure of the desktop shown in fig. 3B. The combined display of the multiple layers is achieved by the composition module. For example, the composition module combines the layers shown in fig. 3B and then displays the desktop image shown in fig. 3A on the screen.
The fluency detection module may include a detection module (FluencyDetector) and an analysis module (FluencyDetectorThread). The detection module can be used for acquiring drawing information of the layers. The drawing information includes, but is not limited to, an image identification (e.g., frame number) of the belonging image, a layer identification (e.g., layer name), a drawing time, and the like. The analysis module (FluencyDetectorThread) can count and analyze specific data in the drawing information so as to determine whether the screen display is smooth in the detection process.
The synchronization module corresponds to a synchronization interface of the application framework layer and can be used for reporting field data in a problem scene. The synchronization module can acquire the field data from the log module of the kernel layer, and send the field data to the synchronization interface of the application program framework layer, and then the field data is uploaded to the cloud end through the synchronization interface for subsequent analysis and use. The synchronization module may be hiview, corresponding to the synchronization interface hiviewtunnel used by the application framework layer.
The log module may be used to record field data in a problem scenario. When the analysis module determines that the detected frame-brushing scene has a frame dropping condition, namely that the screen display is stuck, the analysis module can instruct the log module to record the field data. At this time, the recorded field data is the field data in the problem scene. Then, the log module can send the field data to the synchronization module, and the field data is transmitted to the synchronization interface through the synchronization module, and then reported to the cloud for storage.
Fig. 4 is a timing chart of a smoothness detection method according to an embodiment of the present application.
As shown in fig. 4, the timing chart includes: a receiving module (SurfaceFlingerEx), a Layer management module (Layer), a synthesizing module (SurfaceFlinger), a detecting module (FluencyDetector) and an analyzing module (FluencyDetectorThread). The above modules may be described with reference to fig. 2, and are not described herein.
When an application or a client (such as a camera application, a video player and the like, which are hereinafter referred to as an upper layer application) in an application layer needs to monitor whether a frame is dropped in the running process of a program, the smoothness detection method provided by the embodiment of the application can be called to realize the function of detecting the smoothness of screen display.
S401, receiving an instruction for starting detection by SurfaceFlingerEx.
The upper layer application may determine when to begin and end detection. For example, the upper layer application may determine to begin detecting the smoothness of the screen refresh when switching pages or playing video. Referring to fig. 5A, when an event of sliding left to switch the desktop is detected, the upper layer application may execute the smoothness detection method and determine whether the frame refresh process corresponding to the leftward desktop switch is smooth. After determining to begin detection, the upper layer application may send an instruction to start detection to SurfaceFlingerEx.
S402. SurfaceFlingerEx sends the start detection instruction to FluencyDetector.
SurfaceFlingerEx is a system module that can be used to receive and forward fluency detection instructions. After receiving the start detection instruction issued by the upper layer application, SurfaceFlingerEx may send the start detection instruction to FluencyDetector.
S403.FluencyDetector creates a detection information table (DetectionInfo).
After receiving the instruction to start detection, FluencyDetector may first create a detection information table (DetectionInfo). The detection information table may be used to record the scene in which the detection occurs, the detection information, and the like. The detection information can be used later to determine whether the screen smoothly displayed the refreshed images during the detection.
Illustratively, the detection information table may be as shown in table 1:
TABLE 1
Scene number
Layer name
Scene description
System refresh rate
Start time
End time
Number of frames
Longest frame interval
Accumulated frame loss duration
Refresh completion rate
Scene number, layer name and scene description are used to record where the smoothness detection occurs. The scene number may be used to mark an application or client where smoothness detection occurs. The application or client may be any application installed on the electronic device 100, a client, or a functional module therein. The layer name may be used to mark the layer being detected. The scene description is optional and may be used to further describe where the smoothness detection occurs, such as a page switch scene from interface a to interface B.
The system refresh rate refers to the number of image frames that can be refreshed by the electronic device 100 in a theoretical unit of time. Taking a 60Hz refresh rate as an example, the electronic device 100 may refresh the image 60 times on the screen, i.e., brush the frame 60 times, within 1 second. The frame spacing of two adjacent frames of images displayed on a screen corresponding to the system refresh rate may be referred to as the system frame spacing. For example, a 60Hz refresh rate corresponds to a system frame spacing of 16.6 milliseconds, i.e., a time interval of 16.6 milliseconds between two frames of images displayed one after the other on the screen.
The start time is the time at which detection started. FluencyDetector may take the time at which the start detection instruction was received as the start time. The end time is the time at which detection ended. Correspondingly, FluencyDetector may take the time at which the end detection instruction was received as the end time.
The number of frames refers to the number of image frames refreshed on the screen during detection (between the start detection instruction and the end detection instruction). The longest frame interval is the frame interval with the longest duration among all frame intervals during detection. The amount by which the frame interval of two adjacent frame images exceeds the system frame interval may be referred to as the frame loss time. The accumulated frame loss duration is the sum of the frame loss times of all frame intervals during the detection.
The refresh completion rate refers to the ratio of the number of image frames that the actual screen refreshes to the number that should theoretically be refreshed. For example, a system refresh rate of 60Hz indicates that electronic device 100 should theoretically display 60 frames of images within 1 second. During the actual swipe frame detection, the electronic device 100 displays 58-frame images within 1 second. Correspondingly, the refresh completion rate of the electronic device 100 during the above detection is 58/60, i.e., 0.9666.
When the detection information table is created, the values of some of the attributes in the table can already be determined, while the rest can only be determined after the detection ends. For example, upon receiving the instruction to begin detection, FluencyDetector may record the timestamp at which the instruction was received, denoted T-BEGIN. After creating the detection information table shown in table 1, FluencyDetector may set the start time in table 1 to T-BEGIN. The end time in table 1 can only be determined after FluencyDetector receives the instruction to end the detection; at this point, the end time in table 1 is still to be determined.
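For readability, a plain-Java sketch of a detection information record with the attributes listed in Table 1 is given below. The class and field names are assumptions for illustration; the actual implementation described by this application resides in the native system library, not in Java.

```java
// Illustrative counterpart of the detection information table (Table 1).
// Field names are assumed; the patent describes the attributes, not this class.
public final class DetectionInfo {
    String sceneNumber;          // application / client where detection occurs
    String layerName;            // layer being detected
    String sceneDescription;     // optional, e.g. "page switch from A to B"
    int    systemRefreshRateHz;  // e.g. 60
    long   startTimeMs;          // T-BEGIN, set when the start instruction arrives
    long   endTimeMs;            // T-END, filled in only after the end instruction
    int    frameCount;           // frames refreshed between the two instructions
    double longestFrameIntervalMs;
    double accumulatedFrameLossMs;
    double refreshCompletionRate;

    // System frame interval implied by the refresh rate, e.g. 1000 / 60 = 16.6 ms.
    double systemFrameIntervalMs() {
        return 1000.0 / systemRefreshRateHz;
    }
}
```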
S404. SurfaceFlinger delivers the VSync signal to Layer. The instruction for delivering the VSync signal is, for example, the DoTransaction instruction.
VSync is a periodic signal. The period length of the VSync signal coincides with the system frame spacing. For example, in a 60Hz refresh rate scene, the period of the vertical synchronization signal VSync is 16.6 milliseconds, and the time for which the screen displays one frame image is also 16.6 milliseconds.
SurfaceFlinger is the module for composition and display. When a VSync signal arrives, SurfaceFlinger of the electronic device 100 may compose an image frame and send the frame image into the display buffer. The screen may acquire the image from the buffer and display it when the next VSync signal arrives.
In the embodiment of the present application, surfaceFlinger may compose and display a frame of image when each VSync signal arrives after starting detection. Meanwhile, surfaceFlinger may pass each VSync signal to a Layer as it arrives.
S405. Layer determines the layers in which the image content changed.
Each time a VSync signal is received, Layer can determine the layers that changed in the newly composed image. The newly composed image is the image that SurfaceFlinger composed in that VSync signal period.
FIG. 5B illustrates two frames of images composed by SurfaceFlinger during the sliding desktop switch illustrated in FIG. 5A: F1 and F2. F1 and F2 each correspond to one VSync signal period. The VSync signal period in which SurfaceFlinger composes F1 is denoted VSync1, and the VSync signal period in which it composes F2 is denoted VSync2.
Layer may traverse the individual layers of the image frame composed by SurfaceFlinger and determine which layers have changed in image content and which have not. A layer in which the image content changes may be referred to as a change layer. For example, after receiving VSync2, Layer may obtain the layers of F2 composed by SurfaceFlinger, as shown in fig. 5C. Compared with the corresponding layers of F1 (refer to fig. 3B), Layer can determine that the control layer 20 has changed.
S406. Layer sends the drawing information of the change layer to FluencyDetector.
After determining the change layer, Layer may send the drawing information of the change layer to FluencyDetector. The drawing information includes, but is not limited to, the layer name, the drawing time, the VSync period, and the like. The layer name identifies the change layer, for example the control layer 20 of F2 above. When SurfaceFlinger redraws a layer, Layer can determine that the layer is a change layer. The time at which Layer determines that the layer changed is the drawing time of that layer.
Each time a frame of image is composed, Layer can determine the change layer in that frame and send the drawing information of the change layer to FluencyDetector. Taking F2 as an example again, after determining that the control layer 20 is the change layer in F2, Layer may send the drawing information of the control layer 20 in that VSync period to FluencyDetector. The drawing information of the control layer 20 is, for example: control layer 20, T2, VSync2, where T2 is the time at which Layer determined that the control layer 20 changed.
It will be appreciated that multiple layers may change during a screen refresh. In this case, Layer determines multiple change layers. Accordingly, Layer may send the drawing information of the multiple change layers to FluencyDetector. Layer may send the drawing information of all the layers at once, or may send the drawing information of the layers one by one.
Again taking the desktop switching scene shown in fig. 5A as an example, when the desktop wallpaper is a live wallpaper, the image of the corresponding background layer 22 also needs to be redrawn in F2 in addition to the control layer 20. In this case, the change layers determined by Layer further include the background layer 22, and Layer may also send the drawing information of the background layer 22 to FluencyDetector.
S407. FluencyDetector records the drawing information.
Each time drawing information is received, FluencyDetector may record it. Taking the 2 frames of images shown in fig. 5B as an example, FluencyDetector may receive the drawing information of the 2 frames as shown in table 2 below:
TABLE 2
Frame number | Layer name | Drawing time | VSync period
F1 | Control layer 20 | T1 | VSync1
F2 | Control layer 20 | T2 | VSync2
…… | …… | …… | ……
Wherein T1 and T2 respectively represent the time when the Layer determines that the control Layer 20 in F1 and F2 has changed, i.e. redrawing has occurred.
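As a readability aid only, one drawing-information entry in Tables 2 and 3 could be modeled roughly as the following Java record. The field names and types are assumptions; the modules described here are native system components rather than Java classes.

```java
// Illustrative model of one drawing-information entry as recorded in Table 2 / Table 3.
public record DrawingInfo(
        String frameId,      // e.g. "F2"
        String layerName,    // e.g. "control layer 20"
        long drawTimeMs,     // drawing time, e.g. T2
        int vsyncPeriodId    // e.g. 2 for VSync2
) { }
```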
When multiple layers are updated simultaneously, for example, the control layer 20 and the background layer 22 are updated simultaneously, the drawing information of the 2 frames of images received by FluencyDetector may be as shown in table 3:
TABLE 3
Frame number | Layer name | Drawing time | VSync period
F1 | Control layer 20 | T1 | VSync1
F1 | Background layer 22 | T3 | VSync1
F2 | Control layer 20 | T2 | VSync2
F2 | Background layer 22 | T4 | VSync2
Wherein T3 and T4 respectively represent the time when the Layer determines the redrawing of the background Layer 22 in F1 and F2.
S408, surfaceFlingerEx receives an instruction for ending detection.
Referring to the description of S401, the upper layer application may determine when to start and end detection. Thus, the upper layer application may determine to end the detection at any time, at which point it sends an instruction to end detection to SurfaceFlingerEx.
S409. SurfaceFlingerEx sends the end detection instruction to FluencyDetector. SurfaceFlingerEx sends this instruction after receiving the end detection instruction issued by the upper layer application.
S410. FluencyDetector wakes up FluencyDetectorThread. Upon receiving the instruction to end detection, FluencyDetector may send an indication to FluencyDetectorThread to wake it up. After receiving the end detection instruction, FluencyDetector no longer accepts new drawing information.
S411. FluencyDetectorThread determines the detection information based on the recorded drawing information and writes the detection information into the detection information table.
After being woken up, FluencyDetectorThread may obtain and analyze the drawing information recorded by FluencyDetector. During this analysis, FluencyDetectorThread may determine the frame intervals between the image frames in the detection process, so as to determine whether frames were dropped and whether the screen display was smooth.
Fig. 6 is a schematic diagram of determining detection information by drawing information according to an embodiment of the present application.
As shown in FIG. 6, F1-F7 represent 7 consecutive frame images in which the control layer 20 was updated during detection. Referring to S407, FluencyDetector may record the drawing information of the 7 frame images, including the layer name, drawing time, VSync period, and the like of each frame.
Based on the drawing times in the drawing information, FluencyDetectorThread may determine the frame interval between each pair of adjacent frames among the 7 frames during the refresh process. FluencyDetectorThread may obtain the preset frame interval threshold of the electronic device 100. When the frame interval between two adjacent frame images is greater than the frame interval threshold, FluencyDetectorThread may determine that a frame was dropped between those two frames. Further, FluencyDetectorThread may mark the detection scene as a problem scene.
The frame interval threshold is preset. The electronic device 100 may set frame interval thresholds of different sizes according to factors such as the refresh rate it supports, the user experience, and the power policy. Preferably, the frame interval threshold is equal to the system frame interval indicated by the system refresh rate. Alternatively, in scenarios where the user's requirement for refresh smoothness is relatively low, the frame interval threshold may also be slightly greater than the system frame interval.
For example, FluencyDetectorThread may determine that the drawing times of the 7 frame images are T1-T7. From these drawing times, FluencyDetectorThread can determine the frame intervals between adjacent frames among the 7 frames: S1-S6. Suppose the preset frame interval threshold is 17 milliseconds, S1=S2=S3=S4=S6=16.6 milliseconds, and S5=19 milliseconds. FluencyDetectorThread may then determine that a frame was dropped when the electronic device 100 displayed F6, because the frame interval S5 between F5 and F6 exceeded the 17-millisecond frame interval threshold.
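The frame-interval check described here can be illustrated with a small, self-contained Java sketch. The method names, the millisecond units, and the concrete drawing-time values (chosen to reproduce S1-S6 above) are assumptions for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the frame-interval statistic described for S411.
public final class FrameIntervalAnalysis {

    /** Returns the indices (counting from the second frame) of frames whose
     *  interval to the previous frame exceeds the threshold. */
    public static List<Integer> findDroppedFrames(double[] drawTimesMs,
                                                  double thresholdMs) {
        List<Integer> dropped = new ArrayList<>();
        for (int i = 1; i < drawTimesMs.length; i++) {
            double interval = drawTimesMs[i] - drawTimesMs[i - 1]; // S_i
            if (interval > thresholdMs) {
                dropped.add(i);
            }
        }
        return dropped;
    }

    public static void main(String[] args) {
        // T1..T7 chosen so that S1..S4 = S6 = 16.6 ms and S5 = 19 ms.
        double[] t = {0, 16.6, 33.2, 49.8, 66.4, 85.4, 102.0};
        System.out.println(findDroppedFrames(t, 17.0)); // [5] -> drop at F6
    }
}
```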
Optionally, FluencyDetectorThread may determine that the screen display stuttered after finding that at least one frame interval during the detection was longer than the frame interval threshold (that is, one dropped frame occurred). FluencyDetectorThread may then mark the detection scene as a problem scene.
Optionally, a tolerance may also be set in the electronic device 100. The tolerance indicates the number of times a frame interval is allowed to be longer than the frame interval threshold. For example, with a tolerance of 3, if a frame interval is longer than the frame interval threshold only once in the entire detection, FluencyDetectorThread may determine that the screen display was smooth; the detection scene is not a problem scene. If frame intervals are longer than the frame interval threshold 4 times during the detection, FluencyDetectorThread can determine that the screen display stuttered, and the detection scene may be marked as a problem scene.
In some embodiments, FluencyDetectorThread may also combine the average frame interval, the accumulated frame loss duration, the refresh completion rate, and other statistics to determine whether frames were dropped and whether the screen display was smooth.
For example, if the longest frame interval is longer than the frame interval threshold, or several frame intervals are longer than the frame interval threshold, but the average frame interval is below the frame interval threshold, FluencyDetectorThread may determine that frames were dropped during the detection but that the screen display was nevertheless smooth. Similarly, if the longest frame interval (or several frame intervals) is longer than the frame interval threshold but the accumulated frame loss duration does not reach a preset value, FluencyDetectorThread may also determine that the screen display was smooth. Likewise, if the longest frame interval (or several frame intervals) is longer than the frame interval threshold but the refresh completion rate satisfies a preset value, FluencyDetectorThread may also determine that the screen display was smooth.
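A minimal sketch of how these aggregate statistics could be computed from the frame intervals follows, assuming millisecond units and the 16.6 ms system frame interval used in the examples of this section; the names and structure are illustrative only.

```java
// Sketch of the aggregate statistics mentioned above: average frame interval,
// longest frame interval, accumulated frame loss duration, refresh completion rate.
public final class AggregateStats {

    public static void main(String[] args) {
        double systemIntervalMs = 16.6;                              // system frame interval at 60 Hz
        double[] intervalsMs = {16.6, 16.6, 16.6, 16.6, 19.0, 16.6}; // S1..S6 from Fig. 6

        double sum = 0, lossMs = 0, longestMs = 0;
        for (double s : intervalsMs) {
            sum += s;
            longestMs = Math.max(longestMs, s);
            if (s > systemIntervalMs) {
                lossMs += s - systemIntervalMs;                      // frame loss time of this interval
            }
        }
        double averageMs = sum / intervalsMs.length;

        // Refresh completion rate as defined above: frames actually refreshed divided
        // by the frames that should theoretically be refreshed in the same period,
        // e.g. 59 frames displayed in 1 second at 60 Hz.
        double completionRate = 59.0 / 60.0;

        System.out.printf("average=%.2f ms, longest=%.1f ms, accumulated loss=%.1f ms, rate=%.4f%n",
                averageMs, longestMs, lossMs, completionRate);
    }
}
```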
In some embodiments, FluencyDetectorThread may also determine from the VSync periods in the drawing information whether frames were dropped during the detection. SurfaceFlinger should compose and display one frame of image in each VSync period. This means that the VSync period identifiers of two adjacently displayed frames should be consecutive. If the VSync period identifiers of two adjacent frames in the drawing information recorded by FluencyDetector are not consecutive, the electronic device 100 may determine that a frame was dropped during the detection.
For example, in theory, in fig. 6, each frame corresponds to one VSync period, and then the VSync periods of F1 to F7 are VSync1 to VSync7, respectively. If, during the detection process, the VSync period in which F5 is located is VSync5 and the VSync period in which F6 is located is VSync7, then FluencyDetectorThread may also determine that a frame drop occurs between F5 and F6.
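A minimal sketch of this VSync-continuity check follows, assuming the VSync periods are recorded as integer identifiers; it is an illustration, not the actual implementation.

```java
// If the VSync period identifiers of two consecutively recorded frames are not
// consecutive integers, a frame was dropped in between.
public final class VsyncContinuityCheck {

    public static boolean hasDroppedFrame(int[] vsyncIds) {
        for (int i = 1; i < vsyncIds.length; i++) {
            if (vsyncIds[i] - vsyncIds[i - 1] != 1) {
                return true; // e.g. F5 in VSync5 followed by F6 in VSync7
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(hasDroppedFrame(new int[]{1, 2, 3, 4, 5, 7})); // true
    }
}
```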
In some embodiments, after determining that the screen display stuttered, FluencyDetectorThread may further obtain the processor load. According to the processor load, FluencyDetectorThread may determine whether the detection scene in which the stutter occurred is a problem scene. Optionally, the processor load may include a central processing unit (CPU) load and/or a graphics processing unit (GPU) load.
Screen display stutter is understandable when the processor load is too high; in that case, the stutter can be addressed by improving the processing capability of the processor or optimizing the processing strategy. If, however, the processor is not overloaded but the screen display still stutters, the cause of the stutter is unknown, and developers need to further study and analyze the field data of the scene to determine it. Therefore, FluencyDetectorThread marks the detection scene in which the screen display stuttered as a problem scene only when the processor load is below the overload threshold, that is, when the processor is not overloaded. The overload threshold is preset.
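This gating can be summarized as a single predicate. The following sketch is illustrative only; the normalized load values and the 0.9 overload threshold are assumptions, not values defined by this application.

```java
// A detected stutter is marked as a problem scene (and its field data saved)
// only when the processor is not overloaded.
public final class ProblemSceneFilter {

    public static boolean isProblemScene(boolean screenStutterDetected,
                                         double processorLoad,      // assumed range 0.0 .. 1.0
                                         double overloadThreshold) {
        return screenStutterDetected && processorLoad < overloadThreshold;
    }

    public static void main(String[] args) {
        System.out.println(isProblemScene(true, 0.55, 0.9)); // true  -> save field data
        System.out.println(isProblemScene(true, 0.97, 0.9)); // false -> cause is clear (overload)
    }
}
```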
After FluencyDetectorThread has determined parameters such as the frame intervals, the longest frame interval, the average frame interval, the accumulated frame loss duration, and the refresh completion rate, it may write the specific values of these parameters into the previously created detection information table (DetectionInfo). In embodiments that determine whether the processor is overloaded, the detection information table may further include a processor load item for recording the processor load of the electronic device 100 during the detection.
Taking the detection scenario shown in fig. 6 as an example, after the detection is finished, the electronic device 100 may obtain the detection information table shown in table 4:
TABLE 4
Layer name | Control layer 20
System refresh rate | 60 Hz
Start time | T-BEGIN
End time | T-END
Number of frames | 7
Longest frame interval | 19 milliseconds
Accumulated frame loss duration | 2.4 milliseconds
Refresh completion rate | 0.9833
Here, T-BEGIN is the time at which FluencyDetector received the start detection instruction, and T-END is the time at which FluencyDetector received the end detection instruction. As shown in fig. 6, the number of image frames refreshed by the electronic device 100 during the detection is 7. The frame intervals between the 7 frames are, in order: 16.6, 16.6, 16.6, 16.6, 19, and 16.6 milliseconds. The longest frame interval is 19 milliseconds. The accumulated frame loss duration is 2.4 milliseconds. The refresh completion rate is 0.9833 (59/60). It will be appreciated that the attribute values in table 4 are exemplary.
S412, confirming the problem scene and storing the field data of the problem scene.
The determination of the detection information by FluencyDetectorThread, and the way FluencyDetectorThread combines that information to determine whether the detection scene is a problem scene, were described in S411 above and are not repeated here.
After the problem scene is determined, FluencyDetectorThread may save and report the field data of the problem scene. The field data includes at least the detection information recorded in the detection information table. Further, the field data may also include each frame of image of the change layer and its corresponding drawing information. For example, after the detection scene shown in fig. 6 is determined to be a problem scene, the detection information shown in table 4 is field data. Further, the field data may also include the frames F1-F7 of the change layer updated in the detection scene shown in fig. 6 and their corresponding drawing information.
The log module may then generate a detection log corresponding to each problem scenario. The detection log includes the field data. Furthermore, the log module can upload the detection log to the cloud for storage through the synchronization module and the synchronization interface for subsequent analysis. The synchronization module may report the field data of the problem scene stored in the electronic device 100 to the cloud after the field data reaches a certain scale, so as to improve the data transmission efficiency.
In fig. 4, the instruction to start detection in S401 may be referred to as the first instruction, and the instruction to end detection in S408 may be referred to as the second instruction. The first information includes the frame intervals, the frame interval threshold, the system frame interval, the tolerance, the average frame interval, and the like. With reference to the description of S411, the first information may further include the system refresh rate, the start time, the end time, the number of frames, the longest frame interval, the accumulated frame loss duration, the refresh completion rate, and the like. With reference to the description of S406, the change layer, such as the control layer 20, may be referred to as the first layer.
By implementing the smoothness detection method provided by this embodiment, an upper layer application installed on the electronic device 100 can call the smoothness detection method provided by the application at any time, acquire the drawing information during screen refreshing, and then determine, based on the drawing information, whether frames were dropped during the refresh. After confirming that frames were dropped, the electronic device 100 can save the field data and upload it to the cloud for storage, thereby providing effective data support for subsequent analysis of the problem scene and optimization of screen refreshing.
Fig. 7 is a timing chart of smoothness detection performed by a desktop application in a scenario of sliding a switching desktop according to an embodiment of the present application.
As shown in fig. 7, the timing diagram includes a smooth scrolling module, a layer normalization module, a smoothness detection module, and a synchronization module. The smooth scrolling module is used to implement smooth scrolling of the window view. In this embodiment, the smooth scrolling module corresponds to the upper layer application. While implementing smooth scrolling of the window view, the smooth scrolling module may detect whether frames are dropped in the smoothly scrolled scene. The layer normalization module may be used to determine the target layer at which the user's operation focus is located. In general, the target layer of the user's operation focus is the layer whose image content the user can intuitively perceive changing in the detection scene.
In this architecture, the smooth scrolling module may be overScroller. The layer normalization module may be FlingJankDetector, which is used to determine the target layer at which the user's operation focus is located, that is, the layer to be detected. Further, FlingJankDetector can be regarded as the receiving module (SurfaceFlingerEx) for overScroller. The global window management module windowGlobalManager can be used to provide all the layers of the windows.
S701. overScroller sends FlingJankDetector an instruction to start detection.
As shown in fig. 7, overScroller may send FlingJankDetector an instruction to start detection during execution of the smooth scroll. For example, referring to the user interface shown in fig. 5A, overScroller may determine that a smooth scroll action is to be performed after a user's sliding operation is detected. At this point overScroller may trigger the fluency detection procedure, issuing instructions to begin detection.
S702. Upon receiving the instruction to start detection, FlingJankDetector may send an instruction to windowGlobalManager to acquire the window views, e.g., getWindowView().
S703. windowGlobalManager returns all layers of all window views. In response to the instruction to acquire the window views, windowGlobalManager may determine all layers of the current windows and return all of these layers to FlingJankDetector.
S704. FlingJankDetector traverses all the layers to determine the target layer with the focus of the user operation.
After receiving all the layers of the current window views returned by windowGlobalManager, FlingJankDetector may determine which of these layers holds the window focus, that is, the layer to be refreshed by the sliding operation. The layer with the window focus is the target layer of the user operation. FlingJankDetector can determine the layer with the window focus according to information such as the position on the screen where the user operation acts, the type of operation (sliding, clicking, double clicking, long pressing, etc.), and the scene in which the operation occurs.
Assuming the current view is the user interface shown in fig. 5A, FlingJankDetector may determine that the layers of the current view include the control layer 20, the status bar layer 21, and the background layer 22. FlingJankDetector may then determine that the focused layer on which the user's sliding operation acts is the control layer 20, i.e., the control layer 20 is the layer with the window focus. The control layer 20 is therefore the target layer.
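As an illustration of S702-S704, the following Java sketch fetches all layers of the current window views and picks the one holding the window focus. The Layer and WindowGlobalManager interfaces and their methods are assumptions made for the sketch; the embodiment names the modules but does not define their programming interfaces.

```java
import java.util.List;

// Illustrative sketch of S702-S704: fetch all layers of the current window views and
// pick the one that holds the window focus. The Layer type and its methods are
// assumptions; the patent only names the modules (FlingJankDetector, windowGlobalManager).
final class TargetLayerPicker {

    interface Layer {
        boolean hasWindowFocus();   // assumed accessor: does this layer own the window focus?
        String name();              // assumed accessor: e.g. "control layer 20"
    }

    interface WindowGlobalManager {
        List<Layer> getWindowView(); // assumed counterpart of getWindowView() in S702
    }

    // Returns the target layer of the user operation, or null if no layer has focus.
    static Layer pickTargetLayer(WindowGlobalManager windowGlobalManager) {
        for (Layer layer : windowGlobalManager.getWindowView()) {
            if (layer.hasWindowFocus()) {
                return layer;        // e.g. the control layer 20 in the desktop-sliding scene
            }
        }
        return null;
    }
}
```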
S705. FlingJankDetector instructs the fluency detection module to start detecting the frame dropping condition of the target layer.
Specifically, FlingJankDetector may send an instruction to start detection to the fluency detection module. In response to this instruction, and in combination with the timing chart shown in fig. 4, FluencyDetector in the fluency detection module may obtain the drawing information of the control layer 20 from Layer each time Layer determines that the control layer 20 has changed, and record the drawing information; refer to the description of S404-S407 in fig. 4.
S706. When the upper layer application determines to end the detection, for example after one sliding operation ends, overScroller sends an instruction to end the detection to FlingJankDetector. S707. In response to this instruction, FlingJankDetector instructs the fluency detection module to stop detecting the frame dropping condition of the target layer.
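The following Java sketch illustrates, under the same assumptions about module interfaces, how the smooth-scrolling side might drive S701 and S706: detection starts when the fling begins and ends when it settles. The JankDetector interface and the callback names are hypothetical; the embodiment only states that overScroller sends the start and end instructions.

```java
// Illustrative sketch of S701 and S706: the smooth-scrolling side starts detection
// when a fling begins and ends it when the fling finishes. The detector interface is
// an assumption, not part of the patent text.
final class FlingDetectionDriver {

    interface JankDetector {
        void startDetection();   // corresponds to the first instruction (S701)
        void endDetection();     // corresponds to the second instruction (S706)
    }

    private final JankDetector detector;
    private boolean detecting;

    FlingDetectionDriver(JankDetector detector) {
        this.detector = detector;
    }

    // Called when a sliding operation triggers smooth scrolling (fling).
    void onFlingStart() {
        detector.startDetection();
        detecting = true;
    }

    // Called when the smooth scrolling animation settles.
    void onFlingEnd() {
        if (detecting) {
            detector.endDetection();
            detecting = false;
        }
    }
}
```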
In combination with the timing chart shown in fig. 4, after the instruction to stop detecting the target layer is received, Layer no longer sends the currently refreshed layer and its drawing information to the fluency detection module. That is, the fluency detection module no longer obtains new control layer 20 images and their drawing information from Layer.
Meanwhile, FluencyDetectorThread in the fluency detection module can begin compiling statistics on the frame brushing condition during the detection process.
S708. Determine the problem scene in which the screen display is stuck based on the recorded drawing information.
The fluency detection module can perform statistical analysis on the recorded drawing information to determine detection information such as the frame interval, and thereby determine whether frame dropping occurred during the detection process and whether the screen display was smooth. Scenes in which the screen display is stuck may be marked as problem scenes.
While determining the detection information, the fluency detection module may record it, for example by writing it into a detection information table. When the detected scene is determined to be a problem scene, the detection information serves as the field data, and the fluency detection module can obtain the field data of the problem scene directly from the detection information table.
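As a worked illustration of S708 and the N > M rule in the claims, the following Java sketch derives frame intervals from the recorded drawing times and flags the scene as a problem scene when more than the allowed number of intervals exceed the threshold. The class, method, and parameter names are assumptions made for the sketch.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of S708: derive frame intervals from the recorded drawing times and
// mark the scene as a problem scene when more than M intervals exceed the threshold.
// The thresholds and the rule "N > M" follow claim 1; all names are otherwise assumptions.
final class FrameDropAnalyzer {

    // drawTimesMs: drawing (commit) time of each frame of the target layer, in order.
    static boolean isProblemScene(List<Long> drawTimesMs,
                                  double frameIntervalThresholdMs,
                                  int allowedOverruns /* M */) {
        List<Double> intervals = new ArrayList<>();
        for (int i = 1; i < drawTimesMs.size(); i++) {
            intervals.add((double) (drawTimesMs.get(i) - drawTimesMs.get(i - 1)));
        }
        int overruns = 0; // N: how many intervals exceed the threshold
        for (double interval : intervals) {
            if (interval > frameIntervalThresholdMs) {
                overruns++;
            }
        }
        return overruns > allowedOverruns; // N > M: display judged not smooth
    }
}
```

For example, at a 60 Hz system refresh rate the system frame interval is about 16.67 ms; with a threshold of 16.67 ms and M = 0, a single interval of 33 ms would already mark the scene as a problem scene.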
S709. Report the field data of the problem scene.
After confirming that the detected scene is a problem scene, the fluency detection module can send the field data of the problem scene to the synchronization module, and then report the field data to the cloud for storage for subsequent analysis.
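The following Java sketch illustrates the batching behaviour described above for the synchronization module: field data of problem scenes is queued locally and uploaded to the cloud only after a certain amount has accumulated. The uploader interface, record format, and batch size are assumptions, not part of the embodiment.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of S709 and the batching behaviour of the synchronization module:
// field data is queued locally and only uploaded once enough records accumulate.
// The uploader interface and batch size are assumptions made for this sketch.
final class FieldDataReporter {

    interface CloudUploader {
        void upload(List<String> records); // assumed transport to the cloud
    }

    private final List<String> pending = new ArrayList<>();
    private final CloudUploader uploader;
    private final int batchSize;

    FieldDataReporter(CloudUploader uploader, int batchSize) {
        this.uploader = uploader;
        this.batchSize = batchSize;
    }

    // Called by the fluency detection module after a problem scene is confirmed.
    void report(String fieldDataRecord) {
        pending.add(fieldDataRecord);
        if (pending.size() >= batchSize) {   // "after the field data reaches a certain scale"
            uploader.upload(new ArrayList<>(pending));
            pending.clear();
        }
    }
}
```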
Fig. 8 exemplarily shows a hardware configuration diagram of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLAN) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD). The display panel may also be manufactured using organic light-emitting diodes (OLED), active-matrix organic light-emitting diodes (AMOLED), flexible light-emitting diodes (FLED), Mini-LED, Micro-LED, Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 194, N being a positive integer greater than 1.
In an embodiment of the present application, the electronic device 100 relies on the GPU, the display 194, and display functions provided by an application processor, etc. to implement a screen refresh function. Further, the electronic device 100 may detect whether a frame dropping problem affecting the smoothness of screen display occurs in the above-described screen refreshing process.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The random access memory may include static random-access memory (SRAM), dynamic random-access memory (DRAM), synchronous dynamic random-access memory (SDRAM), double data rate synchronous dynamic random-access memory (DDR SDRAM; for example, fifth generation DDR SDRAM is commonly referred to as DDR5 SDRAM), etc. The nonvolatile memory may include a disk storage device and a flash memory.
According to the operating principle, the flash memory may include NOR flash, NAND flash, 3D NAND flash, etc.; according to the potential order of the memory cells, it may include single-level cells (SLC), multi-level cells (MLC), triple-level cells (TLC), quad-level cells (QLC), etc.; according to the storage specification, it may include universal flash storage (UFS), embedded multimedia card (eMMC), etc.
The random access memory may be read directly from and written to by the processor 110, may be used to store executable programs (e.g., machine instructions) for an operating system or other on-the-fly programs, may also be used to store data for users and applications, and the like.
The nonvolatile memory may store executable programs, store data of users and applications, and the like, and may be loaded into the random access memory in advance for the processor 110 to directly read and write.
In an embodiment of the present application, the program code for implementing smoothness detection by the electronic device 100 may be stored in the nonvolatile memory; when the smoothness detection method is called for detection, the electronic device 100 can load the program codes into a random access memory and execute the program codes, so that smoothness detection provided by the application is realized.
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device 100. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external nonvolatile memory.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is answering a call or a voice message, voice may be heard by placing the receiver 170B close to the human ear.

Microphone 170C, also referred to as a "mike" or a "sound transducer", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C, inputting a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of the flip cover using the magnetic sensor 180D. The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary.
The distance sensor 180F is used to measure a distance. The electronic device 100 may measure the distance by infrared or laser. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc. The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device 100 performs a temperature processing strategy using the temperature detected by the temperature sensor 180J.
The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on a surface of the electronic device 100 at a location different from that of the display 194.
In an embodiment of the present application, the electronic device 100 may receive a touch operation of a user, such as a sliding operation shown in fig. 5A, through the touch sensor 180K.
The bone conduction sensor 180M may acquire a vibration signal. The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195 to enable contact and separation with the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1.
The term "User Interface (UI)" in the description and claims of the present application and in the drawings is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. The user interface of the application program is a source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, the interface source code is analyzed and rendered on the terminal equipment, and finally the interface source code is presented as content which can be identified by a user, such as a control of pictures, words, buttons and the like. Controls (controls), also known as parts (widgets), are basic elements of a user interface, typical controls being a toolbar (toolbar), menu bar (menu bar), text box (text box), button (button), scroll bar (scrollbar), picture and text. The properties and content of the controls in the interface are defined by labels or nodes, such as XML specifying the controls contained in the interface by nodes < Textview >, < ImgView >, < VideoView >, etc. One node corresponds to a control or attribute in the interface, and the node is rendered into visual content for a user after being analyzed and rendered. In addition, many applications, such as the interface of a hybrid application (hybrid application), typically include web pages. A web page, also referred to as a page, is understood to be a special control embedded in an application program interface, and is source code written in a specific computer language, such as hypertext markup language (hyper text markup language, GTML), cascading style sheets (CASCADING STYLE SHEETS, CSS), java script (JavaScript, JS), etc., and the web page source code may be loaded and displayed as user-recognizable content by a browser or web page display component similar to the browser function. The specific content contained in a web page is also defined by tags or nodes in the web page source code, such as GTML defines elements and attributes of the web page by < p >, < img >, < video >, < canvas >.
A commonly used presentation form of a user interface is a graphical user interface (graphic user interface, GUI), which refers to a graphically displayed user interface that is related to computer operations. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
As used in the specification of the present application and the appended claims, the singular forms "a," "an," "said," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in this application refers to and encompasses any and all possible combinations of one or more of the listed items. As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to determining …" or "in response to detecting …", depending on the context. Similarly, the phrase "when it is determined …" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined …" or "in response to determining …" or "when (a stated condition or event) is detected" or "in response to detecting (a stated condition or event)", depending on the context.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The usable medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk), etc.
Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program to instruct related hardware, the program may be stored in a computer readable storage medium, and the program may include the above-described method embodiments when executed. And the aforementioned storage medium includes: ROM or random access memory RAM, magnetic or optical disk, etc.

Claims (6)

1. A fluency detection method, applied to an electronic device having a screen, characterized in that the electronic device comprises a desktop application, a detection module, an analysis module, and a log module, and the method comprises:
after detecting a sliding operation acting on a first desktop, the desktop application sends a first instruction to the detection module;
in response to the first instruction, the detection module starts to acquire the drawing time of a first layer, wherein the first layer is a layer whose image content changes in the image displayed on the screen, and the first layer is determined through the window focus;
after the second desktop is displayed, the desktop application sends a second instruction to the detection module;
in response to the second instruction, the detection module stops acquiring the drawing time of the first layer;
the analysis module determines frame intervals according to the drawing times of the first layer acquired between the first instruction and the second instruction;
when N of the frame intervals are longer than a frame interval threshold, the analysis module determines that the screen display is not smooth, wherein N is greater than M, M indicates the allowed number of times a frame interval may exceed the frame interval threshold, and N and M are both positive integers;
when the screen display is not smooth and the load of the processor between the first instruction and the second instruction is lower than an overload threshold, the log module stores field data of the screen display between the first instruction and the second instruction, the field data comprising the frame intervals.
2. The method of claim 1, wherein a system refresh rate is preset in the electronic device, the system refresh rate corresponds to a system frame interval, and the frame interval threshold is greater than or equal to the system frame interval.
3. The method of claim 1, wherein the field data further comprises one or more of: the system refresh rate, the image displayed on the screen, and the drawing time of the image displayed on the screen.
4. The method according to claim 1, wherein the method further comprises: sending the field data to a cloud for storage.
5. An electronic device comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories for storing computer program code comprising computer instructions that, when executed by the one or more processors, cause the method of any of claims 1-4 to be performed.
6. A computer readable storage medium comprising instructions which, when run on an electronic device, cause the method of any of claims 1-4 to be performed.
CN202210950178.7A 2022-08-09 2022-08-09 Fluency detection method Active CN116048933B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210950178.7A CN116048933B (en) 2022-08-09 2022-08-09 Fluency detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210950178.7A CN116048933B (en) 2022-08-09 2022-08-09 Fluency detection method

Publications (2)

Publication Number Publication Date
CN116048933A CN116048933A (en) 2023-05-02
CN116048933B true CN116048933B (en) 2024-05-03

Family

ID=86127868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210950178.7A Active CN116048933B (en) 2022-08-09 2022-08-09 Fluency detection method

Country Status (1)

Country Link
CN (1) CN116048933B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116916093B (en) * 2023-09-12 2023-11-17 荣耀终端有限公司 Method for identifying clamping, electronic equipment and storage medium
CN117097883B (en) * 2023-10-20 2024-04-12 荣耀终端有限公司 Frame loss fault cause determining method, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109165154A (en) * 2018-09-10 2019-01-08 广东小天才科技有限公司 A kind of display interface fluency statistical method and system, mobile terminal and server
CN111159042A (en) * 2019-12-31 2020-05-15 可牛网络技术(北京)有限公司 Fluency testing method and device and electronic equipment
CN112711519A (en) * 2019-10-25 2021-04-27 腾讯科技(深圳)有限公司 Method and device for detecting fluency of picture, storage medium and computer equipment
CN114648951A (en) * 2022-02-28 2022-06-21 荣耀终端有限公司 Method for controlling dynamic change of screen refresh rate and electronic equipment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109165154A (en) * 2018-09-10 2019-01-08 广东小天才科技有限公司 A kind of display interface fluency statistical method and system, mobile terminal and server
CN112711519A (en) * 2019-10-25 2021-04-27 腾讯科技(深圳)有限公司 Method and device for detecting fluency of picture, storage medium and computer equipment
CN111159042A (en) * 2019-12-31 2020-05-15 可牛网络技术(北京)有限公司 Fluency testing method and device and electronic equipment
CN114648951A (en) * 2022-02-28 2022-06-21 荣耀终端有限公司 Method for controlling dynamic change of screen refresh rate and electronic equipment

Also Published As

Publication number Publication date
CN116048933A (en) 2023-05-02

Similar Documents

Publication Publication Date Title
CN110597512B (en) Method for displaying user interface and electronic equipment
CN115473957B (en) Image processing method and electronic equipment
CN116048933B (en) Fluency detection method
CN113132526B (en) Page drawing method and related device
CN113553130A (en) Method for executing drawing operation by application and electronic equipment
CN114371985A (en) Automated testing method, electronic device, and storage medium
CN116483734B (en) Pile inserting method and system based on compiler and related electronic equipment
CN116467221B (en) Pile inserting method and system based on interpreter and related electronic equipment
WO2023016014A1 (en) Video editing method and electronic device
CN112416984A (en) Data processing method and device
WO2022033355A1 (en) Mail processing method and electronic device
CN114205457B (en) Method for moving user interface element, electronic equipment and storage medium
CN115994006A (en) Animation effect display method and electronic equipment
CN115964231A (en) Load model-based assessment method and device
WO2024046010A1 (en) Interface display method, and device and system
CN114690975B (en) Dynamic effect processing method and related device
CN117221713B (en) Parameter loading method and electronic equipment
CN116193275B (en) Video processing method and related equipment
WO2022166550A1 (en) Data transmission method and electronic device
WO2024083014A1 (en) Interface generation method and electronic device
WO2024061292A1 (en) Interface generation method and electronic device
WO2024083009A1 (en) Interface generation method and electronic device
US20240061549A1 (en) Application switching method, graphical interface, and related apparatus
WO2024027570A1 (en) Interface display method and related apparatus
CN117909000A (en) Interface generation method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant