CN116048933A - Fluency detection method - Google Patents
- Publication number
- CN116048933A (application number CN202210950178.7A)
- Authority
- CN
- China
- Prior art keywords
- frame
- instruction
- layer
- electronic device
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3409—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
Abstract
The application provides a smoothness detection method. While frames are being refreshed, the electronic device 100 may acquire and tally the drawing time of each frame image, or of a layer within the image, and determine information such as the frame interval between two adjacent frame images, the accumulated frame loss duration, and the refresh completion rate. The electronic device 100 may then determine whether frames were dropped during the frame refreshing process, and thereby whether the screen display was smooth during that process. If the screen display was not smooth, the electronic device can save the field data from the process and report it to the cloud, so that developers can later analyze the cause of the stutter and improve the smoothness of the screen display.
Description
Technical Field
The application relates to the field of terminals, in particular to a smoothness detection method.
Background
In scenarios such as sliding, playing animations, and playing video on mobile phones, tablet computers, and other terminal devices, the image content displayed on the screen generally needs to be refreshed continuously to provide a smooth display effect. When frames are dropped during continuous screen refreshing, the user clearly perceives stutter or jitter, that is, the screen display is not smooth, which degrades the user experience.
Disclosure of Invention
In a first aspect, the present application provides a smoothness detection method applied to an electronic device having a screen, the method including: after a first instruction is acquired, starting to acquire the drawing time of images displayed on the screen; after a second instruction is acquired, stopping acquiring the drawing time of images displayed on the screen; determining first information according to the drawing time of images acquired between the first instruction and the second instruction, the first information reflecting the smoothness of the images displayed on the screen between the first instruction and the second instruction; and when the first information indicates that the screen display is not smooth, saving field data of the screen display between the first instruction and the second instruction, the field data including the first information.

By implementing the method provided in the first aspect, an application installed on the electronic device can invoke the method provided by this application at any time to detect the smoothness of the screen display. According to the start-detection instruction and the end-detection instruction issued by the application, the electronic device acquires the drawing time of the images displayed on the screen between the two instructions. Then, based on the drawing time of each displayed frame, the electronic device can determine whether the screen display was smooth between the two instructions. If it was not, the electronic device can save the field information from the detection for later analysis by developers.
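As an illustration only, an upper-layer application might drive the detection window as in the following minimal sketch; the class, interface, and method names (FluencyDetectionClient, DetectorService, startDetection, stopDetectionAndCheck) are hypothetical placeholders and not the interface actually defined by this application.

```java
// Hypothetical sketch: bracketing an animation with the first and second instructions.
final class FluencyDetectionClient {

    interface DetectorService {              // assumed IPC facade to the detection framework
        void startDetection(String scene);   // first instruction: begin collecting drawing times
        boolean stopDetectionAndCheck();     // second instruction: stop and return true if smooth
    }

    private final DetectorService service;

    FluencyDetectionClient(DetectorService service) {
        this.service = service;
    }

    void runWithDetection(String scene, Runnable animation) {
        service.startDetection(scene);       // e.g. just before a fling or page switch
        animation.run();
        boolean smooth = service.stopDetectionAndCheck();
        if (!smooth) {
            // The framework side saves the field data (first information,
            // displayed frames, drawing times) and may report it to the cloud.
        }
    }
}
```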
With reference to the method provided in the first aspect, in some embodiments, the first information includes frame intervals, a frame interval being the time interval between two frames of images displayed on the screen in succession; when a frame interval is longer than a frame interval threshold, the first information indicates that the screen display is not smooth.

By implementing the method provided in this embodiment, the electronic device can determine the frame interval between two successive frame images from the acquired drawing time of each frame image. When one or more frame intervals are longer than the frame interval threshold, the first information indicates that the screen display is not smooth, and the electronic device can accordingly determine that the screen display was not smooth.
With reference to the method provided in the first aspect, in some embodiments, a system refresh rate is preset in the electronic device, the system refresh rate corresponds to a system frame interval, and the frame interval threshold is greater than or equal to the system frame interval.

By implementing the method provided in this embodiment, the electronic device can set the system frame interval corresponding to the system refresh rate as the frame interval threshold. When a frame interval determined during detection is larger than the system frame interval, the first information indicates that the screen display is not smooth. In some scenarios, an actual refresh rate slightly lower than the system refresh rate may also be acceptable to the user; in that case, the frame interval threshold set by the electronic device may be slightly higher than the system frame interval.
With reference to the method provided in the first aspect, in some embodiments, a tolerance M is further set in the electronic device, where M indicates the number of times a frame interval is allowed to be longer than the frame interval threshold. In this case, the condition that the first information indicates that the screen display is not smooth when a frame interval is longer than the frame interval threshold is, specifically: among the frame intervals of the images displayed on the screen between the first instruction and the second instruction, when N frame intervals are longer than the frame interval threshold and N is larger than M, the first information indicates that the screen display is not smooth, N and M being positive integers.

In some scenarios, the electronic device may occasionally detect one or two frame intervals that are longer than the frame interval threshold, and dropping one or two frames in this way has essentially no effect on the user viewing the screen. By implementing the method provided in this embodiment, the electronic device can ignore such occasional frame drops; only when the number of frame drops exceeds the preset tolerance does the electronic device determine that the screen display was not smooth during detection. This allows the electronic device to more accurately locate the scenarios in which an unsmooth screen display actually affects the user experience.
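A minimal sketch of this tolerance check is shown below; it assumes the drawing times are millisecond timestamps of consecutive displayed frames, and all names are illustrative rather than taken from the application.

```java
// Illustrative sketch of the tolerance check: the scene is judged not smooth
// only when more than M frame intervals exceed the threshold.
final class FrameIntervalJudge {
    static boolean isSmooth(long[] drawTimesMs, double thresholdMs, int toleranceM) {
        int exceedCount = 0;                              // N in the description above
        for (int i = 1; i < drawTimesMs.length; i++) {
            if (drawTimesMs[i] - drawTimesMs[i - 1] > thresholdMs) {
                exceedCount++;
            }
        }
        return exceedCount <= toleranceM;                 // not smooth only when N > M
    }
}
```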
In some embodiments, the first information further includes an average frame interval. When at least one frame interval is longer than the frame interval threshold but the average frame interval is less than the frame interval threshold, the first information indicates that the screen display is smooth.

By implementing the method provided in this embodiment, the electronic device can also use the average frame interval to ignore occasional frame drops, further pinpointing the scenarios in which an unsmooth screen display affects the user experience.
With reference to the method provided in the first aspect, in some embodiments, before saving the field data of the screen display between the first instruction and the second instruction, the method further includes: acquiring the processor load between the first instruction and the second instruction. Saving the field data when the first information indicates that the screen display is not smooth then specifically means: saving the field data of the screen display between the first instruction and the second instruction when the first information indicates that the screen display is not smooth and the processor load is below an overload threshold.

When the processor load of the electronic device is too high, frames are easily dropped because the processor cannot compute and render the images in time. In that case the cause of the dropped frames and the direction for subsequent optimization are relatively clear. Therefore, after determining from the frame intervals that frames were dropped, the electronic device may additionally decide whether to save the field data by checking the processor load during the detection. The electronic device saves the field data only when the processor load is below the overload threshold, that is, when the frames were not dropped because of processor overload.
With reference to the method provided in the first aspect, in some embodiments, the field data further includes one or more of: the system refresh rate, the images displayed on the screen, and the drawing time of the images displayed on the screen.
With reference to the method provided in the first aspect, in some embodiments, the method further includes: sending the field data to the cloud for storage.
By implementing the method provided in this embodiment, after saving the field data of a scenario in which the screen display was not smooth, the electronic device can send the data to the cloud for storage. This avoids occupying the storage and computing resources of the electronic device, and allows the field data of multiple electronic devices to be aggregated, which facilitates analysis by developers.
With reference to the method provided in the first aspect, in some embodiments, a frame of image displayed on the screen is composed of one or more layers, and acquiring the drawing time of the images displayed on the screen includes: acquiring the drawing time of a first layer, the first layer being a layer whose image content changes during detection.

By implementing the method provided in this embodiment, the electronic device can determine the drawing time of each layer in a frame of image. The electronic device can thus reconstruct, at the layer level, the detailed process of drawing a frame of image displayed on the screen, and further find out which step caused the screen display to stutter. If there is a stutter, the electronic device can also locate the layer that caused it.

With reference to the method provided in the first aspect, in some embodiments, before the drawing time of the first layer is acquired, the method further includes: determining the first layer according to the window focus.
In a second aspect, the present application provides an electronic device comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories being operable to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method as described in the first aspect and any possible implementation of the first aspect.
In a third aspect, embodiments of the present application provide a chip system for application to an electronic device, the chip system comprising one or more processors for invoking computer instructions to cause the electronic device to perform a method as described in the first aspect and any possible implementation of the first aspect.
In a fourth aspect, the present application provides a computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform a method as described in the first aspect and any possible implementation of the first aspect.
In a fifth aspect, the present application provides a computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform a method as described in the first aspect and any possible implementation of the first aspect.
It will be appreciated that the electronic device provided in the second aspect, the chip system provided in the third aspect, the computer storage medium provided in the fourth aspect, and the computer program product provided in the fifth aspect are all configured to perform the method provided in the present application. For the advantageous effects they achieve, reference may therefore be made to the advantageous effects of the corresponding method, which are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a software structure of an electronic device 100 according to an embodiment of the present application;

Fig. 2 is a schematic diagram of a software structure of another electronic device 100 according to an embodiment of the present application;

Fig. 3A is a frame of desktop image displayed on the screen of the electronic device 100 according to an embodiment of the present application;

Fig. 3B is a schematic diagram of the layer structure of a desktop image according to an embodiment of the present application;

Fig. 4 is a timing chart of a smoothness detection method according to an embodiment of the present application;

Fig. 5A is a schematic diagram of a user interface for sliding to switch desktops according to an embodiment of the present application;

Fig. 5B shows two frames of images displayed on the screen while sliding to switch desktops according to an embodiment of the present application;

Fig. 5C is a schematic diagram of the layer structure of another desktop image according to an embodiment of the present application;

Fig. 6 is a schematic diagram of determining detection information from drawing information according to an embodiment of the present application;

Fig. 7 is a timing chart of smoothness detection performed by a desktop application in a slide-to-switch-desktop scenario according to an embodiment of the present application;

Fig. 8 exemplarily shows a hardware structure diagram of the electronic device 100.
Detailed Description
The terminology used in the following embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
In scenarios such as sliding, playing animations, and playing video on electronic devices such as mobile phones and tablet computers, the image content displayed on the screen generally needs to be refreshed continuously (frame refreshing) to provide a smooth display effect. When frames are dropped during frame refreshing, the user clearly perceives stutter and jitter, that is, the screen display is not smooth, which degrades the user experience. Dropping a frame means that the specified image is not displayed when the corresponding display signal arrives.

However, dropped frames in a single scenario are generally sporadic and do not provide useful information for development and analysis. To analyze the causes of dropped frames, reduce them, and improve display smoothness during screen refreshing, the electronic device needs to identify the scenarios in which frames are dropped, that is, the frame-refreshing scenarios in which stutter occurs, and save the field data from those scenarios.

The application provides a smoothness detection method. The method can be applied to electronic devices with display capability, such as mobile phones and tablet computers. Such devices are referred to below as the electronic device 100.

By implementing the smoothness detection method provided by the embodiments of the present application, the electronic device 100 can acquire and tally the drawing time of each frame image during frame refreshing and determine the interval between the drawing times of two adjacent frame images, also referred to as the frame interval. If a frame interval during frame refreshing exceeds the frame interval threshold, the electronic device 100 can determine that frames were dropped during that refresh and, further, that the screen display stuttered (was not smooth).

The scenario in which whether the screen display is smooth is detected may be referred to as a detection scenario. A detection scenario in which screen display stutter is determined through detection may be referred to as a problem scenario. The field data of a problem scenario is the data required for subsequent development and analysis. The field data includes, but is not limited to: the image frames displayed on the screen, the drawing time of the image frames, the frame intervals, and the system frame rate of the electronic device 100. After determining a problem scenario, the electronic device 100 may save the field data of the problem scenario.

After acquiring a large amount of field data from electronic devices 100 in problem scenarios, developers can analyze the causes of the dropped frames, that is, the causes of the screen display stutter, so as to optimize the screen frame-refreshing process, reduce frame drops and stutter, and improve the smoothness of the screen display.
The electronic device 100 is not limited to a mobile phone or tablet computer; it may also be a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular telephone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device. The embodiments of the present application do not specifically limit the type of the electronic device.

The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiments of the present application, the software structure of the electronic device 100 is illustrated by taking an Android system with a layered architecture as an example. The smoothness detection method provided by the embodiments of the present application is not limited to the Android system and can also be applied to software systems with other layered architectures.
Fig. 1 is a schematic software structure of an electronic device 100 according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime (Android Runtime) and system libraries, and the kernel layer.
The application layer may include a series of application packages. As shown in fig. 1, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
An application framework layer (FWK) provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 1, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example to notify that a download is complete or to provide a message alert. The notification manager may also present notifications in the top system status bar in the form of a chart or scrolling text, such as notifications from applications running in the background, or present notifications on the screen in the form of a dialog window. For example, text may be prompted in the status bar, an alert tone may sound, the electronic device may vibrate, or an indicator light may blink.
The Android runtime includes a core library and virtual machines, and is responsible for scheduling and management of the Android system.

The core library consists of two parts: the functions that the Java language needs to call, and the core libraries of Android. The application layer and the application framework layer run in virtual machines. A virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. It includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
Fig. 2 is a schematic software structure of another electronic device 100 according to an embodiment of the present application. Fig. 2 is a schematic diagram of a software structure for implementing a smoothness detection method provided in the present application on the basis of fig. 1.
As shown in FIG. 2, the software architecture diagram illustrates the application framework layer, the system libraries, and the kernel layer. The application framework layer, the system library and the kernel layer correspond to the application framework layer, the system library and the kernel layer in fig. 1, and are not described herein.
In fig. 2, the application framework layer may include a fluency detection interface (fluency detector api) and a synchronization interface. The fluency detection interface can be used by an application or client in the application layer to invoke the fluency detection method provided by the embodiments of the present application and implement the fluency detection function. The synchronization interface can be used to send the field data of a problem scenario to the cloud; for example, the synchronization interface may be the hiviewtunnel interface.
The system library may include a display module, a fluency detection module, and a synchronization module. The display module may be used to draw and display images. The fluency detection module can be used for detecting whether the problem of frame dropping occurs in the image display process. The synchronization module can be used for reporting field data in a problem scene.
Specifically, the display module may include a receiving module (SurfaceFlingerEx), a layer management module (Layer), and a composition module (SurfaceFlinger). The receiving module can be used to receive and forward fluency detection instructions; the fluency detection instructions include the start-detection instruction and the end-detection instruction. The layer management module (Layer) manages window layers; in the embodiments of the present application, Layer can be used to determine the layers whose image content changes during screen refreshing.
At some point, one frame of image displayed in the screen of the electronic device 100 may be composed of a plurality of image layers.
For example, fig. 3A illustrates a frame of desktop (home screen) image of the electronic device 100. The desktop includes a status bar 201 and an application icon tray 202 in which a plurality of application icons are presented. The desktop shown in fig. 3A may be split into multiple layers; fig. 3B illustrates the layer structure of the desktop. As shown in fig. 3B, the desktop image shown in fig. 3A may include 3 layers: a control layer 20, a status bar layer 21, and a background layer 22. The control layer 20 is the layer of the desktop that contains controls such as application icons. The status bar layer 21 is the layer where the status bar 201 is located. The background layer 22 is the layer where the desktop wallpaper is located. It will be appreciated that the desktop wallpaper carried by the background layer 22 is not specifically shown in fig. 3A and fig. 3B.

The composition module may be used to composite one or more layers into the image that is displayed. Generally, a frame of image displayed by the electronic device in the screen-on state includes multiple layers; see the layer structure of the desktop shown in fig. 3B. The combined display of the multiple layers is achieved by the composition module. For example, the composition module combines the layers shown in fig. 3B and then displays the desktop image shown in fig. 3A on the screen.
The fluency detection module may include a detection module (FluencyDetector) and an analysis module (FluencyDetectorThread). The detection module can be used to acquire the drawing information of the layers. The drawing information includes, but is not limited to, an identification of the image to which a layer belongs (e.g., frame number), a layer identification (e.g., layer name), the drawing time, and the like. The analysis module (FluencyDetectorThread) can tally and analyze specific data in the drawing information to determine whether the screen display was smooth during detection.
The synchronization module corresponds to the synchronization interface of the application framework layer and can be used to report the field data of a problem scenario. The synchronization module can acquire the field data from the log module of the kernel layer and send it to the synchronization interface of the application framework layer, from where it is uploaded to the cloud for subsequent analysis and use. For example, the synchronization module may be hiview, corresponding to the hiviewtunnel synchronization interface used by the application framework layer.

The log module may be used to record the field data of a problem scenario. When the analysis module determines that frames were dropped in the detected frame-refreshing scenario, that is, that the screen display stuttered, it can instruct the log module to record the field data. The field data recorded at that point is the field data of the problem scenario. The log module can then send the field data to the synchronization module, which forwards it to the synchronization interface, from where it is reported to the cloud for storage.
Fig. 4 is a timing chart of a smoothness detection method according to an embodiment of the present application.
As shown in fig. 4, the timing chart involves: the receiving module (SurfaceFlingerEx), the layer management module (Layer), the composition module (SurfaceFlinger), the detection module (FluencyDetector), and the analysis module (FluencyDetectorThread). These modules are described with reference to fig. 2 and are not repeated here.
When an application or client in the application layer (such as a camera application or video player, hereinafter referred to as the upper-layer application) needs to monitor whether frames are dropped while the program runs, it can invoke the smoothness detection method provided by the embodiments of the present application to detect the smoothness of the screen display.

S401. SurfaceFlingerEx receives an instruction to start detection.

The upper-layer application determines when to start and end detection. For example, the upper-layer application may decide to start detecting the smoothness of screen refreshing when switching pages or playing video. Referring to fig. 5A, the upper-layer application may execute the smoothness detection method when a leftward-slide desktop-switch event is detected, to determine whether frames are dropped during the frame refreshing triggered by that slide. After deciding to start detection, the upper-layer application may send a start-detection instruction to SurfaceFlingerEx.
S402. SurfaceFlingerEx sends the start-detection instruction to FluencyDetector.

SurfaceFlingerEx can be used to receive and forward fluency detection instructions. After receiving the start-detection instruction issued by the upper-layer application, SurfaceFlingerEx may forward it to FluencyDetector.
S403. FluencyDetector creates a detection information table (DetectionInfo).

After receiving the start-detection instruction, FluencyDetector may first create a detection information table (DetectionInfo). The detection information table can be used to record the scenario in which detection takes place, the detection information, and so on. The detection information can later be used to determine whether the screen smoothly displayed the refreshed images during detection.
Illustratively, the detection information table may be as shown in table 1:
Table 1

| Attribute | Value |
| Scene number | (to be filled in) |
| Layer name | (to be filled in) |
| Scene description | (optional) |
| System refresh rate | (to be filled in) |
| Start time | (to be filled in) |
| End time | (to be filled in) |
| Number of frames | (to be filled in) |
| Longest frame interval | (to be filled in) |
| Accumulated frame loss duration | (to be filled in) |
| Refresh completion rate | (to be filled in) |
The scene number, layer name, and scene description are used to record where the smoothness detection takes place. The scene number can be used to mark the application or client in which smoothness detection occurs; the application or client may be any application or client installed on the electronic device 100, or a functional module thereof. The layer name can be used to mark the layer being detected. The scene description is optional and can be used to further describe where the smoothness detection takes place, for example a page-switch scenario from interface A to interface B.

The system refresh rate refers to the number of image frames that the electronic device 100 can theoretically refresh per unit time. Taking a 60 Hz refresh rate as an example, the electronic device 100 can refresh the image on the screen 60 times, that is, refresh 60 frames, within 1 second. The frame interval between two adjacent frames displayed on the screen that corresponds to the system refresh rate may be referred to as the system frame interval. For example, a 60 Hz refresh rate corresponds to a system frame interval of 16.6 milliseconds, that is, the time interval between two frames displayed one after the other on the screen is 16.6 milliseconds.

The start time refers to the time at which detection is started; FluencyDetector may use the time at which it received the start-detection instruction as the start time. The end time refers to the time at which detection is ended; correspondingly, FluencyDetector may use the time at which it received the end-detection instruction as the end time.

The number of frames refers to the number of image frames refreshed on the screen during detection (between the start-detection instruction and the end-detection instruction). The longest frame interval is the frame interval with the longest duration among all frame intervals during detection. The amount by which the frame interval between two adjacent frame images exceeds the system frame interval may be referred to as the frame loss time; the accumulated frame loss duration is the sum of the frame loss times over all frame intervals during detection.

The refresh completion rate refers to the ratio of the number of image frames actually refreshed on the screen to the number that should theoretically be refreshed. For example, a system refresh rate of 60 Hz indicates that the electronic device 100 should theoretically display 60 frames within 1 second. If, during actual frame-refresh detection, the electronic device 100 displays 58 frames within 1 second, the refresh completion rate of the electronic device 100 during that detection is 58/60, that is, about 0.967.

When the detection information table is created, the values of some attributes in the table can already be determined, while the rest must wait until detection ends. For example, upon receiving the start-detection instruction, FluencyDetector may record the timestamp at which the instruction was received, denoted T-BEGIN. After creating the detection information table shown in Table 1, FluencyDetector may set the start time in Table 1 to T-BEGIN. The end time in Table 1 can only be determined after FluencyDetector receives the end-detection instruction, so at this point it remains to be determined.
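Purely as an illustration of how such a detection information table could be represented in code, the following sketch mirrors the attributes listed above; the class name, field names, and comments are assumptions, not the actual data structure of this application.

```java
// Illustrative data-structure sketch of the detection information table (Table 1).
final class DetectionInfo {
    int sceneNumber;                // application or client being detected
    String layerName;               // layer under detection
    String sceneDescription;        // optional, e.g. "page switch from interface A to B"
    int systemRefreshRateHz;        // e.g. 60, i.e. a system frame interval of 1000/60 ≈ 16.6 ms
    long startTimeMs;               // T-BEGIN, recorded when the start-detection instruction arrives
    long endTimeMs;                 // T-END, recorded when the end-detection instruction arrives
    int frameCount;                 // frames refreshed between the two instructions
    double longestFrameIntervalMs;  // longest interval between two adjacent frames
    double accumulatedFrameLossMs;  // sum of (interval - system frame interval) over late frames
    double refreshCompletionRate;   // actually refreshed frames / theoretically refreshed frames
}
```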
S404. SurfaceFlinger delivers the VSync signal to Layer. The instruction used to transfer the VSync signal is, for example, a douaction instruction.

VSync is a periodic signal whose period equals the system frame interval. For example, at a 60 Hz refresh rate, the period of the vertical synchronization signal VSync is 16.6 milliseconds, and the time for which the screen displays one frame is also 16.6 milliseconds.

SurfaceFlinger is the module responsible for composition and display. When a VSync signal arrives, the SurfaceFlinger of the electronic device 100 may composite an image frame and place it in the display buffer. The screen fetches the image from the buffer and displays it when the next VSync signal arrives.

In the embodiments of the present application, after detection starts, SurfaceFlinger may composite and display one frame of image on each VSync signal and, at the same time, forward each VSync signal to Layer.
S405. Layer determines the layers whose image content has changed.

After receiving each VSync signal, Layer can determine which layers have changed in the most recently composited image, that is, the image composited by SurfaceFlinger within that VSync signal period.

Fig. 5B illustrates two frames of images composited by SurfaceFlinger during the slide-to-switch-desktop process illustrated in fig. 5A: F1 and F2. F1 and F2 each correspond to one VSync signal period; the VSync signal period in which SurfaceFlinger composites F1 is denoted VSync1, and the period in which it composites F2 is denoted VSync2.

Layer may traverse the layers of the image frame composited by SurfaceFlinger and determine which layers' image content has changed and which has not. A layer whose image content has changed may be referred to as a change layer. For example, after receiving VSync2, Layer may obtain the layers of F2 composited by SurfaceFlinger, as shown in fig. 5C. Compared with the layers of F1 (refer to fig. 3B), Layer can determine that the control layer 20 has changed.
S406. Layer sends the drawing information of the change layer to FluencyDetector.

After determining the change layer, Layer may send the drawing information of the change layer to FluencyDetector. The drawing information includes, but is not limited to, the layer name, the drawing time, the VSync period, and the like. The layer name identifies the change layer, for example the control layer 20 of F2 above. When SurfaceFlinger redraws a layer, Layer can determine that the layer is a change layer; the time at which Layer makes that determination is the drawing time of the layer.

Each time a frame of image is composited, Layer can determine the change layer in that frame and send the drawing information of the change layer to FluencyDetector. Again taking F2 as an example, after determining that the change layer in F2 is the control layer 20, Layer may send the drawing information of the control layer 20 in that VSync period to FluencyDetector. The drawing information of the control layer 20 is, for example: control layer 20, T2, VSync2, where T2 is the time at which Layer determined that the control layer 20 changed.

It will be appreciated that several layers may change during a screen refresh. In that case there are also several change layers determined by Layer, and Layer may send the drawing information of all of these change layers to FluencyDetector. Layer may send the drawing information of the multiple layers all at once, or send the drawing information layer by layer.

Again taking the desktop-switch scenario shown in fig. 5A as an example, when the desktop wallpaper is a live wallpaper, the corresponding background layer 22 needs to be redrawn in addition to the control layer 20 described for F2. In that case the change layers determined by Layer also include the background layer 22, and Layer also sends the drawing information of the background layer 22 to FluencyDetector.
S407. FluencyDetector records the drawing information.

Each time it receives drawing information, FluencyDetector may record it. Taking the two frames shown in fig. 5B as an example, the drawing information received by FluencyDetector may be as shown in Table 2 below:
Table 2

| Frame number | Layer name | Drawing time | VSync period |
| F1 | Control layer 20 | T1 | VSync1 |
| F2 | Control layer 20 | T2 | VSync2 |
| …… | …… | …… | …… |
where T1 and T2 are the times at which Layer determined that the control layer 20 changed (that is, was redrawn) in F1 and F2, respectively.
When multiple layers are updated simultaneously, for example when the control layer 20 and the background layer 22 are updated at the same time, the drawing information of the 2 frames received by FluencyDetector may be as shown in Table 3:

Table 3

| Frame number | Layer name | Drawing time | VSync period |
| F1 | Control layer 20 | T1 | VSync1 |
| F1 | Background layer 22 | T3 | VSync1 |
| F2 | Control layer 20 | T2 | VSync2 |
| F2 | Background layer 22 | T4 | VSync2 |

where T3 and T4 are the times at which Layer determined that the background layer 22 was redrawn in F1 and F2, respectively.
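As an illustration only, one such drawing-information record could be represented as in the following sketch; the class and field names are assumptions made for this example, not identifiers from this application.

```java
// Illustrative sketch of one drawing-information record (cf. Tables 2 and 3).
final class DrawRecord {
    final String frameId;     // e.g. "F1", "F2"
    final String layerName;   // e.g. "control layer 20", "background layer 22"
    final long drawTimeMs;    // time at which the layer was redrawn (T1, T2, T3, T4, ...)
    final int vsyncPeriod;    // VSync period identifier, e.g. 1 for VSync1

    DrawRecord(String frameId, String layerName, long drawTimeMs, int vsyncPeriod) {
        this.frameId = frameId;
        this.layerName = layerName;
        this.drawTimeMs = drawTimeMs;
        this.vsyncPeriod = vsyncPeriod;
    }
}
```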
S408. SurfaceFlingerEx receives an instruction to end detection.

Referring to the description of S401, the upper-layer application determines when to start and end detection, and may decide to end detection at any time. At that point, the upper-layer application may send an end-detection instruction to SurfaceFlingerEx.

S409. After receiving the end-detection instruction issued by the upper-layer application, SurfaceFlingerEx sends the end-detection instruction to FluencyDetector.

S410. Wake up FluencyDetectorThread. After receiving the end-detection instruction, FluencyDetector may send an indication to FluencyDetectorThread to wake it up. From that point on, FluencyDetector no longer accepts new drawing information.
S411. Determine the detection information based on the recorded drawing information and write it into the detection information table.

After being woken up, FluencyDetectorThread can obtain and tally the drawing information recorded by FluencyDetector. During this statistical process, FluencyDetectorThread can determine the frame intervals between the image frames during detection, and further determine whether frames were dropped and whether the screen display was smooth.
Fig. 6 is a schematic diagram of determining detection information by drawing information provided in an embodiment of the present application.
As shown in fig. 6, F1-F7 represent 7 frames of images in which the control layer 20 is continuously updated during detection. As described for S407, FluencyDetector may record the drawing information of these 7 frames, including the layer name, drawing time, and VSync period of each frame.

Based on the drawing times in the drawing information, FluencyDetectorThread can determine the frame interval between each pair of adjacent frames among the 7 frames refreshed during detection. FluencyDetectorThread can obtain the frame interval threshold preset in the electronic device 100. When the frame interval between two adjacent frames is greater than the frame interval threshold, FluencyDetectorThread can determine that a frame was dropped between those two frames, and may mark the detection scenario as a problem scenario.

The frame interval threshold is preset; the electronic device 100 may set frame interval thresholds of different sizes according to factors such as the refresh rates it supports, user experience, and power policy. Preferably, the frame interval threshold equals the system frame interval indicated by the system refresh rate. Alternatively, in scenarios where the user's requirement for refresh smoothness is relatively low, the frame interval threshold may be slightly greater than the system frame interval.

For example, FluencyDetectorThread may determine that the drawing times of the 7 frames are T1-T7. From these drawing times, FluencyDetectorThread can determine the frame intervals between adjacent frames among the 7 frames: S1-S6. Suppose the preset frame interval threshold is 17 milliseconds, S1=S2=S3=S4=S6=16.6 milliseconds, and S5=19 milliseconds. Since the frame interval S5 between F5 and F6 exceeds the 17-millisecond threshold, FluencyDetectorThread can determine that a frame was dropped when the electronic device 100 displayed F6.
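A minimal sketch of the statistics described above, applied to this 7-frame example, is given below; the method and variable names, as well as the printed summary, are illustrative only and assume drawing times given as millisecond timestamps.

```java
// Illustrative sketch: deriving frame intervals, longest interval, accumulated
// frame loss, refresh completion rate, and drop count from drawing times.
final class IntervalAnalysis {
    static void analyze(long[] drawTimesMs, double systemIntervalMs, double thresholdMs) {
        double longest = 0, accumulatedLossMs = 0;
        int dropCount = 0;
        for (int i = 1; i < drawTimesMs.length; i++) {
            double interval = drawTimesMs[i] - drawTimesMs[i - 1];   // S1 .. S6
            longest = Math.max(longest, interval);
            if (interval > systemIntervalMs) {
                accumulatedLossMs += interval - systemIntervalMs;    // frame loss time
            }
            if (interval > thresholdMs) {
                dropCount++;                                         // e.g. S5 = 19 ms > 17 ms
            }
        }
        double elapsedMs = drawTimesMs[drawTimesMs.length - 1] - drawTimesMs[0];
        double completionRate = drawTimesMs.length / (elapsedMs / systemIntervalMs + 1);
        System.out.printf("longest=%.1f ms, loss=%.1f ms, completion=%.4f, drops=%d%n",
                longest, accumulatedLossMs, completionRate, dropCount);
    }
}
```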
Optionally, after determining that a frame interval during detection was longer than the frame interval threshold (that a frame was dropped), FluencyDetectorThread may determine that the screen display stuttered and mark the detection scenario as a problem scenario.

Optionally, a tolerance may also be set in the electronic device 100, indicating the number of times a frame interval is allowed to be longer than the frame interval threshold. For example, with a tolerance of 3, if the frame interval exceeds the threshold only once in the whole detection scenario, FluencyDetectorThread can still determine that the screen display was smooth, and the detection scenario is not a problem scenario. If the frame interval exceeds the threshold 4 times during detection, FluencyDetectorThread can determine that the screen display stuttered, and the detection scenario may be marked as a problem scenario.
In some embodiments, FluencyDetectorThread may also use parameters such as the average frame interval, the accumulated frame loss duration, and the refresh completion rate to determine whether frames were dropped and whether the screen display was smooth.

For example, even if the longest frame interval, or several frame intervals, exceed the frame interval threshold, the screen display may still be determined to be smooth if the average frame interval is below the frame interval threshold. Similarly, if the longest frame interval, or several frame intervals, exceed the threshold but the accumulated frame loss duration does not reach a preset value, the screen display may be determined to be smooth; and if the refresh completion rate satisfies a preset value, the screen display may likewise be determined to be smooth.
In some embodiments, FluencyDetectorThread may also determine whether frames were dropped during detection based on the VSync periods in the drawing information. SurfaceFlinger should composite and send one frame of image in each VSync period, which means that the VSync period identifiers of two adjacently displayed frames should be consecutive. If the VSync period identifiers of two adjacent frames in the drawing information recorded by FluencyDetector are not consecutive, the electronic device 100 can determine that a frame was dropped during detection.

For example, in theory each frame in fig. 6 corresponds to one VSync period, so the VSync periods of F1-F7 are VSync1-VSync7, respectively. If, during detection, the VSync period of F5 is VSync5 while the VSync period of F6 is VSync7, FluencyDetectorThread can likewise determine that a frame was dropped between F5 and F6.
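A minimal sketch of this continuity check is shown below, assuming the VSync period identifiers of consecutive displayed frames have been collected into an integer array; the names are illustrative.

```java
// Illustrative sketch: a frame drop is flagged when adjacent frames skip a VSync period.
final class VsyncContinuityCheck {
    static boolean hasDroppedFrame(int[] vsyncIds) {
        for (int i = 1; i < vsyncIds.length; i++) {
            if (vsyncIds[i] - vsyncIds[i - 1] > 1) {   // e.g. VSync5 followed by VSync7
                return true;
            }
        }
        return false;
    }
}
```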
In some embodiments, after determining that the screen display stuttered, FluencyDetectorThread may also obtain the processor load and use it to decide whether the stuttering detection scenario is a problem scenario. Optionally, the processor load may include the central processing unit (CPU) load and/or the GPU load.

Screen display stutter is understandable when the processor load is too high; in that case the problem can be addressed by improving the processing capacity of the processor and optimizing the processing strategy. If the processor is not overloaded but the screen display still stutters, the cause of the stutter is unknown, and developers need to further study and analyze the field data of that scenario to determine it. In view of this, FluencyDetectorThread marks a detection scenario with screen display stutter as a problem scenario only when the processor load is below the overload threshold, that is, when the processor is not overloaded. The overload threshold is preset.
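The decision just described might look like the following minimal sketch; the threshold value and all names are assumptions for illustration, not values defined by this application.

```java
// Illustrative sketch: keep field data only when the stutter is not explained
// by processor overload.
final class FieldDataGate {
    static final double OVERLOAD_THRESHOLD = 0.90;   // assumed value, e.g. 90% CPU/GPU load

    static boolean shouldSaveFieldData(boolean screenNotSmooth, double processorLoad) {
        return screenNotSmooth && processorLoad < OVERLOAD_THRESHOLD;
    }
}
```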
When FluencyDetectorThread has determined parameters such as the frame intervals, the longest frame interval, the average frame interval, the accumulated frame loss duration, and the refresh completion rate, it may write their values into the previously created detection information table (DetectionInfo). In embodiments that determine whether the processor is overloaded, the detection information table may further include a processor load field recording the processor load of the electronic device 100 during detection.
Taking the detection scenario shown in fig. 6 as an example, after the detection is finished, the electronic device 100 may obtain the detection information table shown in table 4:
Table 4

| Attribute | Value |
| Layer name | Control layer 20 |
| System refresh rate | 60 Hz |
| Start time | T-BEGIN |
| End time | T-END |
| Number of frames | 7 |
| Longest frame interval | 19 ms |
| Accumulated frame loss duration | 2.4 ms |
| Refresh completion rate | 0.9833 |

where T-BEGIN is the time at which FluencyDetector received the start-detection instruction and T-END is the time at which it received the end-detection instruction. As shown in fig. 6, the number of image frames refreshed by the electronic device 100 during detection is 7, and the frame intervals between the 7 frames are, in order: 16.6, 16.6, 16.6, 16.6, 19, and 16.6 milliseconds. The longest frame interval is 19 milliseconds, the accumulated frame loss duration is 2.4 milliseconds, and the refresh completion rate is 0.9833 (59/60). It will be appreciated that the values of the attributes in Table 4 above are exemplary.
S412. Confirm the problem scenario and save the field data of the problem scenario.

S411 describes how FluencyDetectorThread determines the detection information, i.e., the frame intervals, the longest frame interval, the average frame interval, the accumulated frame loss duration, the refresh completion rate, and so on; the methods for combining this information to determine whether the detection scenario is a problem scenario have also been introduced above and are not repeated here.
The field data of the problem scene can be saved and reported by the fluency detector thread after the problem scene is determined. The field data includes at least the detection information described in the detection information table. Further, the field data may further include each frame of image of the change layer and corresponding drawing information thereof. For example, after determining that the detected scene shown in fig. 6 is a problem scene, the detection information shown in table 4 is field data. Further, the field data may further include each frame of images F1-F7 updated by the change layer in the detected scene shown in fig. 6, and corresponding rendering information thereof.
The log module may then generate a detection log corresponding to each problem scenario. The detection log includes the field data. Furthermore, the log module can upload the detection log to the cloud for storage through the synchronization module and the synchronization interface for subsequent analysis. The synchronization module may report the field data of the problem scene stored in the electronic device 100 to the cloud after the field data reaches a certain scale, so as to improve the data transmission efficiency.
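The batching behavior of the synchronization module could be sketched as follows; the class name, the BATCH_SIZE trigger, and the string-based log payload are all assumptions for illustration.

```java
// Hypothetical sketch: the synchronization module buffers field data of
// problem scenes and uploads them in a batch once a size threshold is
// reached, reducing the number of cloud transmissions.
import java.util.ArrayList;
import java.util.List;

public class SyncModule {
    private static final int BATCH_SIZE = 20; // assumed trigger size
    private final List<String> pendingLogs = new ArrayList<>();

    public synchronized void report(String detectionLog) {
        pendingLogs.add(detectionLog);
        if (pendingLogs.size() >= BATCH_SIZE) {
            uploadToCloud(new ArrayList<>(pendingLogs));
            pendingLogs.clear();
        }
    }

    private void uploadToCloud(List<String> logs) {
        // Placeholder: on a real device this would call the cloud sync interface.
        System.out.println("Uploading " + logs.size() + " detection logs");
    }
}
```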
In fig. 4, the instruction to start detection shown in S401 may be referred to as a first instruction, and the instruction to end detection shown in S408 may be referred to as a second instruction. The first information includes the inter-frame distances, the inter-frame distance threshold, the system inter-frame distance, the latitude, the average inter-frame distance, and the like. The first information may further include the system refresh rate, the start time, the end time, the number of frames, the longest frame interval, the accumulated frame loss duration, the refresh completion rate, etc., as described with reference to S411. As described with reference to S406, the change layer may be referred to as a first layer, such as the control layer 20.
By implementing the smoothness detection method provided in the above embodiment, an upper layer application installed on the electronic device 100 can call the smoothness detection method provided in the present application at any time, obtain the drawing information during a screen refreshing period, and then determine, based on the drawing information, whether a frame dropping problem occurred during that period. After confirming a frame dropping problem, the electronic device 100 can save the field data and upload it to the cloud for storage, thereby providing effective data support for subsequent analysis of the problem scene and optimization of screen refreshing.
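For example, an upper layer application could wrap any animation of interest between the two calls. The following Java sketch is illustrative only; the client API names (FluencyDetectorClient, startDetection, stopDetection) are hypothetical stand-ins for the first and second instructions rather than an existing framework interface.

```java
// Hypothetical usage sketch: an upper layer application wraps a screen
// refresh period (e.g. an animation) between a "start detection" call
// (first instruction) and an "end detection" call (second instruction).
public class AnimationRunner {
    private final FluencyDetectorClient detector; // hypothetical client API

    public AnimationRunner(FluencyDetectorClient detector) {
        this.detector = detector;
    }

    public void runAnimation(Runnable animation) {
        detector.startDetection();    // first instruction
        try {
            animation.run();          // screen refreshing happens here
        } finally {
            detector.stopDetection(); // second instruction; statistics follow
        }
    }
}

// Hypothetical interface, not an existing framework API.
interface FluencyDetectorClient {
    void startDetection();
    void stopDetection();
}
```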
Fig. 7 is a timing chart of smoothness detection performed by a desktop application in a scenario of sliding a switching desktop according to an embodiment of the present application.
As shown in fig. 7, the timing diagram includes a smooth scrolling module, a layer normalization module, a smoothness detection module, and a synchronization module. The smooth scrolling module is used for realizing smooth scrolling of the window view; in this embodiment, the smooth scrolling module corresponds to an upper layer application. The smooth scrolling module may detect whether a frame drop problem occurs in the smoothly scrolled scene while implementing smooth scrolling of the window view. The layer normalization module may be used to determine the target layer on which the user operation focuses. In general, the target layer on which the user operation focuses is the layer whose image content change the user can intuitively perceive in the detection scene.
In this architecture, the smooth scrolling module may be an OverScroller, and the smoothness detection module FlingJankDetector may be used to determine the target layer on which the user operation focuses, i.e., the layer to be detected. Furthermore, the FlingJankDetector can be regarded as a receiving module (SurfaceFlingerEx) in the OverScroller. The global window management module WindowGlobalManager may be used to provide all layers of a window.
S701. The OverScroller sends an instruction to start detection to the FlingJankDetector.
As shown in fig. 7, the OverScroller may send an instruction to start detection to the FlingJankDetector in the course of performing smooth scrolling. For example, referring to the user interface shown in fig. 5A, after detecting a sliding operation by the user, the OverScroller may determine that a smooth scrolling action is to be performed. At this point, the OverScroller may trigger the fluency detection program and issue the instruction to start detection.
S702. Upon receiving the instruction to start detection, the FlingJankDetector may send an instruction to the WindowGlobalManager to acquire the window views, for example GetWindowView().
S703. The WindowGlobalManager returns all layers of all window views. In response to the instruction to acquire the window views, the WindowGlobalManager may determine all layers of all current windows and return them to the FlingJankDetector.
S704. The FlingJankDetector traverses all layers to determine the target layer on which the user operation focuses.
After receiving all the layers of the current window view returned by the WindowGlobalManager, the FlingJankDetector may determine which of these layers holds the window focus, that is, the layer to be refreshed by the sliding operation. The layer holding the window focus is the target layer on which the user operation focuses. The FlingJankDetector may determine the layer holding the window focus according to information such as the position on the screen where the user operation acts, the type of the operation (sliding, clicking, double clicking, long pressing, etc.), and the scene in which the operation occurs.
Assuming the current view is the user interface shown in FIG. 5A, the FlingJankDetector may determine that all layers of the current view include: the control layer 20, the status bar layer 21, and the background layer 22. At this time, the FlingJankDetector may determine that the layer focused by the user's sliding operation is the control layer 20, i.e., the control layer 20 is the layer holding the window focus. The control layer 20 is therefore the target layer.
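A Java sketch of the traversal in S704 is given below; the WindowLayer type and its accessors are hypothetical stand-ins for the actual layer objects returned by the window management module.

```java
// Hypothetical sketch of S704: traverse all layers returned by the window
// manager and pick the layer that holds the window focus as the target layer.
import java.util.List;

public class TargetLayerResolver {
    /** Minimal stand-in for a window layer; not a framework class. */
    public static class WindowLayer {
        final String name;
        final boolean hasWindowFocus;
        WindowLayer(String name, boolean hasWindowFocus) {
            this.name = name;
            this.hasWindowFocus = hasWindowFocus;
        }
    }

    /** Returns the layer with window focus, or null if none is focused. */
    public WindowLayer findTargetLayer(List<WindowLayer> allLayers) {
        for (WindowLayer layer : allLayers) {
            if (layer.hasWindowFocus) {
                return layer; // e.g. the control layer under a sliding operation
            }
        }
        return null;
    }
}
```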
S705. The FlingJankDetector instructs the fluency detection module to start detecting the dropped frame condition of the target layer.
Specifically, the FlingJankDetector may send an instruction to start detection to the fluency detection module. In response to this instruction, referring to the timing diagram shown in fig. 4, each time the Layer module determines that the control layer 20 has changed, the fluency detector in the fluency detection module may acquire the drawing information of the control layer 20 from the Layer module and record it; for details, refer to the description of S404-S407 in fig. 4.
S706. At the moment the upper layer application determines to end the detection, for example after one sliding operation ends, the OverScroller sends an instruction to end the detection to the FlingJankDetector. S707. In response to this instruction, the FlingJankDetector instructs the fluency detection module to stop detecting the dropped frame condition of the target layer.
In combination with the timing chart shown in fig. 4, after receiving the above instruction to stop detecting the target layer, the Layer module no longer sends the currently refreshed layer and its drawing information to the smoothness detection module. That is, the fluency detection module no longer obtains new images of the control layer 20 from the Layer module, nor the drawing information of those frames.
Meanwhile, the fluency detector thread in the fluency detection module can begin compiling statistics on the frame refresh situation during the detection process.
S708, determining a problem scene of screen display blocking based on the recorded drawing information.
The fluency detection module can perform statistical analysis on the recorded drawing information to determine detection information such as the frame interval, and thereby determine whether frames were dropped during the detection process and whether the screen display was fluent. Scenes in which the screen display is stuck may be marked as problem scenes.
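Combining the frame interval threshold and the latitude M mentioned above, the judgment could be sketched as follows; the helper class and parameter names are illustrative assumptions.

```java
// Hypothetical sketch: decide whether the detected scene is a problem scene
// by counting frame intervals that exceed the frame interval threshold and
// comparing the count against the latitude M.
public final class JankJudge {
    /**
     * @param frameIntervalsMs    recorded frame intervals during detection
     * @param intervalThresholdMs frame interval threshold (>= system frame interval)
     * @param latitudeM           allowed number of over-threshold intervals
     * @return true if the screen display is judged not smooth
     */
    public static boolean isNotSmooth(double[] frameIntervalsMs,
                                      double intervalThresholdMs,
                                      int latitudeM) {
        int overCount = 0;
        for (double interval : frameIntervalsMs) {
            if (interval > intervalThresholdMs) {
                overCount++;
            }
        }
        return overCount > latitudeM; // N > M means the display is not smooth
    }
}
```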
While determining the detection information, the fluency detection module may record it, for example by writing it into the detection information table. When the detected scene is determined to be a problem scene, the detection information constitutes the field data, and the fluency detection module can obtain the field data of the problem scene directly from the detection information table.
S709, reporting field data of the problem scene.
After confirming that the detected scene is a problem scene, the fluency detection module can send the field data of the problem scene to the synchronization module, and then report the field data to the cloud for storage for subsequent analysis.
Fig. 8 exemplarily shows a hardware configuration diagram of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present invention is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a mini-LED, a micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
In the embodiment of the present application, the electronic device 100 implements the screen refresh function by relying on the display capabilities provided by the GPU, the display screen 194, the application processor, and the like. Further, the electronic device 100 may detect whether a frame dropping problem affecting the smoothness of screen display occurs during the above screen refreshing process.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The random access memory may include static random-access memory (SRAM), dynamic random-access memory (dynamic random access memory, DRAM), synchronous dynamic random-access memory (synchronous dynamic random access memory, SDRAM), double data rate synchronous dynamic random-access memory (double data rate synchronous dynamic random access memory, DDR SDRAM, e.g., fifth generation DDR SDRAM is commonly referred to as DDR5 SDRAM), etc. The nonvolatile memory may include a disk storage device, a flash memory (flash memory).
The flash memory may include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc., divided according to the operation principle; may include single-level memory cells (SLC), multi-level memory cells (MLC), triple-level memory cells (TLC), quad-level memory cells (QLC), etc., divided according to the level of the memory cell; and may include universal flash storage (universal flash storage, UFS), embedded multimedia memory cards (embedded multi media card, eMMC), etc., divided according to the storage specification.
The random access memory may be read directly from and written to by the processor 110, may be used to store executable programs (e.g., machine instructions) for an operating system or other on-the-fly programs, may also be used to store data for users and applications, and the like.
The nonvolatile memory may store executable programs, store data of users and applications, and the like, and may be loaded into the random access memory in advance for the processor 110 to directly read and write.
In the embodiment of the present application, the program code for implementing smoothness detection by the electronic device 100 may be stored in the nonvolatile memory; when the smoothness detection method is called for detection, the electronic device 100 may load the program code into a random access memory and execute the program code, thereby realizing smoothness detection provided by the application.
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device 100. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external nonvolatile memory.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of the flip cover using the magnetic sensor 180D. The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary.
The distance sensor 180F is used to measure a distance. The electronic device 100 may measure the distance by infrared or laser. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc. The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device 100 performs a temperature processing strategy using the temperature detected by the temperature sensor 180J.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
In the embodiment of the present application, the electronic device 100 may receive a touch operation of a user, such as a sliding operation shown in fig. 5A, through the touch sensor 180K.
The bone conduction sensor 180M may acquire a vibration signal. The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195 to enable contact and separation with the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1.
The term "User Interface (UI)" in the description and claims of the present application and in the drawings is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. The user interface of the application program is source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, the interface source code is analyzed and rendered on the terminal equipment, and finally the interface source code is presented as content which can be identified by a user, such as a picture, characters, buttons and the like. Controls (controls), also known as parts (widgets), are basic elements of a user interface, typical controls being toolbars (toolbars), menu bars (menu bars), text boxes (text boxes), buttons (buttons), scroll bars (scrollbars), pictures and text. The properties and content of the controls in the interface are defined by labels or nodes, such as XML specifies the controls contained in the interface by nodes of < Textview >, < ImgView >, < VideoView >, etc. One node corresponds to a control or attribute in the interface, and the node is rendered into visual content for a user after being analyzed and rendered. In addition, many applications, such as the interface of a hybrid application (hybrid application), typically include web pages. A web page, also referred to as a page, is understood to be a special control embedded in an application program interface, and is source code written in a specific computer language, such as hypertext markup language (hyper text markup language, GTML), cascading style sheets (cascading style sheets, CSS), java script (JavaScript, JS), etc., and the web page source code may be loaded and displayed as user-recognizable content by a browser or web page display component similar to the browser function. The specific content contained in a web page is also defined by tags or nodes in the web page source code, such as GTML defines elements and attributes of the web page by < p >, < img >, < video >, < canvas >.
A commonly used presentation form of the user interface is a graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in this application refers to and encompasses any or all possible combinations of one or more of the listed items. As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to determining …" or "in response to detecting …" depending on the context. Similarly, the phrase "upon determining …" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined …" or "in response to determining …" or "upon detecting (a stated condition or event)" or "in response to detecting (a stated condition or event)" depending on the context.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, produces a flow or function in accordance with embodiments of the present application, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line), or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), etc.
Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program to instruct related hardware, the program may be stored in a computer readable storage medium, and the program may include the above-described method embodiments when executed. And the aforementioned storage medium includes: ROM or random access memory RAM, magnetic or optical disk, etc.
Claims (12)
1. A smoothness detection method applied to an electronic device with a screen, the method comprising:
after a first instruction is acquired, starting to acquire the drawing time of the image displayed by the screen;
after the second instruction is acquired, stopping acquiring the drawing time of the image displayed by the screen;
determining first information according to the drawing time of the image acquired between the first instruction and the second instruction; the first information is used for reflecting the fluency of the screen display image between the first instruction and the second instruction;
when the first information indicates that the screen display image is not smooth, saving field data of the screen display image between the first instruction and the second instruction; the field data includes the first information.
2. The method of claim 1, wherein the first information includes a frame spacing, the frame spacing being a time interval during which the screen displays two frames of images in succession;
when the inter-frame distance is longer than an inter-frame distance threshold, the first information indicates that the screen display image is not smooth.
3. The method of claim 2, wherein a system refresh rate is preset in the electronic device, the system refresh rate corresponds to a system frame spacing, and a frame spacing threshold is greater than or equal to the system frame spacing.
4. A method according to claim 2 or 3, characterized in that a latitude M is also provided in the electronic device, said M being used to indicate the number of times that the inter-frame distance is allowed to be longer than the inter-frame distance threshold;
when the frame interval is longer than the frame interval threshold, the first information indicates that the screen display image is not smooth, specifically: for a plurality of frame intervals corresponding to the images displayed on the screen between the first instruction and the second instruction, when N frame intervals are longer than a frame interval threshold, the first information indicates that the images displayed on the screen are not smooth, and N is larger than M, and N and M are positive integers.
5. The method of claim 2, wherein the first information further comprises an average inter-frame distance, and wherein the first information indicates that the screen display image is smooth when at least one of the inter-frame distances is longer than an inter-frame distance threshold, but the average inter-frame distance is less than the inter-frame distance threshold.
6. The method of any of claims 2-5, wherein prior to saving the field data of the screen display image between the first instruction and the second instruction, the method further comprises:
acquiring a processor load between the first instruction and the second instruction;
when the first information indicates that the screen display image is not smooth, saving the field data of the screen display image between the first instruction and the second instruction, specifically: and when the first information indicates that the screen display image is not smooth and the processor load is lower than an overload threshold, saving the field data of the screen display image between the first instruction and the second instruction.
7. The method of any one of claims 1-6, wherein the field data further comprises one or more of: the system refresh rate, the image displayed on the screen, and the drawing time of the image displayed on the screen.
8. The method according to any one of claims 1-7, further comprising: and sending the field data to a cloud for storage.
9. The method of claim 1, wherein a frame of image for the screen display is comprised of one or more layers; the step of obtaining the drawing time of the image displayed by the screen comprises the following steps: and acquiring drawing time of a first layer, wherein the first layer is a layer with image content changed in the detection process.
10. The method of claim 9, wherein prior to obtaining the drawing time for the first layer, the method further comprises: and determining the first image layer through window focus.
11. An electronic device comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories for storing computer program code comprising computer instructions that, when executed by the one or more processors, cause the method of any of claims 1-10 to be performed.
12. A computer readable storage medium comprising instructions which, when run on an electronic device, cause the method of any one of claims 1-10 to be performed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210950178.7A CN116048933B (en) | 2022-08-09 | 2022-08-09 | Fluency detection method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210950178.7A CN116048933B (en) | 2022-08-09 | 2022-08-09 | Fluency detection method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116048933A true CN116048933A (en) | 2023-05-02 |
CN116048933B CN116048933B (en) | 2024-05-03 |
Family
ID=86127868
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210950178.7A Active CN116048933B (en) | 2022-08-09 | 2022-08-09 | Fluency detection method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116048933B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116916093A (en) * | 2023-09-12 | 2023-10-20 | 荣耀终端有限公司 | Method for identifying clamping, electronic equipment and storage medium |
CN117097883A (en) * | 2023-10-20 | 2023-11-21 | 荣耀终端有限公司 | Frame loss fault cause determining method, electronic equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109165154A (en) * | 2018-09-10 | 2019-01-08 | 广东小天才科技有限公司 | Display interface fluency statistical method and system, mobile terminal and server |
CN111159042A (en) * | 2019-12-31 | 2020-05-15 | 可牛网络技术(北京)有限公司 | Fluency testing method and device and electronic equipment |
CN112711519A (en) * | 2019-10-25 | 2021-04-27 | 腾讯科技(深圳)有限公司 | Method and device for detecting fluency of picture, storage medium and computer equipment |
CN114648951A (en) * | 2022-02-28 | 2022-06-21 | 荣耀终端有限公司 | Method for controlling dynamic change of screen refresh rate and electronic equipment |
-
2022
- 2022-08-09 CN CN202210950178.7A patent/CN116048933B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109165154A (en) * | 2018-09-10 | 2019-01-08 | 广东小天才科技有限公司 | Display interface fluency statistical method and system, mobile terminal and server |
CN112711519A (en) * | 2019-10-25 | 2021-04-27 | 腾讯科技(深圳)有限公司 | Method and device for detecting fluency of picture, storage medium and computer equipment |
CN111159042A (en) * | 2019-12-31 | 2020-05-15 | 可牛网络技术(北京)有限公司 | Fluency testing method and device and electronic equipment |
CN114648951A (en) * | 2022-02-28 | 2022-06-21 | 荣耀终端有限公司 | Method for controlling dynamic change of screen refresh rate and electronic equipment |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116916093A (en) * | 2023-09-12 | 2023-10-20 | 荣耀终端有限公司 | Method for identifying clamping, electronic equipment and storage medium |
CN116916093B (en) * | 2023-09-12 | 2023-11-17 | 荣耀终端有限公司 | Method for identifying clamping, electronic equipment and storage medium |
CN117097883A (en) * | 2023-10-20 | 2023-11-21 | 荣耀终端有限公司 | Frame loss fault cause determining method, electronic equipment and storage medium |
CN117097883B (en) * | 2023-10-20 | 2024-04-12 | 荣耀终端有限公司 | Frame loss fault cause determining method, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN116048933B (en) | 2024-05-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN115473957B (en) | Image processing method and electronic equipment | |
CN110597512B (en) | Method for displaying user interface and electronic equipment | |
CN109559270B (en) | Image processing method and electronic equipment | |
CN116048933B (en) | Fluency detection method | |
CN113132526B (en) | Page drawing method and related device | |
CN113553130A (en) | Method for executing drawing operation by application and electronic equipment | |
WO2024083014A1 (en) | Interface generation method and electronic device | |
WO2024067551A1 (en) | Interface display method and electronic device | |
CN115964231A (en) | Load model-based assessment method and device | |
CN116483734B (en) | Pile inserting method and system based on compiler and related electronic equipment | |
WO2023016014A1 (en) | Video editing method and electronic device | |
WO2023071482A1 (en) | Video editing method and electronic device | |
CN114690975B (en) | Dynamic effect processing method and related device | |
CN115994006A (en) | Animation effect display method and electronic equipment | |
CN114816028A (en) | Screen refreshing method, electronic device and computer-readable storage medium | |
CN117692714B (en) | Video display method, electronic device, computer program product, and storage medium | |
CN117764853B (en) | Face image enhancement method and electronic equipment | |
CN116193275B (en) | Video processing method and related equipment | |
WO2024046010A1 (en) | Interface display method, and device and system | |
CN117221713B (en) | Parameter loading method and electronic equipment | |
WO2024083009A1 (en) | Interface generation method and electronic device | |
WO2022166550A1 (en) | Data transmission method and electronic device | |
WO2024061292A1 (en) | Interface generation method and electronic device | |
US20240061549A1 (en) | Application switching method, graphical interface, and related apparatus | |
CN117909000A (en) | Interface generation method and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |