CN117746762A - Display processing method and device and electronic equipment - Google Patents

Display processing method and device and electronic equipment

Info

Publication number
CN117746762A
Authority
CN
China
Prior art keywords
frame, display, image, equal, time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311777273.2A
Other languages
Chinese (zh)
Other versions
CN117746762B (English)
Inventor
张斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202311777273.2A priority Critical patent/CN117746762B/en
Publication of CN117746762A publication Critical patent/CN117746762A/en
Application granted granted Critical
Publication of CN117746762B publication Critical patent/CN117746762B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

The application discloses a display processing method, a display processing apparatus, and an electronic device. When the electronic device exits the standby idle state, a first time length T1 is determined, where the starting point t1 of T1 is the timestamp corresponding to the vertical synchronization (Vsync) signal of a first frame, and the end point t2 of T1 is the timestamp corresponding to the earliest time at which the display driver can modify the blanking-area porch in a second frame. T2 is then determined from T1, where T1 = n·T + T2 and T = 1/f1. When T2 < V1, the porch-modification operation is executed at time t4. V1 is the time length from the Vsync timestamp of the second frame to the latest time t3 at which the porch can be modified such that both the porch modification and the hardware interface (DSI) response to the cmd command are completed within the second frame, with t2 ≤ t4 ≤ t3. With this scheme, the electronic device can avoid screen corruption when it exits the standby state and the frame rate of the image to be displayed changes.

Description

Display processing method and device and electronic equipment
Technical Field
The present disclosure relates to the field of video playing technologies, and in particular, to a display processing method, an apparatus, and an electronic device.
Background
Screen corruption (a "splash screen") refers to the phenomenon in which irregular stripes, snowflakes, or other abnormal pictures appear on the display screen. When an electronic device exits the standby idle state and the frame rate of the image to be displayed differs from the frame rate used in the standby state, screen corruption sometimes occurs on the display screen of the electronic device, degrading the display effect.
How to avoid such screen corruption is a problem to be solved.
Disclosure of Invention
The application provides a display processing method, a display processing apparatus, and an electronic device, which can avoid screen corruption when the electronic device exits the standby state and the frame rate changes.
In a first aspect, a display processing method is provided, applied to an electronic device whose display screen is in a standby idle state at a first frame rate f1. The method includes: exiting the idle state when the image is updated, where the frame rate corresponding to the updated image is a second frame rate f2, and f2 ≠ f1; determining a first time length T1, where the starting point t1 of T1 is the timestamp corresponding to the vertical synchronization Vsync signal of a first frame, the first frame being the frame in which the electronic device enters the idle state, and the end point t2 of T1 is the timestamp corresponding to the earliest time at which the display driver can modify the blanking-area porch in a second frame, the second frame being the frame in which the electronic device exits the idle state; determining T2 from T1, where T1 = n·T + T2, T = 1/f1, n is an integer, n ≥ 0, and 0 < T2 < T; and executing the porch-modification operation at time t4 when T2 < V1. V1 is the time length from the timestamp of the Vsync signal of the second frame to the latest time t3 at which the porch can be modified such that both the porch modification and the hardware interface DSI response to the cmd command are completed within the second frame, where t2 ≤ t4 ≤ t3.
With this technical scheme for display processing, the porch-modification operation and the DSI response to the cmd command are completed within the same frame, so screen corruption can be avoided when the electronic device exits the standby state and the frame rate of the image to be displayed changes.
In another possible implementation, the method further includes: when T2 < V1, displaying the image on the display screen at f2 in the frame following the second frame.
In another possible implementation, when T2 ≥ V1, the porch-modification operation is executed at a time t6, where t6 lies in a third frame, the third frame is the frame immediately following the second frame, and the timestamp corresponding to the Vsync signal of the third frame is t5, with t5 = t3 + (T - V1) and t6 = t5 + T3, where T_VSYNC + T_VBP < T3 < V1, T_VSYNC is the duration of the Vsync signal of the third frame, and T_VBP is the duration of the vertical back porch of the third frame.
In another possible implementation, the method further includes: when T2 ≥ V1, displaying the image on the display screen at f2 in the frame following the third frame.
In another possible implementation, determining T2 from T1 includes: determining that the greatest common divisor of f1 and 1000 is x, and y = 1000/x; and determining T2 = (b % (y × 1000)) % T, where T1 = a (seconds) + b (microseconds). In the steps subsequent to determining T2 from T1, the value of T2 determined in this step is used.
In a second aspect, a display processing apparatus is provided, applied to an electronic device whose display screen is in a standby idle state at a first frame rate f1. The apparatus includes: a first processing unit, configured to exit the idle state when the image is updated, where the frame rate corresponding to the updated image is a second frame rate f2, and f2 ≠ f1; a first determining unit, configured to determine a first time length T1, where the starting point t1 of T1 is the timestamp corresponding to the vertical synchronization Vsync signal of a first frame, the first frame being the frame in which the electronic device enters the idle state, and the end point t2 of T1 is the timestamp corresponding to the earliest time at which the display driver can modify the blanking-area porch in a second frame, the second frame being the frame in which the electronic device exits the idle state; a second determining unit, configured to determine T2 from T1, where, in theory, T1 = n·T + T2, T = 1/f1, n is an integer, n ≥ 0, and 0 < T2 < T; and a modifying unit, configured to execute the porch-modification operation at time t4 when T2 < V1, where V1 is the time length from the timestamp of the Vsync signal of the second frame to the latest time t3 at which the porch can be modified such that both the porch modification and the hardware interface DSI response to the cmd command are completed within the second frame, with t2 ≤ t4 ≤ t3.
In another possible implementation, the apparatus further includes: a first display unit, configured to display the image at f2 in the frame following the second frame when T2 < V1.
In another possible implementation, the modifying unit is further configured to execute the porch-modification operation at time t6 when T2 ≥ V1, where t6 lies in a third frame, the third frame is the frame immediately following the second frame, and the timestamp corresponding to the Vsync signal of the third frame is t5, with t5 = t3 + (T - V1) and t6 = t5 + T3, where T_VSYNC + T_VBP < T3 < V1, T_VSYNC is the duration of the Vsync signal of the third frame, and T_VBP is the duration of the vertical back porch of the third frame.
In another possible implementation, the apparatus further includes: a second display unit, configured to display the image at f2 in the frame following the third frame when T2 ≥ V1.
In another possible implementation, in determining T2 from T1, the second determining unit is specifically configured to determine that the greatest common divisor of f1 and 1000 is x, and y = 1000/x, and to determine T2 = (b % (y × 1000)) % T, where T1 = a (seconds) + b (microseconds); the modifying unit uses the value of T2 determined in this step.
In a third aspect, an electronic device is provided, the electronic device comprising a processor, a display and a memory, the memory being configured to store a computer program, the display being configured to display an image, the processor being configured to invoke and run the computer program from the memory, such that the processor performs the display processing method according to the first aspect or any of the possible implementations of the first aspect.
In a fourth aspect, a chip is provided, which includes a processor, when the processor executes instructions, the processor executes the display processing method according to the first aspect or any one of the possible implementation manners of the first aspect.
In a fifth aspect, there is provided a computer readable storage medium storing a computer program, which when executed by a processor, causes the processor to perform the display processing method of the first aspect or any one of the possible implementation manners of the first aspect.
In the embodiments of the application, when the electronic device performs display processing, the display screen of the electronic device is in a standby idle state at a first frame rate f1. When the image is updated, the electronic device exits the idle state, and the frame rate corresponding to the updated image changes to f2, where f2 ≠ f1. A first time length T1 is then determined, where the starting point t1 of T1 is the timestamp corresponding to the vertical synchronization Vsync signal of a first frame, the first frame being the frame in which the electronic device enters the idle state, and the end point t2 of T1 is the timestamp corresponding to the earliest time at which the display driver can modify the blanking-area porch in a second frame, the second frame being the frame in which the electronic device exits the idle state. T2 is determined from T1, where T1 = n·T + T2, T = 1/f1, n is an integer, n ≥ 0, and 0 < T2 < T. When T2 < V1, the porch-modification operation is executed at time t4; V1 is the time length from the Vsync timestamp of the second frame to the latest time t3 at which the porch can be modified such that both the porch modification and the hardware interface DSI response to the cmd command are completed within the second frame, with t2 ≤ t4 ≤ t3. With this scheme, the porch-modification operation and the DSI response to the cmd command are completed within the same frame, so the electronic device can avoid screen corruption when it exits the standby state and the frame rate of the image to be displayed changes.
Drawings
FIG. 1 is a schematic diagram of screen corruption on an electronic device in the prior art;
FIG. 2A is a schematic diagram of the time lengths corresponding to the vertical back porch, the active area, and the vertical front porch of each frame of an image;
FIG. 2B is a schematic view of the display process flow of a video-mode screen;
FIG. 3A is a flow chart of a display processing method according to an embodiment of the present application;
FIG. 3B is a flow chart of a display processing method according to another embodiment of the present application;
FIG. 4A is a flow chart of a display processing method according to another embodiment of the present application;
FIG. 4B is a flow chart of a display processing method according to another embodiment of the present application;
FIG. 5A is a timing diagram, including the related time nodes, corresponding to a display processing method according to an embodiment of the present application;
FIG. 5B is a timing diagram, including the related time nodes, corresponding to a display processing method according to another embodiment of the present application;
FIG. 5C is a timing diagram, including the related time nodes, corresponding to a display processing method according to another embodiment of the present application;
FIG. 5D is a timing diagram, including the related time nodes, corresponding to a display processing method according to another embodiment of the present application;
FIG. 6 is a schematic structural diagram of an electronic device that performs the display processing method provided in the embodiments of the present application;
FIG. 7 is a schematic diagram of the software system of an electronic device that executes the display processing method provided in the embodiments of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
First, an application scenario of the display processing method provided in the embodiment of the present application will be described with reference to the accompanying drawings.
The display processing method provided in the embodiments of the present application can be applied to electronic devices with a video-mode screen, such as mobile phones. In the embodiments of the present application, the electronic device is described by taking a mobile phone as an example. It can be understood that the electronic device that executes the display processing method provided by the present application is not limited to a mobile phone and may be another terminal with a video-mode screen; the specific type of the electronic device is not limited here.
When the mobile phone has not been operated for more than a preset time, it enters the standby state to save power. In the standby state, if a preset idle-exit event is detected, for example the user swipes the display screen, the electronic device exits the standby state. On exiting the idle state, the mobile phone occasionally shows a brief screen-corruption phenomenon; an image displayed in the display area during such corruption is shown in fig. 1.
As shown in fig. 2A, the synchronization signal in the vertical direction of each frame of an image is Vsync, with period T; the durations corresponding to the vertical back porch, the active area, and the vertical front porch are T_VBP, T_DE, and T_VFP respectively.
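The vertical timing of fig. 2A can be summarized with a minimal sketch. The decomposition of the frame period into a Vsync pulse, back porch, active area, and front porch follows the figure and the T_VSYNC term introduced later in this document; the specific millisecond values below are illustrative assumptions, not from the patent.

```python
def frame_rate_hz(t_vsync_s: float, t_vbp_s: float, t_de_s: float, t_vfp_s: float) -> float:
    """The frame period T is the sum of the Vsync pulse, back porch,
    active area, and front porch durations in the vertical direction;
    the frame rate is its reciprocal."""
    return 1.0 / (t_vsync_s + t_vbp_s + t_de_s + t_vfp_s)

# Stretching the front porch lowers the frame rate without touching the
# active area, which is why porch modification can switch the frame rate.
f_before = frame_rate_hz(0.1e-3, 0.4e-3, 15.0e-3, 1.1666667e-3)  # ≈ 60 Hz
f_after = frame_rate_hz(0.1e-3, 0.4e-3, 15.0e-3, 4.5e-3)         # = 50 Hz
```
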
As shown in fig. 2B, the process of displaying an image on a mobile phone generally includes the following steps. An application (APP) obtains the content to be displayed; a surface composition service composes the content into the page to be displayed; a hardware composer (Hardware Composer, HWC) performs format conversion on the page and sends the output to the display driver; the display driver performs bottom-layer operations such as encoding/decoding and compression to load the picture into its buffer, and then sends the picture to the display driver integrated circuit (DDIC) through the display serial interface (Display Serial Interface, DSI) of the mobile industry processor interface (Mobile Industry Processor Interface, MIPI). Vsync is the frame synchronization signal of each frame: after the image numbered 1 is displayed, the image numbered 2 is displayed. If the mobile phone enters the standby state after displaying image 2, the upper layer no longer delivers new images, and the DSI hardware keeps sending image 2, stored in the buffer, to the DDIC; after the standby state ends, the upper layer resumes delivering new images, which are transmitted and displayed.
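The standby-repeat behavior of the pipeline described above can be illustrated with a toy model. This is purely a sketch of the behavior in the text: the class and method names are assumptions, not any real driver API.

```python
class VideoModeLink:
    """Toy model of a video-mode DSI link: a frame must be sent to the
    DDIC every Vsync, so while the upper layer is idle the link keeps
    re-sending the last frame stored in the display driver's buffer."""
    def __init__(self):
        self.buffer = None

    def submit(self, frame):
        # Upper layer (APP -> composer -> HWC -> driver) delivers a new frame.
        self.buffer = frame

    def on_vsync(self):
        # One transmission per Vsync period; repeats the buffered frame
        # when no new frame has been submitted (standby/idle state).
        return self.buffer

link = VideoModeLink()
link.submit("image-1")
link.on_vsync()                              # sends image-1
link.submit("image-2")
sent = [link.on_vsync() for _ in range(3)]   # standby: image-2 repeated
```
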
When the frame rate of the displayed image changes, the timing of the video-mode screen changes accordingly. Two switching methods are available, one of which is real switching; to reduce power consumption, real switching is adopted in this application, and during frame-rate switching the porch modification and the sending of the cmd command must both be completed.
Fig. 3A is a flow chart of a display processing method according to an embodiment of the present application. As shown in fig. 3A, the method includes steps 301 to 304, described in detail below. This embodiment is applied to an electronic device whose display screen is in a standby idle state at a first frame rate f1.
301. When the image is updated, the idle state is exited; the frame rate corresponding to the updated image is the second frame rate f2, and f2 ≠ f1.
302. A first time length T1 is determined, where the starting point t1 of T1 is the timestamp corresponding to the vertical synchronization Vsync signal of a first frame, the first frame being the frame in which the electronic device enters the idle state; the end point t2 of T1 is the timestamp corresponding to the earliest time at which the display driver can modify the blanking-area porch in a second frame, the second frame being the frame in which the electronic device exits the idle state.
303. T2 is determined from T1, where T1 = n·T + T2, T = 1/f1, n is an integer, n ≥ 0, and 0 < T2 < T.
Since computers use floating-point arithmetic and cannot be accurate to an infinite number of digits, T1 = n·T + T2 holds only in theory in the actual calculation; when T is a non-terminating decimal, the larger n is, the larger the accumulated error. To reduce the error, T2 can be determined as follows: determine that the greatest common divisor of f1 and 1000 is x, and y = 1000/x; then determine T2 = (b % (y × 1000)) % T, where T1 = a (seconds) + b (microseconds). In the steps after determining T2 from T1, the value of T2 determined in this way is used.
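The error-reduction step works because y × 1000 microseconds equals (f1/x)·T exactly (x divides f1), so whole frame periods can be stripped with integer arithmetic before a single floating-point modulo. A minimal sketch, with illustrative function and variable names not taken from the patent:

```python
from math import gcd

def reduce_t2_us(a_s: int, b_us: int, f1: int) -> float:
    """Reduce T1 = a_s seconds + b_us microseconds modulo the frame
    period T = 1/f1 with minimal floating-point error.

    Whole seconds are an exact multiple of T (1 s = f1 * T), so only
    b_us matters.  y * 1000 microseconds = (f1/x) * T exactly, because
    x = gcd(f1, 1000) divides f1; taking b_us modulo y * 1000 therefore
    strips whole frame periods in integer arithmetic, and only the
    final modulo by T is done in floating point.  Returns T2 in
    microseconds.
    """
    x = gcd(f1, 1000)
    y = 1000 // x
    t_us = 1_000_000 / f1  # frame period T in microseconds (float)
    return (b_us % (y * 1000)) % t_us

# At f1 = 60 Hz: x = 20, y = 50, so 50 000 us is exactly 3 frame periods,
# and an exact multiple of 50 000 us reduces to T2 = 0 with no rounding.
```
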
304. When T2 < V1, the porch-modification operation is executed at time t4. V1 is the time length from the timestamp of the Vsync signal of the second frame to the latest time t3 at which the porch can be modified such that both the porch modification and the hardware interface DSI response to the cmd command are completed within the second frame, where t2 ≤ t4 ≤ t3.
For example, in one possible implementation, as shown in fig. 5A, T2 < V1 and t4 = t2; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame.
For example, in another possible implementation, as shown in fig. 5B, T2 < V1 and t4 = t3; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame.
For example, in another possible implementation, as shown in fig. 5C, T2 < V1 and t2 < t4 < t3; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame.
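The scheduling constraint of step 304 can be sketched as follows. The function name and the earliest-first policy for picking t4 are illustrative assumptions; the patent only requires t2 ≤ t4 ≤ t3.

```python
def pick_modify_time(t2_us: float, t3_us: float, T2_us: float, V1_us: float):
    """Return a time t4 at which to run the porch modification inside
    the second frame, or None when T2 >= V1 and the modification must
    be deferred to the third frame.

    Any t4 with t2 <= t4 <= t3 is valid (figs. 5A-5C); scheduling at
    the earliest legal moment t2 is one simple policy (fig. 5A).
    """
    if T2_us < V1_us:
        t4_us = t2_us          # could equally be t3_us, or anywhere between
        assert t2_us <= t4_us <= t3_us
        return t4_us
    return None                # deferred to the third frame (fig. 5D)
```
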
With this technical scheme for display processing, the porch-modification operation and the DSI response to the cmd command are completed within the same frame, so screen corruption can be avoided when the electronic device exits the standby state and the frame rate of the image to be displayed changes.
Referring to fig. 3B, fig. 3B is a flow chart of a display processing method according to another embodiment of the present application. As shown in fig. 3B, the method includes steps 301 to 305, described in detail below.
301. When the image is updated, the idle state is exited; the frame rate corresponding to the updated image is the second frame rate f2, and f2 ≠ f1.
302. A first time length T1 is determined, where the starting point t1 of T1 is the timestamp corresponding to the vertical synchronization Vsync signal of a first frame, the first frame being the frame in which the electronic device enters the idle state; the end point t2 of T1 is the timestamp corresponding to the earliest time at which the display driver can modify the blanking-area porch in a second frame, the second frame being the frame in which the electronic device exits the idle state.
303. T2 is determined from T1, where T1 = n·T + T2, T = 1/f1, n is an integer, n ≥ 0, and 0 < T2 < T.
Since computers use floating-point arithmetic and cannot be accurate to an infinite number of digits, T1 = n·T + T2 holds only in theory in the actual calculation; when T is a non-terminating decimal, the larger n is, the larger the accumulated error. To reduce the error, T2 can be determined as follows: determine that the greatest common divisor of f1 and 1000 is x, and y = 1000/x; then determine T2 = (b % (y × 1000)) % T, where T1 = a (seconds) + b (microseconds). In the steps after determining T2 from T1, the value of T2 determined in this way is used.
304. When T2 < V1, the porch-modification operation is executed at time t4. V1 is the time length from the timestamp of the Vsync signal of the second frame to the latest time t3 at which the porch can be modified such that both the porch modification and the hardware interface DSI response to the cmd command are completed within the second frame, where t2 ≤ t4 ≤ t3.
For example, in one possible implementation, as shown in fig. 5A, T2 < V1 and t4 = t2; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame.
For example, in another possible implementation, as shown in fig. 5B, T2 < V1 and t4 = t3; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame.
For example, in another possible implementation, as shown in fig. 5C, T2 < V1 and t2 < t4 < t3; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame.
305. When T2 < V1, the display screen displays the image at the frame rate f2 in the frame following the second frame.
With this technical scheme for display processing, the porch-modification operation and the DSI response to the cmd command are completed within the same frame, so screen corruption can be avoided when the electronic device exits the standby state and the frame rate of the image to be displayed changes; in addition, displaying the image at the frame rate f2 in the frame following the second frame lets the display screen show the image at the new frame rate as early as possible.
Referring to fig. 4A, fig. 4A is a flow chart of a display processing method according to another embodiment of the present application. As shown in fig. 4A, the method includes steps 401 to 406, described in detail below.
401. When the image is updated, the idle state is exited; the frame rate corresponding to the updated image is the second frame rate f2, and f2 ≠ f1.
402. A first time length T1 is determined, where the starting point t1 of T1 is the timestamp corresponding to the vertical synchronization Vsync signal of a first frame, the first frame being the frame in which the electronic device enters the idle state; the end point t2 of T1 is the timestamp corresponding to the earliest time at which the display driver can modify the blanking-area porch in a second frame, the second frame being the frame in which the electronic device exits the idle state.
403. T2 is determined from T1, where T1 = n·T + T2, T = 1/f1, n is an integer, n ≥ 0, and 0 < T2 < T.
Since computers use floating-point arithmetic and cannot be accurate to an infinite number of digits, T1 = n·T + T2 holds only in theory in the actual calculation; when T is a non-terminating decimal, the larger n is, the larger the accumulated error. To reduce the error, T2 can be determined as follows: determine that the greatest common divisor of f1 and 1000 is x, and y = 1000/x; then determine T2 = (b % (y × 1000)) % T, where T1 = a (seconds) + b (microseconds). In the steps after determining T2 from T1, the value of T2 determined in this way is used.
404. It is determined whether T2 < V1. If so, step 405 is performed; otherwise, step 406 is performed.
405. The porch-modification operation is performed at time t4. V1 is the time length from the timestamp of the Vsync signal of the second frame to the latest time t3 at which the porch can be modified such that both the porch modification and the hardware interface DSI response to the cmd command are completed within the second frame, where t2 ≤ t4 ≤ t3.
For example, in one possible implementation, as shown in fig. 5A, T2 < V1 and t4 = t2; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame.
For example, in another possible implementation, as shown in fig. 5B, T2 < V1 and t4 = t3; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame.
For example, in another possible implementation, as shown in fig. 5C, T2 < V1 and t2 < t4 < t3; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame.
406. The porch-modification operation is performed at time t6, where t6 lies in a third frame, the third frame is the frame immediately following the second frame, and the timestamp corresponding to the Vsync signal of the third frame is t5, t5 = t3 + (T - V1); t6 = t5 + T3, where T_VSYNC + T_VBP < T3 < V1, T_VSYNC is the duration of the Vsync signal of the third frame, and T_VBP is the duration of the vertical back porch of the third frame.
For example, in one possible implementation, as shown in fig. 5D, T2 ≥ V1; the third frame is the frame immediately following the second frame, the timestamp corresponding to the Vsync signal of the third frame is t5, t5 = t3 + (T - V1), and t6 = t5 + T3, where T_VSYNC + T_VBP < T3 < V1, T_VSYNC is the duration of the Vsync signal of the third frame, and T_VBP is the duration of the vertical back porch of the third frame. In this embodiment, the porch modification and the hardware interface DSI response to the cmd command are both completed in the third frame.
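The deferred timing of step 406 amounts to simple arithmetic, sketched below; the function name and the microsecond units are illustrative assumptions.

```python
def third_frame_modify_time(t3_us: float, T_us: float, V1_us: float, T3_us: float) -> float:
    """When T2 >= V1 the porch modification does not fit in the second
    frame, so it runs in the third frame: t5 = t3 + (T - V1) is the
    third frame's Vsync timestamp, and the modification runs at
    t6 = t5 + T3, subject to T_VSYNC + T_VBP < T3 < V1."""
    t5_us = t3_us + (T_us - V1_us)   # Vsync timestamp of the third frame
    return t5_us + T3_us             # t6, inside the third frame

# e.g. t3 = 10_000, T = 16_667, V1 = 9_000, T3 = 3_000  ->  t6 = 20_667
```
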
With this technical scheme for display processing, the porch-modification operation and the DSI response to the cmd command are completed within the same frame, so screen corruption can be avoided when the electronic device exits the standby state and the frame rate of the image to be displayed changes.
Referring to fig. 4B, fig. 4B is a flow chart of a display processing method according to another embodiment of the present application. As shown in fig. 4B, the method includes steps 401 to 408, described in detail below.
401. When the image is updated, the idle state is exited; the frame rate corresponding to the updated image is the second frame rate f2, and f2 ≠ f1.
402. A first time length T1 is determined, where the starting point t1 of T1 is the timestamp corresponding to the vertical synchronization Vsync signal of a first frame, the first frame being the frame in which the electronic device enters the idle state; the end point t2 of T1 is the timestamp corresponding to the earliest time at which the display driver can modify the blanking-area porch in a second frame, the second frame being the frame in which the electronic device exits the idle state.
403. T2 is determined from T1, where T1 = n·T + T2, T = 1/f1, n is an integer, n ≥ 0, and 0 < T2 < T.
Since computers use floating-point arithmetic and cannot be accurate to an infinite number of digits, T1 = n·T + T2 holds only in theory in the actual calculation; when T is a non-terminating decimal, the larger n is, the larger the accumulated error. To reduce the error, T2 can be determined as follows: determine that the greatest common divisor of f1 and 1000 is x, and y = 1000/x; then determine T2 = (b % (y × 1000)) % T, where T1 = a (seconds) + b (microseconds). In the steps after determining T2 from T1, the value of T2 determined in this way is used.
404. It is determined whether T2 < V1. If so, step 405 is performed; otherwise, step 406 is performed.
405. The porch-modification operation is performed at time t4. V1 is the time length from the timestamp of the Vsync signal of the second frame to the latest time t3 at which the porch can be modified such that both the porch modification and the hardware interface DSI response to the cmd command are completed within the second frame, where t2 ≤ t4 ≤ t3.
For example, in one possible implementation, as shown in fig. 5A, T2 < V1 and t4 = t2; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame.
For example, in another possible implementation, as shown in fig. 5B, T2 < V1 and t4 = t3; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame.
For example, in another possible implementation, as shown in fig. 5C, T2 < V1 and t2 < t4 < t3; the porch modification and the hardware interface DSI response to the cmd command are both completed within the same frame. Then, step 407 is performed.
406. Performing the porch-modifying operation at time t6, wherein t6 is in a third frame, the third frame being the next frame adjacent to the second frame, and the timestamp corresponding to the Vsync signal of the third frame being t5, t5 = t3 + (T - V1); t6 = t5 + T3, wherein T_VSYNC + T_VBP < T3 < V1, T_VSYNC is the duration of the Vsync signal of the third frame, and T_VBP is the duration corresponding to the vertical back porch of the third frame.
For example, in one possible implementation, as shown in FIG. 5D, in this embodiment T2 ≥ V1, so the porch is modified in the third frame, the third frame being the next frame adjacent to the second frame; the timestamp corresponding to the Vsync signal of the third frame is t5, t5 = t3 + (T - V1), and t6 = t5 + T3, wherein T_VSYNC + T_VBP < T3 < V1, T_VSYNC is the duration of the Vsync signal of the third frame, and T_VBP is the duration corresponding to the vertical back porch of the third frame. In this embodiment, both the porch modification and the DSI's response to the cmd command are completed in the third frame; step 408 is then performed.
407. In the frame following the second frame, the display screen is caused to display the image at the frame rate f2.
408. In the frame following the third frame, the display screen is caused to display the image at the frame rate f2.
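Taken together, steps 404 to 408 amount to the following scheduling decision. This is a minimal sketch, assuming all inputs are in microseconds; the function name, the choice t4 = t2 in the first branch, and the midpoint choice of T3 in the second branch are illustrative assumptions, since the method itself only constrains t2 ≤ t4 ≤ t3 and T_VSYNC + T_VBP < T3 < V1.

```python
def schedule_porch_modification(T2, V1, t2, t3, T, T_vsync, T_vbp):
    """Sketch of steps 404-408 (hypothetical helper, not from the source).
    Returns which frame the porch modification lands in and the chosen time,
    so that the porch write and the DSI's response to the cmd command are
    completed within the same frame."""
    if T2 < V1:
        # Step 405: both operations fit in the second frame; any t4 with
        # t2 <= t4 <= t3 is valid -- the earliest is taken for illustration.
        t4 = t2
        return ("second frame", t4)   # step 407: display at f2 in next frame
    # Step 406: defer to the third frame, whose Vsync timestamp is
    # t5 = t3 + (T - V1); modify at t6 = t5 + T3, where T3 must satisfy
    # T_vsync + T_vbp < T3 < V1 (midpoint chosen here for illustration).
    t5 = t3 + (T - V1)
    T3 = (T_vsync + T_vbp + V1) / 2
    t6 = t5 + T3
    return ("third frame", t6)        # step 408: display at f2 in next frame
```

For example, with T = 16 667 µs and V1 = 10 000 µs, an offset T2 = 5 000 µs lands in the second frame, while T2 = 12 000 µs defers the modification to the third frame.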
When display processing is performed according to the above technical solution, the porch-modifying operation and the DSI's response to the cmd command are completed within the same frame, so that screen artifacts can be avoided when the electronic device exits the standby state and the frame rate of the image to be displayed changes. In addition, because the display screen is caused to display the image at the frame rate f2 in the frame following the second frame, the display screen can display at the new frame rate as early as possible.
Corresponding to the foregoing method embodiment, an embodiment of the present application further provides a display processing apparatus, applied to an electronic device whose display screen is in a standby idle state at a first frame rate f1. The apparatus includes a first processing unit, a first determining unit, a second determining unit, and a modifying unit. The first processing unit is configured to exit the idle state when the image is updated, wherein the frame rate corresponding to the updated image is a second frame rate f2, and f2 ≠ f1. The first determining unit is configured to determine a first time length T1, wherein the starting point t1 of T1 is the timestamp corresponding to the vertical synchronization (Vsync) signal of a first frame, the first frame being the frame in which the electronic device enters the idle state, and the end point t2 of T1 is the timestamp corresponding to the earliest time at which the display driver can modify the blanking-area porch in a second frame, the second frame being the frame in which the electronic device exits the idle state. The second determining unit is configured to determine T2 according to T1; wherein T1 = n × T + T2, T = 1/f1, n is an integer, n ≥ 0, and 0 < T2 < T. The modifying unit is configured to perform the porch-modifying operation at time t4 when T2 < V1; V1 is the duration between the timestamp corresponding to the Vsync signal of the second frame and the latest time t3 at which the porch can be modified such that both the porch modification and the hardware interface DSI's response to the cmd command are completed within the second frame; wherein t2 ≤ t4 ≤ t3.
In one possible implementation, the display processing apparatus further includes: a first display unit, configured to cause the display screen to display the image at f2 in the frame following the second frame when T2 < V1.
In one possible implementation, the modifying unit is further configured to perform the porch-modifying operation at time t6 when T2 ≥ V1, wherein t6 is in a third frame, the third frame being the next frame adjacent to the second frame, and the timestamp corresponding to the Vsync signal of the third frame being t5, t5 = t3 + (T - V1); t6 = t5 + T3, wherein T_VSYNC + T_VBP < T3 < V1, T_VSYNC is the duration of the Vsync signal of the third frame, and T_VBP is the duration corresponding to the vertical back porch of the third frame.
In one possible implementation, the display processing apparatus further includes: a second display unit, configured to cause the display screen to display the image at f2 in the frame following the third frame when T2 ≥ V1.
In one possible implementation, in determining T2 according to T1, the second determining unit is specifically configured to determine that the greatest common divisor of f1 and 1000 is x, and y = 1000/x; and to determine T2 = (b % (y × 1000)) % T, where T1 = a (seconds) + b (microseconds). The modifying unit uses the value of T2 determined by the second determining unit.
When display processing is performed according to the above technical solution, the porch-modifying operation and the DSI's response to the cmd command are completed within the same frame, so that screen artifacts can be avoided when the electronic device exits the standby state and the frame rate of the image to be displayed changes. In addition, because the display screen is caused to display the image at the frame rate f2 in the frame following the second frame, the display screen can display at the new frame rate as early as possible.
Referring to fig. 6, an embodiment of the present application further provides an electronic device 600 implementing the above display processing method, as shown in fig. 6, the electronic device 600 may include a processor 610, an external memory interface 620, an internal memory 621, a universal serial bus (universal serial bus, USB) interface 630, a charging management module 640, a power management module 641, a battery 642, an antenna 1, an antenna 2, a mobile communication module 650, a wireless communication module 660, an audio module 670, a speaker 670A, a receiver 670B, a microphone 670C, an earphone interface 670D, a sensor module 680, a key 690, a motor 691, an indicator 692, a camera 693, a screen 694, and a subscriber identity module (subscriber identification module, SIM) card interface 695. Among other things, the sensor module 680 may include a pressure sensor 680A, a gyroscope sensor 680B, a barometric pressure sensor 680C, a magnetic sensor 680D, an acceleration sensor 680E, a distance sensor 680F, a proximity light sensor 680G, a fingerprint sensor 680H, a temperature sensor 680J, a touch sensor 680K, an ambient light sensor 680L, a bone conduction sensor 680M, and the like.
The processor 610 may include one or more processing units, such as: the processor 610 may include an AP, CP, modem processor, graphics processor (graphics processing unit, GPU), image signal processor (image signal processor, ISP), controller, memory, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural Network Processor (NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 600, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 610 for storing instructions and data. In some embodiments, the memory in the processor 610 is a cache. The memory may hold instructions or data that the processor 610 has just used or used cyclically. If the processor 610 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 610, and thus improves system efficiency. The processor may also be provided with a low-power memory (e.g., an island low-power memory) to reduce power consumption.
The electronic device 600 implements display functions through a GPU, a screen 694, and an application processor, among others. The GPU is a microprocessor for image processing, connected to the screen 694 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 610 may include one or more GPUs that execute program instructions to generate or change display information.
Screen 694 is used to display images, video, etc. Screen 694 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a micro-OLED, a quantum-dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 600 may include 1 or N screens 694, N being an integer greater than 1.
The electronic device 600 may implement shooting functions through an ISP, a camera 693, a video codec, a GPU, a screen 694, an application processor, and the like.
The ISP is used to process the data fed back by the camera 693. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the light signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 693.
The camera 693 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device 600 may include 1 or N cameras 693, N being an integer greater than 1.
The external memory interface 620 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 600. The external memory card communicates with the processor 610 through an external memory interface 620 to implement data storage functions. Such as storing files of music, video, etc. in an external memory card.
The internal memory 621 may be used to store computer-executable program code that includes instructions. The processor 610 performs various functional applications of the electronic device 600 as well as data processing by executing instructions stored in the internal memory 621. The internal memory 621 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created by the electronic device 600 during use (e.g., audio data, phonebook, etc.), and so forth. In addition, the internal memory 621 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The acceleration sensor 680E may detect the magnitude of acceleration of the electronic device 600 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 600 is stationary. Acceleration sensor 680E may also be used to identify the pose of electronic device 600, for applications such as landscape switching, pedometers, and the like. Of course, the acceleration sensor 680E may also be combined with the gyro sensor 680B to recognize the gesture of the electronic device 600, and be applied to the landscape switching.
The gyroscope sensor 680B may be used to determine the motion posture of the electronic device 600. In some embodiments, the angular velocity of the electronic device 600 about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor 680B. The gyroscope sensor 680B may be used for image stabilization during shooting. For example, when the shutter is pressed, the gyroscope sensor 680B detects the shake angle of the electronic device 600, calculates the distance the lens module needs to compensate according to the angle, and causes the lens to counteract the shake of the electronic device 600 through reverse motion, thereby achieving stabilization. The gyroscope sensor 680B can also be used for landscape/portrait switching, navigation, and motion-sensing game scenes.
It should be understood that the structures illustrated in the embodiments of the present application do not constitute a particular limitation of the electronic device 600. In other embodiments of the present application, electronic device 600 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The electronic device provided in the embodiment of the present application may be a User Equipment (UE), for example, a mobile terminal (such as a mobile phone) and other devices.
In addition, an operating system runs on the above components, for example, the Android open-source operating system developed by Google.
The software system of the electronic device may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, a cloud architecture, or the like. In order to more clearly illustrate the identification method of the touch operation provided by the embodiment of the application, the embodiment of the application takes an Android (Android) system with a layered architecture as an example, and illustrates a software system of the electronic device.
Fig. 6 described above is only an example of the composition of an electronic device. Referring to fig. 7, a composition example of still another electronic device is provided in an embodiment of the present application.
In this example, the software system of the electronic device may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of the present application take the Android system with a layered architecture as an example to illustrate the software architecture of the electronic device.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, which are, from top to bottom: the application layer, the application framework layer, the Android Runtime (ART) and native C/C++ libraries, the hardware abstraction layer (HAL), and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 7, the electronic device may include a hardware layer and a software layer, where the Android system of the layered architecture may include an application layer, an application framework layer, a system library layer, and a kernel layer. In some alternative embodiments, the system of the electronic device may also include layers not mentioned in the above technical architecture, such as the Android Runtime. The application layer may include a series of application packages, such as a navigation application, a music application, a video application, a knuckle-tap screen application, and the like. The application packages may include applications such as video and chat, as well as the system user interface (System UI); the knuckle-tap screen application may be used for screenshots, screen recordings, long screenshots, area screenshots, etc. In this example, the application packages also include applications such as contact lookup.
Video, chat, etc. applications are used to provide corresponding services to users. For example, a user views a video using a video application, chatts with other users using a chat application, listens to music using a music application, generates recall video using existing images and video using video composition, and the like.
The System UI is used to manage the user interface (UI) of the electronic device; in the embodiment of the present application, the System UI is used to monitor touch operations on the touch screen.
The application framework layer provides an application programming interface (API) and a programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions. The application framework layer may include a window management service module (window manager service, WMS), a display rotation module, an application management service module (activity manager service, AMS), an input management module (also known as Input), an image processing module, and the like.
The WMS is used to manage windows. The window manager can acquire the size of the screen, determine whether there is a status bar, lock the screen, capture the screen, and the like. In the embodiment of the present application, the WMS can create and manage the windows corresponding to applications.
The display rotation module is used for controlling the screen to rotate, and the screen displays the layout of a vertical screen or a horizontal screen through rotation. And for example, when the screen rotation is determined to be needed, notifying the Surfaceflinger to switch the transverse screen and the vertical screen of the application interface.
The AMS is used to launch a specific application according to the user's operation. For example, after an image is synthesized, displaying the image on the screen is triggered; after the image is displayed, the matting operation is triggered for the image determined to require it; and an application stack corresponding to the video application is created so that the video application can run normally.
The system library layer may include a plurality of functional modules, such as: a sensor module (also known as a sensor) and a SurfaceFlinger.
The sensor module is used to acquire the data collected by the sensors, for example, the ambient light under the screen and the gravity-direction information of the electronic device. The sensor module may also adjust the brightness of the screen according to the ambient light, and determine the landscape/portrait state information of the electronic device according to its gravity-direction information, where the landscape/portrait state information indicates whether the electronic device is in a landscape state or a portrait state.
SurfaceFlinger is a system service used for functions such as the creation, control, and management of layers.
In addition, the system library layer may further include: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc. The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. In this embodiment of the present application, the kernel layer at least includes a touch driving module and a display driving module.
The display driving module is used to display the synthesized image on the screen according to the image data provided by the modules of the application framework layer and the applications of the application layer. For example, a video application transfers a frame of the video's image data to the display driving module, and the display driving module displays that frame of the video on the touch screen based on the image data. The System UI transfers image data to the display driving module, and the display driving module displays the synthesized image on the screen.
The touch driving module is used to monitor the capacitance data of each area of the touch screen. When the user clicks or slides on the touch screen, the capacitance value of the clicked or slid area changes; the touch driving module monitors the change of the capacitance value of each area on the touch screen and sends a capacitance-change message to the input management module, where the message carries information such as the magnitude and the time of the capacitance change in each area of the touch screen.
The input management module can determine touch operation according to the reported capacitance value change message, and then sends the identified touch operation to other modules. The touch operation herein may include a knuckle tap operation, a click operation, a drag operation, and a specific gesture operation (e.g., a swipe gesture operation, a sideslip gesture operation, etc.).
The hardware layer includes the screen, the ambient light sensor used to detect the ambient light information under the screen, and the like. The application processor (AP) monitors touch operations on the touch screen; the baseband processor (CP) monitors acceleration data and stores the monitored acceleration data in the storage module. When the AP detects a touch operation, the CP identifies whether the touch operation is a knuckle-tap action according to the acceleration data stored in the storage module, and the CP sends the identification result to the AP.
The above technical architecture exemplifies modules and devices in an electronic device that may be involved in the present application. In practical applications, the electronic device may include all or part of the modules and devices of the above technical architecture, and other modules and devices not mentioned in the above technical architecture, and of course, may also include only the modules and devices of the above technical architecture, which is not limited in this embodiment.
The embodiment of the application also provides electronic equipment, which comprises a processor, a display and a memory, wherein the memory is used for storing a computer program, the display is used for displaying images, and the processor is used for calling and running the computer program from the memory, so that the processor executes the display processing method according to any one of the method embodiments.
The embodiment of the application also provides a chip, which comprises a processor, wherein when the processor executes instructions, the processor executes the display processing method according to any one of the previous method embodiments.
The present application also provides a computer-readable storage medium storing a computer program, which when executed by a processor is capable of implementing the steps in the above-described method embodiments.
The present application provides a computer program product comprising a computer program enabling the implementation of the steps of the various method embodiments described above, when the computer program is executed by a processor.
All or part of the process in the method of the above embodiments may be implemented by a computer program, which may be stored in a computer readable storage medium and which, when executed by a processor, implements the steps of the method embodiments described above. Wherein the computer program comprises computer program code which may be in source code form, object code form, executable file or some intermediate form etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing device/electronic apparatus, recording medium, computer memory, read-only memory (ROM), random access memory (random access memory, RAM), electrical carrier signals, telecommunications signals, and software distribution media. Such as a U-disk, removable hard disk, magnetic or optical disk, etc. In some jurisdictions, computer readable media may not be electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and in part, not described or illustrated in any particular embodiment, reference is made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed method and electronic device may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (13)

1. A display processing method, applied to an electronic device, where a display screen of the electronic device is in a standby idle state at a first frame rate f1, the method comprising:
when the image is updated, the idle state is exited, the frame rate corresponding to the updated image is a second frame rate f2, and f2 is not equal to f1;
determining a first time length T1, wherein the starting point t1 of T1 is the timestamp corresponding to the vertical synchronization Vsync signal of a first frame, and the first frame is the frame in which the electronic device enters the idle state; the end point t2 of T1 is the timestamp corresponding to the earliest time at which the display driver can modify the blanking-area porch in a second frame, and the second frame is the frame in which the electronic device exits the idle state;
determining T2 according to the T1; wherein T1 = n × T + T2, T = 1/f1, n is an integer, n ≥ 0, and 0 < T2 < T;
performing the porch-modifying operation at time t4 when T2 < V1; the V1 is the duration between the timestamp corresponding to the Vsync signal of the second frame and the latest time t3 at which the porch can be modified such that both the porch modification and the hardware interface DSI's response to the cmd command are completed within the second frame; wherein t2 ≤ t4 ≤ t3.
2. The method according to claim 1, wherein the method further comprises:
and when T2 < V1, causing the display screen to display the image at f2 in the frame following the second frame.
3. The method of claim 1, wherein the step of determining the position of the substrate comprises,
performing the porch-modifying operation at time t6 when T2 ≥ V1, wherein the t6 is in a third frame, the third frame is the next frame adjacent to the second frame, and the timestamp corresponding to the Vsync signal of the third frame is t5, t5 = t3 + (T - V1); t6 = t5 + T3, wherein T_VSYNC + T_VBP < T3 < V1, the T_VSYNC is the duration of the Vsync signal of the third frame, and the T_VBP is the duration corresponding to the vertical back porch of the third frame.
4. The method according to claim 3, wherein the method further comprises:
when T2 ≥ V1, displaying, by the display screen, an image at f2 in the frame next to the third frame.
5. The method according to any one of claims 1 to 4, wherein said determining T2 according to T1 comprises: determining that the greatest common divisor of f1 and 1000 is x, and y = 1000/x; and determining T2 = (b % (y × 1000)) % T, wherein t1 = a (seconds) + b (microseconds);
wherein the value of T2 used in the steps following said determining T2 according to T1 is the value of T2 determined in this step.
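The arithmetic of claim 5 can be sketched in Python. The function name and units are assumptions, and the sketch further assumes an integer frame rate f1 that divides 1,000,000 evenly so that T is a whole number of microseconds:

```python
import math

# Hypothetical sketch of claim 5: x = gcd(f1, 1000), y = 1000/x,
# T2 = (b mod (y*1000)) mod T, where t1 = a (seconds) + b (microseconds).
def t2_from_timestamp(b_us: int, f1_hz: int) -> int:
    """Compute T2 from the microsecond part b of the timestamp t1.

    y*1000 microseconds (i.e. y milliseconds) is the shortest whole number
    of milliseconds that is also a whole number of frame periods, so the
    seconds part a can be discarded before taking the phase within one
    frame period T.
    """
    x = math.gcd(f1_hz, 1000)
    y = 1000 // x
    T_us = 1_000_000 // f1_hz   # frame period in microseconds
    return (b_us % (y * 1000)) % T_us
```

For instance, at f1 = 50 Hz: x = gcd(50, 1000) = 50, y = 20, and T = 20 000 µs, so a microsecond part b = 47 000 gives T2 = 7 000 µs.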
6. A display processing apparatus applied to an electronic device whose display screen is in a standby idle state at a first frame rate f1, the apparatus comprising:
a first processing unit, configured to exit the idle state when the image is updated, wherein a frame rate corresponding to the updated image is a second frame rate f2, and f2 ≠ f1;
a first determining unit, configured to determine a first time length T1, wherein a start point t1 of T1 is a timestamp corresponding to a vertical synchronization Vsync signal of a first frame, and the first frame is a frame in which the electronic device enters the idle state; an end point t2 of T1 is a timestamp corresponding to the earliest time at which a display driver can modify a blanking-area porch in a second frame, and the second frame is a frame in which the electronic device exits the idle state;
a second determining unit, configured to determine T2 according to T1, wherein T1 = n × T + T2, T = 1/f1, n is an integer, n ≥ 0, and 0 < T2 < T;
a modifying unit, configured to execute the porch-modification operation at a time t4 when T2 < V1, wherein V1 is the duration from the timestamp corresponding to the Vsync signal of the second frame to the latest time t3 at which the porch can be modified such that the porch modification and the hardware interface DSI's response to the cmd command are both completed within the second frame; and t2 ≤ t4 ≤ t3.
7. The apparatus of claim 6, wherein the apparatus further comprises:
a first display unit, configured to display an image at f2 in the frame next to the second frame when T2 < V1.
8. The apparatus according to claim 6, wherein:
the modifying unit is further configured to execute the porch-modification operation at a time t6 when T2 ≥ V1, wherein t6 is within a third frame, the third frame is the frame next to and adjacent to the second frame, a timestamp corresponding to the Vsync signal of the third frame is t5, and t5 = t3 + (T − V1); t6 = t5 + t3′, wherein T_VSYNC + T_VBP ≤ t3′ < V1, T_VSYNC is the duration corresponding to the Vsync signal of the third frame, and T_VBP is the duration corresponding to the vertical-blanking back porch of the third frame.
9. The apparatus of claim 8, wherein the apparatus further comprises:
a second display unit, configured to display an image at f2 in the frame next to the third frame when T2 ≥ V1.
10. The apparatus according to any one of claims 6 to 9, wherein, in determining T2 according to T1, the second determining unit is specifically configured to: determine that the greatest common divisor of f1 and 1000 is x, and y = 1000/x; and determine T2 = (b % (y × 1000)) % T, wherein t1 = a (seconds) + b (microseconds);
wherein the value of T2 used by the modifying unit is the value of T2 determined by the second determining unit.
11. An electronic device, comprising a processor, a display, and a memory, wherein the memory is configured to store a computer program, the display is configured to display an image, and the processor is configured to call and run the computer program from the memory, so that the processor performs the display processing method according to any one of claims 1 to 5.
12. A chip comprising a processor which, when executing instructions, performs the display processing method of any one of claims 1 to 5.
13. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program, which when executed by a processor, causes the processor to execute the display processing method of any one of claims 1 to 5.
CN202311777273.2A 2023-12-21 2023-12-21 Display processing method and device and electronic equipment Active CN117746762B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311777273.2A CN117746762B (en) 2023-12-21 2023-12-21 Display processing method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN117746762A true CN117746762A (en) 2024-03-22
CN117746762B CN117746762B (en) 2024-10-11

Family

ID=90252463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311777273.2A Active CN117746762B (en) 2023-12-21 2023-12-21 Display processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN117746762B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014168836A1 (en) * 2013-04-11 2014-10-16 Qualcomm Incorporated Apparatus and method for displaying video data
CN106205460A (en) * 2016-09-29 2016-12-07 京东方科技集团股份有限公司 The driving method of display device, time schedule controller and display device
JP2018155846A (en) * 2017-03-16 2018-10-04 パイオニア株式会社 Projection device, control method, program, and storage media
CN116092452A (en) * 2023-01-05 2023-05-09 荣耀终端有限公司 Refresh rate switching method and electronic device
CN116414337A (en) * 2021-12-29 2023-07-11 荣耀终端有限公司 Frame rate switching method and device
CN117174014A (en) * 2023-10-31 2023-12-05 荣耀终端有限公司 Display control circuit, display control method and electronic equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant