WO2024112005A1 - Electronic device comprising a display device and method of operating the same - Google Patents

Electronic device comprising a display device and method of operating the same

Info

Publication number
WO2024112005A1
WO2024112005A1 (PCT/KR2023/018492)
Authority
WO
WIPO (PCT)
Prior art keywords
frame
display
electronic device
processor
tearing
Prior art date
Application number
PCT/KR2023/018492
Other languages
English (en)
Korean (ko)
Inventor
박준영
김승진
마경희
이성준
장창재
정지순
조광래
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020230003537A external-priority patent/KR20240076338A/ko
Application filed by Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority to EP23805839.0A priority Critical patent/EP4398233A1/fr
Publication of WO2024112005A1 publication Critical patent/WO2024112005A1/fr

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters

Definitions

  • Embodiments of the present disclosure relate to an electronic device including a display and a method of operating the same.
  • An electronic device (e.g., a smartphone, a wearable electronic device, or another portable electronic device) may include a display capable of displaying information related to the execution of various functions.
  • the display may include various display devices such as a liquid crystal display (LCD), an organic light emitting diode display (OLED display), or an electrophoretic display.
  • An electronic device may include a memory, a processor, a display controller, and a display.
  • The memory may store at least one instruction that, when executed by the processor, causes the electronic device to transmit a first frame to the display controller based on a first tearing signal output from the display at a first output time.
  • The memory may store at least one instruction that, when executed by the processor, causes the electronic device to obtain a second frame for updating some of a plurality of layers related to the first frame.
  • The memory may store at least one instruction that, when executed by the processor, causes the electronic device to determine that screen tearing will occur when the second frame is transmitted to the display through the display controller, upon confirming that some of the plurality of layers are layers corresponding to a region of interest (ROI) of the first frame.
  • Based on the determination that screen tearing will occur, the second frame may not be written to the memory included in the display while the first frame is being scanned on the display.
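The write-deferral logic described in the bullets above can be sketched in ordinary code. This is a minimal illustration only, not the patent's implementation: the `Rect` class and the `tearing_expected` and `write_second_frame` functions are hypothetical names introduced here, and real layers and ROIs would come from the display pipeline rather than plain rectangles.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    # Simple axis-aligned rectangle; all names here are illustrative.
    left: int
    top: int
    right: int
    bottom: int

    def overlaps(self, other: "Rect") -> bool:
        return (self.left < other.right and other.left < self.right
                and self.top < other.bottom and other.top < self.bottom)

def tearing_expected(updated_layers: list[Rect], roi: Rect) -> bool:
    """Return True if any updated layer of the second frame falls inside
    the region of interest (ROI) of the first frame, in which case
    transmitting the second frame mid-scan could tear the screen."""
    return any(layer.overlaps(roi) for layer in updated_layers)

def write_second_frame(updated_layers: list[Rect], roi: Rect,
                       first_frame_scanning: bool) -> bool:
    """Decide whether the second frame may be written to display memory now.
    While the first frame is being scanned and tearing is expected,
    the write is deferred (returns False)."""
    if first_frame_scanning and tearing_expected(updated_layers, roi):
        return False  # defer: do not overwrite display memory mid-scan
    return True
```

In this sketch the write is simply deferred; in the disclosure the decision is instead tied to the tearing signal's timing, so a real implementation would retry the write in a later tearing-signal period.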
  • A method of operating an electronic device according to an embodiment may include transmitting a first frame to a display controller included in the electronic device, based on a first tearing signal output from the display at a first output time.
  • a method of operating an electronic device may include obtaining a second frame for updating some layers related to the first frame.
  • The method may include, when it is confirmed that the second frame updates a layer corresponding to a region of interest (ROI) of the first frame, determining that screen tearing will occur if the second frame is transmitted to the display.
  • Based on the determination that screen tearing will occur, the second frame may not be written to the memory included in the display while the first frame is being scanned on the display.
  • The at least one instruction, when executed by a processor of an electronic device, may cause the electronic device to perform an operation of transmitting a first frame to a display controller included in the electronic device, based on a first tearing signal output at a first output time from a display included in the electronic device.
  • The at least one instruction, when executed by a processor of an electronic device, may cause the electronic device to perform an operation of obtaining a second frame for updating some of a plurality of layers related to the first frame.
  • The at least one instruction, when executed by a processor of an electronic device, may cause the electronic device to perform an operation of determining that screen tearing will occur when transmitting the second frame to the display, based on confirmation that the second frame updates a layer corresponding to the region of interest (ROI) of the first frame.
  • In a storage medium storing at least one computer-readable instruction according to an embodiment, based on the determination that screen tearing will occur, the second frame may not be written to the memory included in the display while the first frame is being scanned on the display.
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments.
  • FIG. 2 is a block diagram of a display module, according to various embodiments.
  • FIG. 3A is a schematic block diagram of the configuration of an electronic device according to an embodiment.
  • FIG. 3B is a schematic block diagram of the configuration of an electronic device according to an embodiment.
  • FIG. 4A is a flowchart showing a method of operating an electronic device according to an embodiment.
  • FIG. 4B is a flowchart showing a method of operating an electronic device according to an embodiment.
  • Figure 5 is a timing diagram showing the frame transmission, writing, and scanning processes after adjusting the output timing of the tearing signal according to a comparative example.
  • FIG. 6A is a flowchart illustrating a method of operating an electronic device to prevent screen tearing according to an embodiment.
  • FIG. 6B is a timing diagram illustrating frame transmission, writing, and scanning processes according to an embodiment.
  • FIG. 7A is a flowchart illustrating a method of operating an electronic device to prevent screen tearing, according to an embodiment.
  • FIG. 7B is a timing diagram illustrating frame transmission, writing, and scanning processes according to an embodiment.
  • FIG. 8A is a flowchart illustrating a method of operating an electronic device to prevent screen tearing according to an embodiment.
  • FIG. 8B is a timing diagram illustrating frame transmission, writing, and scanning processes according to an embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to various embodiments.
  • The electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with at least one of the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • According to one embodiment, the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101.
  • In some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • The processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 that can operate independently of or together with the main processor (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor).
  • When the electronic device 101 includes a main processor 121 and an auxiliary processor 123, the auxiliary processor 123 may be set to use lower power than the main processor 121 or to be specialized for a designated function.
  • the auxiliary processor 123 may be implemented separately from the main processor 121 or as part of it.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state.
  • According to one embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing artificial intelligence models.
  • Artificial intelligence models may be created through machine learning. For example, such learning may be performed in the electronic device 101 itself, where the artificial intelligence model is used, or through a separate server (e.g., the server 108).
  • Learning algorithms may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited thereto.
  • An artificial intelligence model may include multiple artificial neural network layers.
  • An artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to the examples described above.
  • artificial intelligence models may additionally or alternatively include software structures.
  • The memory 130 may store various data used by at least one component (e.g., the processor 120 or the sensor module 176) of the electronic device 101. The data may include, for example, input data or output data for software (e.g., the program 140) and commands related thereto.
  • Memory 130 may include volatile memory 132 or non-volatile memory 134.
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142, middleware 144, or application 146.
  • the input module 150 may receive commands or data to be used in a component of the electronic device 101 (e.g., the processor 120) from outside the electronic device 101 (e.g., a user).
  • The input module 150 may include, for example, a microphone, a mouse, a keyboard, keys (e.g., buttons), or a digital pen (e.g., a stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101.
  • The sound output module 155 may include, for example, a speaker or a receiver. The speaker may be used for general purposes such as multimedia playback or playback of recordings.
  • The receiver may be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 can visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor configured to detect a touch, or a pressure sensor configured to measure the intensity of force generated by the touch.
  • The audio module 170 may convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to one embodiment, the audio module 170 may acquire sound through the input module 150, or output sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state) and generate an electrical signal or data value corresponding to the detected state.
  • The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • The interface 177 may support one or more designated protocols that can be used to connect the electronic device 101 directly or wirelessly with an external electronic device (e.g., the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • The connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (e.g., the electronic device 102).
  • According to one embodiment, the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
  • the haptic module 179 can convert electrical signals into mechanical stimulation (e.g., vibration or movement) or electrical stimulation that the user can perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 can capture still images and moving images.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 can manage power supplied to the electronic device 101.
  • the power management module 188 may be implemented as at least a part of, for example, a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel. The communication module 190 may operate independently of the processor 120 (e.g., an application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
  • According to one embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • The corresponding communication module may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., LAN or WAN)).
  • The wireless communication module 192 may use subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196 to identify or authenticate the electronic device 101 within a communication network such as the first network 198 or the second network 199.
  • The wireless communication module 192 may support a 5G network after a 4G network, and next-generation communication technology, for example, new radio (NR) access technology.
  • NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or ultra-reliable and low-latency communications (URLLC).
  • The wireless communication module 192 may support high-frequency bands (e.g., mmWave bands), for example, to achieve high data rates.
  • The wireless communication module 192 may use various technologies to secure performance in high-frequency bands, for example, beamforming, massive array multiple-input and multiple-output (massive MIMO), and full-dimensional multiplexing.
  • The wireless communication module 192 may support various requirements specified for the electronic device 101, an external electronic device (e.g., the electronic device 104), or a network system (e.g., the second network 199). According to one embodiment, the wireless communication module 192 may support a peak data rate for realizing eMBB (e.g., 20 Gbps or more), loss coverage for realizing mMTC (e.g., 164 dB or less), or U-plane latency for realizing URLLC.
  • The antenna module 197 may transmit signals or power to, or receive signals or power from, the outside (e.g., an external electronic device).
  • According to one embodiment, the antenna module 197 may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (e.g., a PCB).
  • According to one embodiment, the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication method used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas by, for example, the communication module 190. Signals or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) may be additionally formed as part of the antenna module 197.
  • According to various embodiments, the antenna module 197 may form a mmWave antenna module.
  • According to one embodiment, a mmWave antenna module may include a printed circuit board; an RFIC disposed on or adjacent to a first surface (e.g., the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., a mmWave band); and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., the top surface or a side surface) of the printed circuit board and capable of transmitting or receiving signals of the designated high-frequency band.
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the external electronic devices 102 or 104 may be of the same or different type as the electronic device 101.
  • all or part of the operations performed in the electronic device 101 may be executed in one or more of the external electronic devices 102, 104, or 108.
  • When the electronic device 101 must perform a function or service, instead of executing the function or service on its own, the electronic device 101 may request one or more external electronic devices to perform at least part of the function or service.
  • One or more external electronic devices that have received the request may execute at least part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device 101.
  • the electronic device 101 may process the result as is or additionally and provide it as at least part of a response to the request.
  • For this purpose, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of Things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or server 108 may be included in the second network 199.
  • the electronic device 101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • FIG. 2 is a block diagram 200 of a display device 260 according to various embodiments.
  • The display device 260 may include a display 210 and a display driver IC (DDI) 230 for controlling the display 210.
  • the DDI 230 may include an interface module 231, a memory 233 (eg, buffer memory), an image processing module 235, or a mapping module 237.
  • the DDI 230 receives image information including image data or an image control signal corresponding to a command for controlling the image data from other components of the electronic device 101 through the interface module 231. can do.
  • For example, the image information may be received from the processor 120 (e.g., the main processor 121, such as an application processor) or the auxiliary processor 123 operating independently of the main processor 121.
  • the DDI 230 may communicate with the touch circuit 250 or the sensor module 276 through the interface module 231.
  • the image processing module 235 may store at least a portion of the received image information in the memory 233, for example, in units of frames.
  • The image processing module 235 may perform pre-processing or post-processing (e.g., resolution, brightness, or size adjustment) of at least a portion of the image data, based at least on the characteristics of the image data or the characteristics of the display 210.
  • The mapping module 237 may generate a voltage value or a current value corresponding to the image data, based at least in part on the properties of the pixels of the display 210 (e.g., the arrangement of the pixels (RGB stripe or PenTile structure) or the size of each subpixel). At least some pixels of the display 210 may be driven based at least in part on the voltage value or the current value, so that visual information (e.g., text, an image, or an icon) corresponding to the image data may be displayed through the display 210.
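As a rough illustration of how a mapping step can turn image data into drive values, the toy function below converts an 8-bit subpixel value into a voltage through a gamma curve. The gamma exponent and the voltage range are invented placeholders for illustration, not values from this disclosure.

```python
def subpixel_to_voltage(value: int, v_min: float = 0.0, v_max: float = 4.6,
                        gamma: float = 2.2) -> float:
    """Map an 8-bit subpixel value (0-255) to a drive voltage.

    The gamma exponent and voltage rail values are illustrative
    assumptions; a real DDI uses panel-specific calibration tables.
    """
    if not 0 <= value <= 255:
        raise ValueError("subpixel value must be in 0..255")
    normalized = (value / 255.0) ** gamma  # perceptual-to-linear correction
    return v_min + normalized * (v_max - v_min)
```

A real mapping module would also account for the subpixel layout (RGB stripe vs. PenTile) mentioned above, since neighboring subpixels may share color information.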
  • the display device 260 may further include a touch circuit 250.
  • the touch circuit 250 may include a touch sensor 251 and a touch sensor IC 253 for controlling the touch sensor 251.
  • the touch sensor IC 253 may control the touch sensor 251 to detect a touch input or a hovering input for a specific position of the display 210.
  • According to one embodiment, the touch sensor IC 253 may detect a touch input or a hovering input by measuring a change in a signal (e.g., voltage, light amount, resistance, or charge amount) at a specific position of the display 210.
  • the touch sensor IC 253 may provide information (e.g., location, area, pressure, or time) about the detected touch input or hovering input to the processor 120.
  • At least a portion of the touch circuit 250 (e.g., the touch sensor IC 253) may be included as part of the display driver IC 230 or the display 210, or as part of another component disposed outside the display device 260 (e.g., the auxiliary processor 123).
  • According to one embodiment, the display device 260 may further include at least one sensor of the sensor module 276 (e.g., a fingerprint sensor, an iris sensor, a pressure sensor, or an illuminance sensor), or a control circuit therefor.
  • In this case, the at least one sensor or a control circuit therefor may be embedded in a part of the display device 260 (e.g., the display 210 or the DDI 230) or a part of the touch circuit 250.
  • When the sensor module 276 embedded in the display device 260 includes a biometric sensor (e.g., a fingerprint sensor), the biometric sensor may acquire biometric information (e.g., a fingerprint image) associated with a touch input through a portion of the display 210.
  • When the sensor module 276 includes a pressure sensor, the pressure sensor may acquire pressure information associated with a touch input through part or all of the area of the display 210.
  • the touch sensor 251 or the sensor module 276 may be disposed between pixels of a pixel layer of the display 210, or above or below the pixel layer.
  • FIG. 3A is a schematic block diagram of the configuration of an electronic device according to an embodiment.
  • the electronic device 301 may include a first processor 320, a memory 330, and a display 360.
  • the electronic device 301 may be implemented identically or similarly to the electronic device 101 of FIG. 1 .
  • the display 360 may be implemented the same or similar to the display module 160 of FIG. 1 or the display device 260 of FIG. 2.
  • the display 360 may include a display driver integrated circuit (IC) (DDI) 370 and a display panel 380.
  • a display driver integrated circuit (IC) (DDI) 370 may control the display panel 380.
  • the display driver IC 370 may be implemented the same or similar to the display driver IC (DDI) 230 of FIG. 2.
  • the first processor 320 may control the overall operation of the electronic device 301.
  • the first processor 320 may be implemented identically or similarly to the processor 120 of FIG. 1 .
  • the first processor 320 may be implemented as an application processor (AP).
  • the first processor 320 may include a second processor 321, a third processor 322, and a display controller 350.
  • the display controller 350 may be implemented as a display processing unit (DPU) or data processing unit (DPU).
  • the second processor 321 may be implemented as a graphics processing unit (GPU).
  • the third processor 322 may be implemented as a central processing unit (CPU).
  • the third processor 322 may execute a display driver.
  • the display controller 350 may receive a tearing signal for a frame (or image) from the display 360.
  • a tearing signal may mean a tearing effect (TE) signal.
  • the tearing effect (TE) may refer to a visual artifact that appears when a plurality of frames are displayed as one screen on the display 360. For example, if the screen refresh rate of the display 360 and the frame rate of the GPU are not synchronized, a tearing effect (TE) may occur.
  • The tearing signal may include a control signal for distinguishing the time period in which a frame is written from the time period in which a frame is scanned (or displayed), to prevent a tearing effect on the frame being scanned (or displayed) on the display panel 380.
  • the display controller 350 may transmit a frame to the display driver IC 370.
  • the operation of the display controller 350 transmitting the frame to the display driver IC 370 is performed when the display driver IC 370 operates as a memory for the display 360 (e.g., memory 233 in FIG. 2). ) (e.g., graphic ram (GRAM)), the display driver IC 370 reads the frame written to the memory 233.
  • the cycle of the tearing signal may match the cycle of the synchronization signal (eg, the Vsync signal).
  • the output time of the tearing signal may be adjusted based on the output time of the synchronization signal: the tearing signal rises at the rising-edge time of the synchronization signal and falls at the falling-edge time of the synchronization signal.
  • the point at which the tearing signal rises may be the point at which the tearing signal is output.
  • the frame may be transmitted to the display driver IC 370 when the tearing signal rises, and the display driver IC 370 may write the frame to the memory 233; the display driver IC 370 may read the frame from the memory 233 at the falling point of the tearing signal and scan (or display) the frame on the display panel 380.
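The rising-edge write / falling-edge scan relationship above can be modeled with a small timing sketch (an illustration only, not the patent's implementation; the 120 Hz period and the 2 ms write window are assumptions taken from the examples in this document):

```python
# Simplified model of the tearing-effect (TE) signal described above:
# a frame is written to the display memory (GRAM) at the rising edge of
# the TE signal and read back for scan-out at the falling edge, so the
# write interval and the scan interval of the same frame do not overlap.

def te_schedule(period_ms: float, high_ms: float, n_cycles: int):
    """Return (write_time, scan_time) pairs for each TE cycle.

    period_ms: TE/Vsync period (e.g. ~8.33 ms at 120 Hz)
    high_ms:   time the TE signal stays high (write window)
    """
    events = []
    for i in range(n_cycles):
        rising = i * period_ms        # TE output (rising edge): write frame
        falling = rising + high_ms    # falling edge: read frame and scan it
        events.append((rising, falling))
    return events

# At 120 Hz the TE period is about 8.33 ms; assume a 2 ms write window.
events = te_schedule(period_ms=8.33, high_ms=2.0, n_cycles=3)
for write_t, scan_t in events:
    # The scan-out of a frame always starts after its write begins.
    assert scan_t > write_t
```

This is only a toy schedule; a real panel interleaves the write of frame N+1 with the scan of frame N, which is exactly the overlap the later paragraphs address.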
  • the second processor 321 may render a frame (or image).
  • the second processor 321 may transmit the rendered frame (or image) to the third processor 322.
  • the third processor 322 may transmit a frame to the display controller 350.
  • the display controller 350 may transmit a frame (or image) to a display driver integrated circuit (IC) 370.
  • the display controller 350 may transmit the first frame received from the third processor 322 to the display driver IC (integrated circuit) 370 based on a tearing signal output at regular intervals from the display 360.
  • the tearing signal may be output at a point in time advanced by a specified time based on the output point of the reference tearing signal.
  • the reference tearing signal may refer to a tearing signal output from the display 360 at the output time before adjustment.
  • the designated time may be determined based on at least one of the number of layers constituting the frame, the resolution of the display 360, or the frequency (frame rate).
  • the resolution of the display 360 may be implemented as 3,088 (number of horizontal pixels) × (number of vertical pixels).
  • the frequency of the display 360 may be implemented at 120Hz.
  • the specified time may be 2 ms. However, this is an example, and embodiments of the present invention may set the designated time based on various methods.
  • the display controller 350 may transmit the first frame to the display driver IC 370 at the time of output of the tearing signal.
  • the first frame may be written to the memory 233 (eg, the memory 233 in FIG. 2) at the time of output of the tearing signal.
  • the memory 233 may be implemented as graphic RAM (GRAM).
  • the display driver IC 370 may read the first frame from the memory 233 and scan (or display) the first frame on the display panel 380.
  • the third processor 322 may receive a second frame for updating some layers related to the first frame from the second processor 321.
  • the second frame may be a frame rendered by the second processor 321 after the first frame.
  • the third processor 322 may check whether the second frame is a frame for updating part of the first frame.
  • the third processor 322 may check whether the second frame is a frame for updating some of the plurality of layers constituting the first frame. At this time, the second frame may include only some layers rather than all layers. If the third processor 322 determines that the second frame is a frame for updating a portion of a layer of the first frame, the third processor 322 may determine whether the second frame is a frame for updating a layer corresponding to a region of interest of the first frame.
  • the area of interest may include the bottom area of the first frame displayed (or scanned) on the display panel 380.
  • the area of interest may be set based on user input.
  • the area of interest is not limited to the bottom area of the first frame and may include at least a portion of the first frame displayed on the display panel 380.
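The region-of-interest check described in the preceding paragraphs can be sketched as follows (a hedged illustration; the row-range representation of layers, the 1,440-row panel, and the 200-row bottom region are all assumptions, since the document does not specify data structures):

```python
# Hedged sketch: decide whether a partial-update frame touches the
# region of interest (ROI) of the currently displayed frame, e.g. the
# bottom area of the panel. Layers are modeled as (top, bottom) row
# ranges; a real driver would use full rectangles.

def updates_roi(updated_layers, roi):
    """Return True if any updated layer overlaps the ROI row range."""
    roi_top, roi_bottom = roi
    for top, bottom in updated_layers:
        if top < roi_bottom and bottom > roi_top:  # row ranges overlap
            return True
    return False

PANEL_ROWS = 1440                      # illustrative vertical resolution
ROI = (PANEL_ROWS - 200, PANEL_ROWS)   # bottom 200 rows as the ROI

assert updates_roi([(1300, 1440)], ROI) is True   # bottom layer: ROI hit
assert updates_roi([(0, 100)], ROI) is False      # top-of-screen layer only
```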
  • when the third processor 322 determines that the second frame is for updating a layer corresponding to the region of interest of the first frame, it can be confirmed that screen tearing occurs (or is expected) if the second frame is transmitted to the display driver IC (integrated circuit) 370 based on the tearing signal.
  • when the second frame is transmitted to the display driver IC (integrated circuit) 370 based on the tearing signal, the time at which the display driver IC 370 scans (or displays) the first frame written to the memory 233 on the display panel 380 and the time at which the display driver IC 370 writes the second frame to the memory 233 may overlap. As a result, the image or frame displayed (or scanned) through the display 360 may be broken.
  • the third processor 322 and the display controller 350 may perform an operation to prevent screen tearing based on an operation that confirms that screen tearing has occurred.
  • the display controller 350 may transmit the second frame to the display driver IC 370 after the first frame is scanned (or displayed) on the display panel 380. According to one embodiment, the display controller 350 may transmit the second frame to the display driver IC 370 after a designated first time from the point of output of the tearing signal. According to one embodiment, the designated first time (e.g., 2 ms) may mean the time between the output of the tearing signal and the time when the first frame is scanned (or displayed) on the display panel 380. Alternatively, according to one embodiment, the third processor 322 may transmit the second frame to the display controller 350 after a designated second time from the point of output of the tearing signal.
  • the third processor 322 may transmit the second frame to the display controller 350 after a designated second time from the point of output of the tearing signal.
  • the designated second time (e.g., 2 ms) may mean the time between the output of the tearing signal and the time when the first frame is scanned (or displayed) on the display panel 380. According to one embodiment, the designated first time and the designated second time may be the same time or may be different times. According to one embodiment, the designated first time and the designated second time may refer to times set so that the time interval in which the region of interest of the first frame is scanned (or displayed) on the display panel 380 and the time interval in which the second frame for updating the layer corresponding to the region of interest is transmitted to the display driver IC 370 do not match.
  • the operation of transmitting the second frame to the display driver IC 370 may include the display driver IC 370 writing the second frame to the memory 233.
  • the third processor 322 may adjust the timing of transmitting the second frame to the display driver IC 370. Through this, screen tearing may not occur even when only a portion of the frame is updated.
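The timing adjustment described above can be sketched as a scheduling rule (an illustrative sketch; the 2 ms value stands in for the designated first/second time and is taken from the example in this document):

```python
# Hedged sketch of the tearing-avoidance rule: a frame that does not
# touch the ROI is transmitted at the TE output time, while a partial
# update of the ROI is held back by a designated delay so its write
# window does not coincide with the scan of the previous frame's ROI.

DESIGNATED_DELAY_MS = 2.0  # illustrative value from the example

def transmit_time(te_output_ms: float, is_roi_partial_update: bool) -> float:
    """Time at which the frame is sent to the display driver IC."""
    if is_roi_partial_update:
        return te_output_ms + DESIGNATED_DELAY_MS  # wait until ROI scan ends
    return te_output_ms                            # normal case: at TE output

assert transmit_time(0.0, False) == 0.0
assert transmit_time(0.0, True) == 2.0
```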
  • the third processor 322 may transmit a third frame to update the entire first frame to the display controller 350 based on an operation that confirms that screen tearing occurs.
  • the display controller 350 may receive the third frame from the third processor 322 based on the tearing signal.
  • the display controller 350 may transmit the third frame to the display driver IC 370 at the time of output of the tearing signal.
  • the display controller 350 may transmit the third frame to the display driver IC 370 while the first frame is being scanned (or displayed) on the display panel 380.
  • the operation of transmitting the third frame to the display driver IC 370 may include the display driver IC 370 writing the third frame to the memory 233.
  • the third processor 322 may request a third frame for updating the first frame as a whole from the second processor 321 and, upon receiving the third frame from the second processor 321, may transmit the third frame to the display controller 350.
  • the third processor 322 may receive the first frame in response to the output of the first tearing signal. However, if the first frame is not received from the output time of the first tearing signal to the output time of the subsequent second tearing signal (e.g., a tearing signal output after the first tearing signal), a frame delay may occur. For example, when a frame delay occurs, the third processor 322 may transmit the first frame to the display controller 350 in a time section following the originally allocated time section. To solve this problem, the third processor 322 may advance the output timing of the tearing signal by a designated time (e.g., 2 ms). Through this, the third processor 322 can wait for a frame rendered by the second processor 321 for the designated time. Additionally, the third processor 322 can reduce the occurrence of frame delay.
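One way to model the effect of advancing the tearing signal by a designated time is as a relaxed frame-readiness deadline (a toy model under stated assumptions; the 2 ms advance and ~8.33 ms period are the example values from this document):

```python
# Hedged sketch: advancing the TE output by T ms gives the display
# driver an extra T ms to wait for a late-rendered frame, so a render
# that misses the reference TE point by less than T no longer slips a
# whole TE period (a frame delay).

def frame_delayed(render_done_ms, te_ref_ms, advance_ms=0.0):
    """True if the frame misses the (possibly advanced) transfer deadline.

    te_ref_ms is the next reference TE point; the advance extends the
    effective window in which a rendered frame can still be picked up.
    """
    return render_done_ms > te_ref_ms + advance_ms

# A frame finishing about 1.5 ms after the reference TE point:
assert frame_delayed(render_done_ms=9.8, te_ref_ms=8.33) is True
assert frame_delayed(render_done_ms=9.8, te_ref_ms=8.33, advance_ms=2.0) is False
```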
  • FIG. 3B is a schematic block diagram of the configuration of an electronic device according to an embodiment.
  • the first processor 320 (e.g., the first processor 320 in FIG. 3A) may include a second processor 321 (e.g., the second processor 321 in FIG. 3A), a third processor 322 (e.g., the third processor 322 in FIG. 3A), and a display controller 350 (e.g., the display controller 350 in FIG. 3A).
  • the third processor 322 may execute Surface flinger 381, HW composer 382, and display driver 383.
  • the second processor 321 may render a frame (or image).
  • the rendered frame (or image) may be stored in a graphics buffer (eg, a frame storage area of the memory 330).
  • the Surface flinger 381 may transmit graphics buffer information (e.g., address of the frame storage area, resolution information, etc.) to the HW composer 382.
  • the HW composer 382 may transmit information about the graphics buffer to the display driver 383.
  • the display driver 383 may transmit the rendered frame to the display controller 350.
  • the display driver 383 may determine when the display controller 350 will transmit a frame to the display 360 (eg, the display 360 of FIG. 3A).
  • the time to transmit the frame may be the time when the tearing signal is output (or the time when the display controller 350 checks the tearing signal).
  • the display driver 383 may advance the output timing of the reference tearing signal by adjusting it by a specified time (eg, 2 ms).
  • the display controller 350 may transmit a frame to the display 360 at the time of output of the adjusted tearing signal.
  • the operation of the display controller 350 to transmit a frame may include an operation of the display 360 to write the frame to the memory 233 (eg, memory 233 in FIG. 2).
  • the display 360 may read a frame from the memory 233 and scan (or display) the frame on the display 360.
  • FIG. 4A is a flowchart showing a method of operating an electronic device according to an embodiment.
  • when the flow diagram of FIG. 4A includes a conditional branch such as operation 405, each of the branching operations may form a separate additional embodiment, and each of the branching operations may be regarded as optional, as described below.
  • the display controller 350 may transmit the first frame to the display 360 based on a tearing signal output at a certain period from the display 360 (e.g., the display 360 in FIG. 3A).
  • the first frame may refer to a frame rendered by the second processor 321 (eg, the second processor 321 in FIG. 3A).
  • the output timing of the tearing signal may be adjusted to advance by a specified time (eg, 2 ms) from the output timing of the reference tearing signal.
  • the reference tearing signal may refer to a tearing signal output from the display 360 at the output time before adjustment.
  • the designated time may be determined based on at least one of the number of layers constituting the frame, the resolution of the display 360, or the frequency. However, this is an example, and embodiments of the present invention may set the designated time based on various methods.
  • the first frame may be written to the memory 233 (eg, the memory 233 in FIG. 2) at the time the tearing signal is output.
  • the display controller 350 may transmit the first frame to the display 360 within a preset time (eg, 2 ms) from the time the tearing signal is output.
  • the display 360 may read the first frame from the memory 233 and scan (or display) the first frame.
  • the preset time may be set by the user or may be set by the first processor 320.
  • a third processor 322 may receive a second frame from the second processor 321 to update some layers related to the first frame.
  • the second frame may include a frame rendered by the second processor 321 after the first frame.
  • the second frame may include only some layers rather than all layers.
  • the third processor 322 may check whether the second frame is a frame for updating the layer corresponding to the region of interest of the first frame (e.g., the bottom region of the first frame). If operations 411 and 413 are not part of the above-described embodiment, operation 405 may be expressed as a step in which the third processor 322 (e.g., the display driver 383) determines that the second frame is a frame for updating the layer corresponding to the region of interest of the first frame (e.g., the bottom region of the first frame).
  • alternatively, operation 405 may be expressed as a step in which the third processor 322 (e.g., the display driver 383) determines that the second frame is not a frame for updating the layer corresponding to the region of interest of the first frame (e.g., the bottom region of the first frame).
  • when the third processor 322 (e.g., the display driver 383) confirms that the second frame is a frame for updating the layer corresponding to the region of interest of the first frame (e.g., the bottom region of the first frame) (operation 405 - Yes), in operation 407 it can be confirmed that screen tearing occurs (or is expected) when the second frame is transmitted to the display 360 through the display controller 350. For example, when the second frame is transmitted to the display 360 based on a tearing signal, the time at which the display 360 scans (or displays) the first frame written in the memory 233 and the time at which the display 360 writes the second frame to the memory 233 may overlap. As a result, the image or frame displayed (or scanned) through the display 360 may be broken.
  • the third processor 322 may transmit the second frame to the display controller 350.
  • the third processor 322 may transmit, to the display controller 350, information related to the time at which the display controller 350 is to transmit the second frame to the display 360 (e.g., the designated first time from the point of output of the tearing signal).
  • the third processor 322 may transmit the second frame to the display controller 350 after a specified second time (eg, 2 ms) from the point of output of the tearing signal.
  • the display controller 350 may transmit a second frame to the display 360 after the first frame is scanned (or displayed) on the display.
  • the display controller 350 may transmit the second frame to the display 360 after a designated first time (eg, 2 ms) from the point of output of the tearing signal.
  • the display controller 350 receives a second frame from the third processor 322 (e.g., the display driver 383) after a specified second time (e.g., 2 ms) from the point of output of the tearing signal. can receive.
  • the designated first time and the designated second time may be the same time or may be different times.
  • the designated first time and the designated second time may mean times set so that the time interval in which the region of interest of the first frame is scanned (or displayed) on the display 360 and the time interval in which the second frame for updating the layer corresponding to the region of interest is transmitted to the display 360 do not match.
  • the electronic device 301 can ensure that the time interval at which the region of interest of the first frame is scanned (or displayed) on the display 360 does not match the time interval at which the second frame for updating the layer corresponding to the region of interest is transmitted to the display 360 (or the time interval at which the second frame is read), thus avoiding screen tearing when updating only part of the screen.
  • the operation of the display controller 350 transmitting the second frame to the display 360 after a designated first time (e.g., 2 ms) from the point of output of the tearing signal will be described in detail with reference to FIG. 6A.
  • the operation of the third processor 322 transmitting the second frame to the display controller 350 after a designated second time (e.g., 2 ms) from the point of output of the tearing signal will be described in detail with reference to FIG. 7A.
  • when the third processor 322 does not confirm that the second frame is a frame for updating the layer corresponding to the region of interest of the first frame (e.g., the bottom region of the first frame) (operation 405 - No), in operation 411 it can be confirmed that screen tearing does not occur when the second frame is transmitted to the display 360 through the display controller 350.
  • the display controller 350 may transmit the second frame to the display 360 at the time of output of the tearing signal. According to one embodiment, the display controller 350 may transmit the second frame to the display 360 while the first frame is scanned (or displayed) on the display 360. According to one embodiment, the second frame may be written to the memory 233 at the time the tearing signal is output. According to one embodiment, the display controller 350 may transmit the second frame to the display 360 after a preset time from when the tearing signal is output. For example, the display 360 may read the second frame from the memory 233 at the falling point of the tearing signal and scan (or display) the second frame.
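The branch in operations 405 through 413 can be sketched end to end (illustrative pseudologic in Python; the function and parameter names are assumptions, not the patent's API, and the 2 ms delay is the example value):

```python
# Hedged sketch of the FIG. 4A flow: if the incoming frame only updates
# the ROI of the frame currently on screen, tearing is expected and the
# transmission is delayed by the designated time; otherwise the frame
# is sent at the TE output time as usual.

DESIGNATED_TIME_MS = 2.0  # example value from the text

def handle_frame(te_output_ms, updates_roi_of_current_frame):
    """Return (transmit_time_ms, tearing_expected_without_delay)."""
    if updates_roi_of_current_frame:            # operation 405 - Yes
        # operation 407: tearing expected; operation 409: delay transfer
        return te_output_ms + DESIGNATED_TIME_MS, True
    # operation 405 - No: operations 411/413, transmit at TE output
    return te_output_ms, False

assert handle_frame(8.0, True) == (10.0, True)
assert handle_frame(8.0, False) == (8.0, False)
```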
  • the electronic device 301 (e.g., the electronic device 301 in FIG. 3A) may display a frame that updates only part of the screen on the display, thereby reducing the current consumption required to update the entire screen. Additionally, the electronic device 301 may avoid screen tearing when updating only part of the screen.
  • FIG. 4B is a flowchart showing a method of operating an electronic device according to an embodiment.
  • when the flow diagram of FIG. 4B includes a conditional branch such as operation 425, each of the branching operations may form a separate additional embodiment, and each of the branching operations may be regarded as optional, as described below.
  • the display controller 350 may transmit the first frame to the display 360 based on a tearing signal output at regular intervals from the display 360 (e.g., the display 360 in FIG. 3A).
  • the output timing of the tearing signal may be adjusted to advance by a specified time (eg, 2 ms) from the output timing of the reference tearing signal.
  • a third processor 322 (e.g., third processor 322 in Figure 3A) (e.g., display driver 383) (e.g., display driver 383 in Figure 3B) may receive a second frame from the second processor 321 to update some layers related to the first frame.
  • the third processor 322 determines that the second frame is a layer corresponding to the region of interest of the first frame (e.g., the bottom region of the first frame). You can check whether it is a frame for updating.
  • operation 425 may be expressed as a step in which the third processor 322 (e.g., the display driver 383) determines that the second frame is a frame for updating the layer corresponding to the region of interest of the first frame (e.g., the bottom region of the first frame). If operations 427 and 429 are not part of the above-described embodiment, operation 425 may be expressed as a step in which the third processor 322 (e.g., the display driver 383) determines that the second frame is not a frame for updating the layer corresponding to the region of interest of the first frame (e.g., the bottom region of the first frame).
  • when the third processor 322 confirms that the second frame is a frame for updating the layer corresponding to the region of interest in the first frame (operation 425 - Yes), in operation 427 it can be confirmed that screen tearing occurs (or is expected) when the second frame is transmitted to the display 360 through the display controller 350.
  • the display controller 350 may transmit a third frame to update the entire first frame to the display 360 at the time of output of the tearing signal. For example, if the first frame is updated entirely rather than partially updated, screen tearing may not occur. Accordingly, the third processor 322 (e.g., display driver 383) may request a third frame from the second processor 321 to update the first frame as a whole when screen corruption is expected. The third processor 322 (eg, display driver 383) may transmit the third frame to the display controller 350, and the display controller 350 may transmit the third frame to the display 360. According to one embodiment, the third frame may include a frame rendered by the second processor 321 (eg, the second processor 321 in FIG. 3A).
  • the display controller 350 may transmit the third frame to the display 360 while the first frame is being scanned (or displayed) on the display 360. According to one embodiment, the operation of the display controller 350 transmitting the third frame to the display 360 at the time of output of the tearing signal will be described in detail with reference to FIG. 8A.
  • when the third processor 322 does not identify the second frame as a frame for updating the layer corresponding to the region of interest in the first frame (operation 425 - No), in operation 431 it can be confirmed that screen tearing does not occur when the second frame is transmitted to the display 360 through the display controller 350.
  • the display controller 350 may transmit the second frame to the display 360 at the time of output of the tearing signal. According to one embodiment, the display controller 350 may transmit the second frame to the display 360 while the first frame is scanned (or displayed) on the display 360. According to one embodiment, the second frame may be written to the memory 233 at the time the tearing signal is output.
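The FIG. 4B variant can be sketched similarly (hypothetical names; the full-frame request to the renderer is reduced to a return value for illustration):

```python
# Hedged sketch of the FIG. 4B flow: instead of delaying a risky
# partial update, the driver requests a third frame that redraws the
# whole screen, which can then be transmitted at the TE output time
# without the write/scan overlap that causes tearing.

def choose_frame(updates_roi_of_current_frame):
    """Return which frame to transmit at the TE output time.

    'second': the partial-update frame as received (operation 425 - No)
    'third':  a freshly requested full-screen frame (operation 425 - Yes)
    """
    if updates_roi_of_current_frame:   # operations 427/429
        return "third"                 # full update: no partial-ROI overlap
    return "second"                    # operation 431: partial update is safe

assert choose_frame(True) == "third"
assert choose_frame(False) == "second"
```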
  • Figure 5 is a timing diagram showing the frame transmission, writing, and scanning processes after adjusting the output timing of the tearing signal according to a comparative example.
  • the display driver displays the first frame at the time of output of the second tearing signal (not shown) (e.g., a tearing signal after the first tearing signal). It may not be transmitted to the controller.
  • if the display driver does not receive the first frame from the output time of the first tearing signal (not shown) to the output time of the second tearing signal (not shown) (e.g., a tearing signal output after the first tearing signal), it can be confirmed that the first frame is delayed.
  • when a frame delay occurs, the display driver may transmit the first frame to the display controller at the time of output of the third tearing signal (not shown) (e.g., a tearing signal output after the second tearing signal).
  • the display driver may adjust the output timing of the tearing signal so that the tearing signal is output at a time adjusted by a specified time (T1) based on the output timing of the vertical synchronization signal.
  • the electronic device can relatively reduce the occurrence of frame delay. Referring to FIG. 5, the output time 561 of the tearing signal before adjustment and the output times 511, 512, and 513 of the tearing signal adjusted by the designated time (T1) are shown.
  • the display driver may wait to receive the first frame from the first time point 511 of the tearing signal 560 output from the display until the second time point 512.
  • the display driver may receive the first frame rendered by the GPU based on the synchronization signal 540 between the first time point 511 and the second time point 512.
  • the first time point 511, the second time point 512, and the third time point 513 may mean the time point at which the tearing signal is output from the display and rises.
  • the display driver may transmit the first frame to the display controller between the first time point 511 and the second time point 512.
  • the display controller may transmit the first frame to the display at a second time point 512.
  • the display can scan (or display) the first frame by reading it from the memory.
  • the display driver may wait to receive the second frame from the second time point 512 until the third time point 513.
  • the display driver may receive the second frame rendered by the GPU based on the synchronization signal 540.
  • the second frame may be a frame for updating some layers of the first frame.
  • accordingly, a screen tearing phenomenon may occur in the electronic device.
  • FIG. 6A is a flowchart illustrating a method of operating an electronic device to prevent screen tearing according to an embodiment.
  • the third processor 322 (e.g., the third processor 322 in FIG. 3A) (e.g., the display driver 383 in FIG. 3B) can confirm that screen tearing occurs (or is expected) when the display controller 350 (e.g., the display controller 350 of FIG. 3A) transmits the second frame to the display 360 (e.g., the display 360 of FIG. 3A) based on the tearing signal.
  • the second frame may refer to a frame rendered by the second processor 321 (eg, the second processor 321 in FIG. 3A) after the first frame.
  • the second frame may include only some layers rather than all layers.
  • screen tearing may mean that the time period in which the region of interest of the first frame is scanned (or displayed) on the display 360 and the time period in which the second frame is transmitted to the display 360 (or the memory 233) (e.g., the time interval for writing the second frame to the memory 233 in FIG. 2) at least partially match.
  • the display controller 350 may transmit the second frame to the display 360 after a specified time (eg, 2 ms) from the point of output of the tearing signal.
  • the operation of the display controller 350 transmitting the second frame to the display 360 may include the operation of the display 360 writing the second frame to the memory 233.
  • the designated time may mean the time between the output of the tearing signal and the start of scanning (or display) of the first frame on the display 360.
  • the designated time may be set so that the time interval in which the region of interest of the first frame is scanned (or displayed) on the display 360 and the time interval in which the second frame for updating the layer corresponding to the region of interest is transmitted to the display 360 do not match. However, this is an example, and embodiments of the present invention may set the designated time based on various methods.
  • the electronic device 301 can ensure that the time interval in which the region of interest of the first frame is scanned (or displayed) on the display 360 and the time interval in which the second frame for updating the layer corresponding to the region of interest is transmitted to the display 360 (or the time interval in which the second frame is written) do not match, so that a screen tearing phenomenon may not occur.
  • FIG. 6B is a timing diagram illustrating frame transmission, writing, and scanning processes according to an embodiment.
  • the display driver 383 may wait for reception of the first frame from the first time point 611 of the tearing signal 660 output from the display 360 (e.g., the display 360 in FIG. 3A) until the second time point 612. According to one embodiment, the display driver 383 may receive the first frame rendered by the second processor 321 (e.g., the second processor 321 in FIG. 3A) based on the synchronization signal 640.
  • the synchronization signal 640 (eg, Vsync signal) may be a clock signal including a rising point (eg, rising edge point) and a falling point (eg, falling edge point).
  • the tearing signal 660 may coincide with the period of the synchronization signal 640 (eg, Vsync signal).
  • the tearing signal may be a clock signal that rises at the rising point of the synchronization signal (eg, rising edge time) and falls at the falling point of the synchronization signal (eg, falling edge time).
  • the time when the tearing signal rises may be the time when the tearing signal is output.
  • the first time point 611, the second time point 612, and the third time point 613 mean the time point when the tearing signal 660 is output from the display 360 and rises (e.g., the rising edge time point). You can.
  • the display driver 383 may transmit the first frame to the display controller 350 between the first time point 611 and the second time point 612.
  • the display controller 350 may transmit the first frame to the display 360 at the second time point 612.
  • the display 360 may write the first frame to the memory 233 (e.g., the memory 233 in FIG. 2) at the second time point 612.
  • the display 360 may scan (or display) the first frame by reading it from the memory 233 at the falling point (e.g., falling-edge point) of the tearing signal 660.
  • the display driver 383 may wait to receive the second frame from the second time point 612 until the third time point 613. According to one embodiment, the display driver 383 may receive the second frame rendered by the second processor 321 based on the synchronization signal 640.
  • the second frame may be a frame for updating some layers of the first frame.
  • the display driver 383 may check whether the second frame is for updating a layer corresponding to the region of interest (e.g., bottom region) of the first frame. According to one embodiment, if it is confirmed that the second frame is to update the layer corresponding to the region of interest (e.g., bottom region) of the first frame, the display controller 350 may transmit the second frame to the display 360 after a specified time (T2) has elapsed from the third time point 613.
  • the designated time (T2) may be set so that the time interval in which the region of interest of the first frame is scanned (or displayed) on the display 360 and the time interval in which the second frame for updating the layer corresponding to the region of interest is transmitted to the display 360 do not match.
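The non-overlap condition for T2 can be checked numerically (a toy check under assumed windows; the scan window, write duration, and T2 value are all illustrative):

```python
# Hedged sketch: verify that delaying the second frame's write window
# by T2 makes it disjoint from the interval in which the ROI of the
# first frame is scanned out.

def intervals_overlap(a, b):
    """True if half-open intervals [a0, a1) and [b0, b1) intersect."""
    return a[0] < b[1] and b[0] < a[1]

TE_MS = 0.0            # third time point (TE output), taken as the origin
ROI_SCAN = (0.0, 2.0)  # assume the ROI is scanned in the first 2 ms
WRITE_LEN_MS = 1.5     # assumed duration of the second frame's write
T2_MS = 2.0            # designated time from the example

undelayed_write = (TE_MS, TE_MS + WRITE_LEN_MS)
delayed_write = (TE_MS + T2_MS, TE_MS + T2_MS + WRITE_LEN_MS)

assert intervals_overlap(undelayed_write, ROI_SCAN)    # tearing risk
assert not intervals_overlap(delayed_write, ROI_SCAN)  # disjoint after T2
```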
  • the designated time (T2) may be set to 2ms.
  • the display controller 350 may transmit the second frame to the display 360.
  • the display controller 350 may write the second frame to the memory 233 on the display 360 as at least part of a transmission operation of the second frame.
  • the display 360 may scan (or display) the second frame by reading it from the memory 233 at the falling point (e.g., falling-edge point) of the tearing signal 660.
  • the display controller 350 can adjust the transmission timing of the second frame when screen tearing of the display 360 is expected. Through this, the display controller 350 can prevent the screen of the display 360 from tearing.
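The timing decision described above for FIG. 6 can be pictured with a small sketch. This is purely illustrative: the function name, the boolean flag, and the use of 2 ms for T2 are assumptions made for the example, not the controller's actual interface.

```python
# Hypothetical sketch of the display controller's decision in FIG. 6: delay
# transmission of a partial-update frame by a fixed offset (T2) after the
# tearing-signal rising edge when the frame touches the region of interest.

T2_MS = 2.0  # designated delay after the tearing-signal edge (example value)

def transmit_time_ms(te_rising_edge_ms: float, updates_roi_layer: bool) -> float:
    """Return the time at which the second frame should be sent to the display.

    If the frame updates a layer in the region of interest (e.g., the bottom
    region), sending it at the tearing-signal edge could coincide with the scan
    of that region, so the send is deferred by T2_MS; otherwise it is sent at
    the edge itself.
    """
    if updates_roi_layer:
        return te_rising_edge_ms + T2_MS
    return te_rising_edge_ms
```

For example, with a tearing-signal rising edge at 10 ms, a partial update touching the region of interest would be sent at 12 ms, while any other frame would be sent at the edge itself.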
  • FIG. 7A is a flowchart illustrating a method of operating an electronic device to prevent screen tearing, according to an embodiment.
  • if the third processor 322 confirms that the second frame is to update a layer corresponding to a region of interest of a portion of the first frame (e.g., the bottom area of the first frame), the third processor 322 can confirm that screen tearing occurs (or is expected) when the second frame is transmitted to the display 360 through the display controller 350 (e.g., the display controller 350 in FIG. 3A) based on the tearing signal.
  • the third processor 322 (e.g., display driver 383) may transmit the second frame to the display controller 350 after a specified time (e.g., 2 ms) from the output time of the tearing signal.
  • the designated time may mean the time between the output time of the tearing signal and the time when the first frame begins to be scanned (or displayed) on the display 360 (e.g., the display 360 of FIG. 3A).
  • the designated time may be set so that the time interval in which a portion of the region of interest of the first frame is scanned on the display 360 does not coincide with the time interval in which the second frame for updating the layer corresponding to the region of interest is transmitted to the display 360.
  • the display controller 350 may transmit the second frame to the display 360.
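In the embodiment where the designated time equals the interval between the tearing-signal output and the start of scanning, the delay can be computed as a simple difference. The helper below is a hypothetical sketch; the timing values are illustrative, not taken from the patent.

```python
# Hypothetical helper for the FIG. 7A flow. In one embodiment the designated
# time is the interval between the tearing-signal output time and the time
# the first frame begins to be scanned; the driver waits that long before
# transmitting the partial-update frame to the display controller.

def specified_wait_ms(te_output_ms: float, scan_start_ms: float) -> float:
    """Delay (in ms) between the tearing-signal output and the frame send."""
    if scan_start_ms < te_output_ms:
        raise ValueError("scanning must begin at or after the tearing-signal output")
    return scan_start_ms - te_output_ms
```

With a tearing signal output at 10 ms and scanning starting at 12 ms, the driver would wait 2 ms, matching the example value given above.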
  • FIG. 7B is a timing diagram illustrating frame transmission, writing, and scanning processes according to an embodiment.
  • the display driver 383 may wait to receive the first frame from the first time point 711 to the second time point 712 based on the tearing signal 760 output from the display 360 (e.g., the display 360 in FIG. 3A).
  • the display driver 383 may receive the first frame rendered by the second processor 321 (e.g., the second processor 321 in FIG. 3A) at the second time point 712 based on the synchronization signal 740.
  • the synchronization signal 740 may be a vertical synchronization signal.
  • the first time point 711, the second time point 712, and the third time point 713 may mean the time points at which the tearing signal 760 output from the display 360 rises.
  • the display driver 383 may transmit the first frame to the display controller 350 between the first time point 711 and the second time point 712.
  • the display controller 350 may transmit the first frame to the display 360 at the second time point 712.
  • the display 360 may write the first frame to the memory 233 (e.g., the memory 233 in FIG. 2) at the second time point 712.
  • the display 360 may read the first frame from the memory 233 and scan (or display) the first frame at the falling point of the tearing signal 760.
  • the display driver 383 may wait to receive the second frame from the second time point 712 until the third time point 713. According to one embodiment, the display driver 383 may receive the second frame rendered by the second processor 321 based on the synchronization signal 740.
  • the second frame may be a frame for updating some layers of the first frame.
  • if the display driver 383 determines that the second frame is for updating the layer corresponding to the region of interest (e.g., the bottom region) of the first frame, the display driver 383 may transmit the second frame to the display controller 350 after a designated time (T2) has elapsed from the third time point 713.
  • the designated time T2 may be set so that the time interval in which a portion of the region of interest of the first frame is scanned (or displayed) on the display 360 does not coincide with the time interval in which the second frame for updating the layer corresponding to the region of interest is transmitted to the display 360.
  • the designated time (T2) may be set to 2ms.
  • the display controller 350 may transmit the second frame to the display 360.
  • the display 360 may write the second frame to the memory 233.
  • the display 360 may read the second frame from the memory 233 at the falling point of the tearing signal 760 and scan (or display) the second frame.
  • the display driver 383 can adjust the timing of transmitting the second frame to the display controller 350 when screen tearing of the display 360 is expected.
  • Through this, the electronic device 301 (e.g., the electronic device 301 in FIG. 3A) can prevent the screen of the display 360 from tearing.
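The condition that the two time intervals "do not match" can be modeled as a simple interval-overlap test. The sketch below assumes half-open millisecond intervals measured from the tearing-signal rising edge; every number used is invented for illustration.

```python
# Illustrative overlap check for the "time intervals do not coincide"
# condition. Intervals are half-open [start, end) in milliseconds.

def intervals_overlap(a_start, a_end, b_start, b_end):
    """True if [a_start, a_end) and [b_start, b_end) intersect."""
    return a_start < b_end and b_start < a_end

def tearing_expected(roi_scan, frame2_write):
    """Tearing is expected when the interval in which the region of interest
    is scanned overlaps the interval in which the second frame is written."""
    return intervals_overlap(*roi_scan, *frame2_write)
```

For instance, if the bottom region of the first frame is scanned at 14–16 ms, an undelayed write at 14–18 ms overlaps it, while a write delayed by 2 ms (16–20 ms) does not.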
  • FIG. 8A is a flowchart showing a method of operating a display controller to prevent screen tearing according to an embodiment.
  • if the third processor 322 confirms that the second frame is to update a layer corresponding to a portion of the region of interest of the first frame, the third processor 322 can confirm that screen tearing occurs (or is expected) when the second frame is transmitted to the display 360 through the display controller 350 based on the tearing signal.
  • the third processor 322 (e.g., display driver 383) may request a third frame for updating the entire first frame from the second processor 321 (e.g., the second processor 321 in FIG. 3A).
  • the display driver 383 may request the third frame, which entirely updates the first frame, from the second processor 321 when screen tearing is expected.
  • the second processor 321 may transmit the rendered third frame to the third processor 322 (e.g., display driver 383).
  • the third processor 322 may receive the third frame from the second processor 321. According to one embodiment, the third processor 322 may transmit the third frame to the display controller 350.
  • the display controller 350 may transmit the third frame to the display 360 at the time of output of the tearing signal.
  • the display 360 may write the third frame to the memory 233 (eg, the memory 233 in FIG. 2) at the time of output of the tearing signal.
  • the display 360 may scan (or display) the third frame.
  • the display controller 350 may transmit the third frame to the display 360 while the first frame is scanned (or displayed) on the display 360.
  • the electronic device 301 may write the upper area (or middle area) of the third frame to the memory 233 while the lower area of the first frame is being scanned (or displayed) on the display 360, thereby preventing screen tearing of the image displayed by the display 360.
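The fallback of FIG. 8A can be sketched as a small selection routine. The callback and the returned labels are hypothetical names for the example; the patent does not define such an interface.

```python
# Hypothetical sketch of the FIG. 8A fallback: when a partial-update (second)
# frame targets the region of interest and would therefore tear, the display
# driver requests a fully rendered (third) frame instead of sending the
# partial update.

def select_frame_to_send(second_frame_updates_roi: bool, render_full_frame):
    """Return the frame the driver forwards to the display controller.

    If tearing is expected (the second frame updates an ROI layer), fall back
    to a freshly rendered third frame that updates the entire first frame.
    """
    if second_frame_updates_roi:
        return render_full_frame()  # third frame: full update of frame 1
    return "second_frame"           # safe to send the partial update as-is
```

In this sketch `render_full_frame` stands in for the request to the second processor for the rendered third frame.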
  • FIG. 8B is a timing diagram illustrating frame transmission, writing, and scanning processes according to an embodiment.
  • the display driver 383 may wait to receive the first frame from the first time point 821 to the second time point 822 based on the tearing signal 860 output from the display 360 (e.g., the display 360 in FIG. 3A).
  • the display driver 383 may receive the first frame rendered by the second processor 321 (e.g., the second processor 321 in FIG. 3A) at the second time point 822 based on the synchronization signal 840.
  • the synchronization signal 840 may be a vertical synchronization signal.
  • the first time point 821, the second time point 822, and the third time point 823 may mean the time points at which the tearing signal 860 output from the display 360 rises.
  • the display driver 383 may transmit the first frame to the display controller 350 between the first time point 821 and the second time point 822.
  • the display controller 350 may transmit the first frame to the display 360 at a second time point 822.
  • the display 360 may read the first frame from the memory 233 and scan (or display) the first frame at the falling point of the tearing signal 860.
  • the display driver 383 may wait to receive the next frame (e.g., the second frame or the third frame) for updating the first frame from the second time point 822 until the third time point 823.
  • the display driver 383 may receive the second frame rendered by the second processor 321 based on the synchronization signal 840.
  • the second frame may be a frame for updating some layers of the first frame.
  • when the display driver 383 determines that the second frame is a frame for updating a partial layer of the first frame, the display driver 383 may request a third frame for updating the entire first frame from the second processor 321.
  • the display controller 350 may transmit a third frame to the display 360 at a third time point 823.
  • the display 360 may write the third frame to the memory 233 at the third time point 823.
  • the display 360 may read the third frame from the memory 233 at the falling point of the tearing signal 860 and scan (or display) the third frame.
  • while the lower area of the first frame is scanned (or displayed) on the display 360, the display controller 350 may transmit the upper area (or middle area) of the third frame to the display 360, and the display 360 can write the upper area (or middle area) of the third frame to the memory 233.
  • when screen tearing of the display 360 is expected, the display driver 383 may transmit a third frame capable of updating the entire first frame to the display 360 through the display controller 350. Through this, the display driver 383 can prevent the screen of the display 360 from tearing.
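Why writing the third frame from the top does not tear while the bottom of the first frame is still being scanned can be illustrated with a toy line-counter model. The panel height and per-tick rates below are invented example numbers, not values from the patent.

```python
# Toy line-count model (assumed numbers) of the FIG. 8B race: the third frame
# is written top-down from the tearing-signal edge while the panel is still
# scanning out the lower lines of the first frame. Tearing is avoided as long
# as the write pointer never reaches the line currently being scanned.

TOTAL_LINES = 2400  # hypothetical panel height in lines

def write_catches_scan(scan_start_line: int,
                       write_lines_per_tick: int,
                       scan_lines_per_tick: int) -> bool:
    """Advance both pointers each tick; True if the top-down write of the
    third frame ever reaches the line being scanned out of the first frame."""
    write_pos, scan_pos = 0, scan_start_line
    while scan_pos < TOTAL_LINES:
        if write_pos >= scan_pos:
            return True
        write_pos += write_lines_per_tick
        scan_pos += scan_lines_per_tick
    return False
```

Starting the top-down write while the scan is already at line 1800 of 2400 never collides even at equal rates, whereas starting both at line 0 with a faster write immediately would.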
  • An electronic device may include a memory, a processor, a display controller, and a display.
  • the memory may store at least one instruction that, when executed by the processor, causes the electronic device to transmit a first frame to the display controller based on a first tearing signal output from the display at a first output time.
  • the memory may store at least one instruction that, when executed by the processor, causes the electronic device to obtain a second frame for updating some layers of a plurality of layers related to the first frame.
  • the memory may store at least one instruction that, when executed by the processor, causes the electronic device to confirm that screen tearing occurs when the second frame is transmitted to the display through the display controller, based on confirming that the some layers among the plurality of layers are layers corresponding to a region of interest (ROI) of the first frame.
  • based on the confirmation that screen tearing occurs, the second frame may not be written to the memory included in the display while the first frame is being scanned on the display.
  • the memory may store at least one instruction that, when executed by the processor, causes the electronic device to transmit the second frame to the display controller after the first frame is scanned on the display, based on the operation of confirming that screen tearing occurs.
  • the memory may store at least one instruction that, when executed by the display controller, causes the electronic device to transmit the second frame to the display after the first frame is scanned on the display, based on the operation of confirming that screen tearing occurs.
  • the memory may store at least one instruction that, when executed by the display controller, causes the electronic device to transmit the second frame to the display after a first designated time from a second output time of a second tearing signal output after the first tearing signal, based on the operation of confirming that screen tearing occurs.
  • the memory may store at least one instruction that, when executed by the display controller, causes the electronic device to write the second frame to the memory of the display after a second designated time from the output time of the second tearing signal output after the first tearing signal.
  • the region of interest may include a lower area of the first frame displayed on the display.
  • the memory may store at least one instruction that, when executed by the processor, causes the electronic device to transmit the second frame to the display controller after a third designated time from the second output time of the second tearing signal output after the first tearing signal, based on the operation of confirming that screen tearing occurs.
  • the memory may store at least one instruction that, when executed by the processor, causes the electronic device to obtain a third frame for updating the plurality of layers related to the first frame, based on the operation of confirming that screen tearing occurs.
  • the memory may store at least one instruction that, when executed by the processor, causes the electronic device to transmit the third frame to the display controller at the second output time of the second tearing signal output after the first tearing signal.
  • the memory may store at least one instruction that, when executed by the display controller, causes the electronic device to transmit the third frame to the display while the first frame is being scanned on the display.
  • A method of operating an electronic device according to an embodiment may include an operation of transmitting a first frame to a display controller included in the electronic device based on a first tearing signal output at a first output time.
  • a method of operating an electronic device may include obtaining a second frame for updating some of a plurality of layers related to the first frame.
  • a method of operating an electronic device may include an operation of confirming that screen tearing occurs when the second frame is transmitted to the display through the display controller, based on confirming that the second frame corresponds to a region of interest (ROI) of the first frame.
  • the second frame may not be written to the memory included in the display while the first frame is being scanned on the display.
  • a method of operating an electronic device may include an operation of transmitting the second frame to the display controller after the first frame is scanned on the display.
  • a method of operating an electronic device may include an operation of the display controller transmitting the second frame to the display after the first frame is scanned on the display.
  • a method of operating an electronic device may include an operation of the display controller transmitting the second frame to the display after a first designated time from a second output time of a second tearing signal output after the first tearing signal, based on the operation of confirming that screen tearing occurs.
  • a method of operating an electronic device may include an operation of the display controller writing the second frame to the memory of the display after a second designated time from the output time of the second tearing signal output after the first tearing signal.
  • the region of interest may include a lower area of the first frame displayed on the display.
  • a method of operating an electronic device may include an operation of transmitting the second frame to the display controller after a third designated time from the second output time of the second tearing signal output after the first tearing signal.
  • a method of operating an electronic device may include an operation of obtaining a third frame for updating the entire first frame, based on the operation of confirming that screen tearing occurs.
  • a method of operating an electronic device may include an operation of transmitting the third frame to the display controller at a second output time of a second tearing signal output after the first tearing signal.
  • a method of operating an electronic device may include an operation of the display controller transmitting the third frame to the display while the first frame is being scanned on the display.
  • the at least one instruction, when executed by a processor of an electronic device, may cause the electronic device to perform an operation of transmitting a first frame to a display controller included in the electronic device, based on a first tearing signal output at a first output time from a display included in the electronic device.
  • the at least one instruction, when executed by a processor of an electronic device, may cause the electronic device to perform an operation of obtaining a second frame for updating some layers of a plurality of layers related to the first frame.
  • the at least one instruction, when executed by a processor of an electronic device, may cause the electronic device to perform an operation of confirming that screen tearing occurs when the second frame is transmitted to the display through the display controller, based on confirmation that the second frame is a layer corresponding to the region of interest (ROI) of the first frame.
  • In a storage medium storing at least one computer-readable instruction according to an embodiment, based on confirmation that screen tearing occurs, the second frame may not be written to the memory included in the display while the first frame is being scanned on the display.
  • Electronic devices may be of various types.
  • Electronic devices may include, for example, portable communication devices (e.g., smartphones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliances.
  • Electronic devices according to embodiments of this document are not limited to the above-described devices.
  • Terms such as "first", "second", or "first or second" may be used simply to distinguish one component from another, and do not limit the components in other respects (e.g., importance or order).
  • When one (e.g., first) component is said to be "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively", it means that the component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • The term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit. A module may be an integrally formed part, or a minimum unit of the part or a portion thereof, that performs one or more functions. For example, according to one embodiment, the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (e.g., program 140) including one or more instructions stored in a storage medium (e.g., internal memory 136 or external memory 138) readable by a machine (e.g., electronic device 101, 201, 301). For example, a processor (e.g., processor 120, 320) of the device may call at least one of the one or more instructions stored in the storage medium and execute it, enabling the device to operate to perform at least one function according to the called instruction.
  • the one or more instructions may include code generated by a compiler or code that can be executed by an interpreter.
  • a storage medium that can be read by a device may be provided in the form of a non-transitory storage medium.
  • Here, 'non-transitory' simply means that the storage medium is a tangible device and does not contain signals (e.g., electromagnetic waves); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where it is temporarily stored.
  • Computer program products are commodities and can be traded between sellers and buyers.
  • The computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • at least a portion of the computer program product may be at least temporarily stored or temporarily created in a machine-readable storage medium, such as the memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (e.g., module or program) of the above-described components may include a single or plural entity, and some of the plurality of entities may be separately placed in other components.
  • one or more of the components or operations described above may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, multiple components (e.g., modules or programs) may be integrated into a single component. In this case, the integrated component may perform one or more functions of each of the multiple components identically or similarly to the way they were performed by the corresponding component prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, iteratively, or heuristically, or one or more of the operations may be executed in a different order, or omitted. Alternatively, one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

According to one embodiment, an electronic device comprises a memory, a processor, a display controller, and a display, the memory storing at least one instruction whose execution by the processor is intended to cause the electronic device to: transmit a first frame to the display controller; acquire a second frame for updating some layers of a plurality of layers associated with the first frame; and, when it is confirmed that the some layers of the plurality of layers are layers corresponding to a region of interest (ROI) of the first frame, confirm that screen tearing occurs if the second frame is transmitted to the display. When it is confirmed that screen tearing occurs, the second frame may not be written to the memory included in the display while the first frame is being scanned on the display.
PCT/KR2023/018492 2022-11-23 2023-11-16 Dispositif électronique comprenant un dispositif d'affichage et procédé de fonctionnement associé WO2024112005A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP23805839.0A EP4398233A1 (fr) 2022-11-23 2023-11-16 Dispositif électronique comprenant un dispositif d'affichage et procédé de fonctionnement associé

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2022-0158050 2022-11-23
KR20220158050 2022-11-23
KR10-2023-0003537 2023-01-10
KR1020230003537A KR20240076338A (ko) 2022-11-23 2023-01-10 디스플레이를 포함하는 전자 장치 및 이의 동작 방법

Publications (1)

Publication Number Publication Date
WO2024112005A1 true WO2024112005A1 (fr) 2024-05-30

Family

ID=88920839

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/018492 WO2024112005A1 (fr) 2022-11-23 2023-11-16 Dispositif électronique comprenant un dispositif d'affichage et procédé de fonctionnement associé

Country Status (2)

Country Link
EP (1) EP4398233A1 (fr)
WO (1) WO2024112005A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080062492A (ko) * 2006-12-29 2008-07-03 엘지디스플레이 주식회사 액정표시장치와 그 구동방법
KR20190101659A (ko) * 2018-02-23 2019-09-02 삼성전자주식회사 디스플레이 패널을 통해 표시되는 콘텐트의 저장을 제어하기 위한 전자 장치 및 방법
KR20200002626A (ko) * 2018-06-29 2020-01-08 에이알엠 리미티드 데이터 처리 시스템
KR20220079381A (ko) * 2020-12-04 2022-06-13 삼성전자주식회사 디스플레이의 잔상을 예측 및 보상하는 전자 장치 및 방법
KR20220081161A (ko) * 2020-12-08 2022-06-15 삼성전자주식회사 디스플레이 구동 회로 및 이의 동작 방법


Also Published As

Publication number Publication date
EP4398233A1 (fr) 2024-07-10


Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2023805839

Country of ref document: EP

Effective date: 20231123

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23805839

Country of ref document: EP

Kind code of ref document: A1