WO2017010822A1 - Display driving circuit, display driving method, and electronic device


Info

Publication number
WO2017010822A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
display
partial image
driving circuit
display driving
Prior art date
Application number
PCT/KR2016/007658
Other languages
English (en)
Korean (ko)
Inventor
배종곤
김동휘
한동균
김태성
이요한
염동현
마테우스 파리아스 미란다
김한여울
김호진
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority to EP16824739.3A (EP3324388B1)
Priority to CN201680041421.6A (CN107851415B)
Priority to US15/743,899 (US10672097B2)
Priority claimed from KR1020160089039A (KR102576961B1)
Publication of WO2017010822A1
Priority to US16/889,179 (US11017496B2)



Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/363Graphics controllers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2310/00Command of the display device
    • G09G2310/04Partial updating of the display screen
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2330/00Aspects of power supply; Aspects of display protection and defect management
    • G09G2330/02Details of power systems and of start or stop of display operation
    • G09G2330/021Power management, e.g. power saving
    • G09G2330/022Power management, e.g. power saving in absence of operation, e.g. no data being entered during a predetermined time
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/18Use of a frame buffer in a display terminal, inclusive of the display panel
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/08Details of image data interface between the display device controller and the data line driver circuit

Definitions

  • the present invention relates to a display drive circuit, a display drive method, and an electronic device provided with the display drive circuit.
  • portable electronic devices such as smartphones and tablet PCs may support various functions such as Internet access and multimedia content playback in addition to call and message transmission and reception functions.
  • the electronic device is also implemented in the form of a wearable device attached to a part of the user's body.
  • the wearable device may have a form such as a wrist watch attached to the user's wrist or glasses mounted on the user's head.
  • an electronic device implemented in various forms generally includes a display and may visually provide various contents (eg, images, videos, etc.) to the user through the display.
  • the display includes a display panel and a display driver integrated circuit (DDI) for driving the panel.
  • the display driving circuit mounted in the electronic device may receive the image data from the processor and drive the display panel.
  • the display driving circuit may display an image on the display panel at a preset number of frames per second (for example, 60 frames per second).
  • in order for the electronic device to provide the user with useful information (e.g., a clock, weather, news articles, etc.) dynamically or continuously, the processor has to generate image data for the entire display panel every frame and provide it to the display panel via the display driving circuit. Since this corresponds substantially to video playback, the processor consumes a relatively large amount of power to generate a large amount of image data in a short time.
  • the processor may stop generating new image data in the "sleep mode" to reduce power consumption. Since no newly generated image data is loaded into the display driving circuit, the display driving circuit can only provide an image corresponding to one previously stored frame. Therefore, the electronic device can only provide a fixed image in the sleep mode and cannot provide images dynamically or continuously.
  • Various embodiments of the present invention provide a display driving circuit, a display driving method, and an electronic device including the display driving circuit, which are capable of specifying (or selecting) some of the image data stored in a graphics RAM by themselves, without intervention of a processor, even when the electronic device is in a sleep mode.
  • An electronic device according to an embodiment of the present invention may include a display, a processor for generating image data, a Graphic Random Access Memory (GRAM) for storing the image data, and a display driving circuit for driving the display.
  • the display driving circuit may be configured to select a portion of the image data and output the selected portion to a designated area of the display.
  • a display driving circuit for driving a display according to an embodiment may include a graphics RAM for storing image data generated by a processor, a control module for selecting a portion of the image data, and a timing controller for supplying an image signal corresponding to the selected portion to the display. The control module may control the timing controller such that the selected portion is output to a designated area of the display.
  • a method of driving a display according to an embodiment may include: storing, by a processor, image data in a graphics RAM; selecting, by a display driving circuit, a portion of the image data stored in the graphics RAM; and outputting, by the display driving circuit, the selected portion to a designated area of the display.
  • the display driving circuit may specify (or select) at least some image data (eg, partial image data) among the image data stored in the graphic RAM and output the same to the display panel.
  • the processor may provide the image data to the graphic RAM and then maintain the sleep mode without being involved in the operation of the display driving circuit.
  • in this way, a low-power AOD (Always On Display), that is, a self-display that operates without processor intervention, may be implemented.
  • FIG. 1A illustrates a smartphone to which various embodiments of the present invention are applied.
  • FIG. 1B illustrates a smart watch to which various embodiments of the present disclosure are applied.
  • FIG. 2 is a block diagram of an electronic device according to an embodiment of the present disclosure.
  • FIG. 3A is a block diagram of a display driving circuit according to an exemplary embodiment of the present invention.
  • FIG. 3B is a block diagram of an electronic device including a display driving circuit according to another exemplary embodiment of the present disclosure.
  • FIG. 4 is a flowchart illustrating a display driving method according to an exemplary embodiment.
  • FIG. 5 illustrates that image data is provided in a specified order according to an embodiment of the present invention.
  • FIG. 6A illustrates an example in which a display driving method according to an embodiment of the present invention is applied to a smartphone.
  • FIG. 6B illustrates an example in which the display driving method according to an embodiment of the present invention is applied to a smart watch.
  • FIG. 7 illustrates that image data is provided based on control information according to an embodiment of the present invention.
  • FIG. 8A illustrates an example in which a display driving method according to an embodiment of the present invention is applied to a smartphone.
  • FIG. 8B illustrates an example in which the display driving method according to an embodiment of the present invention is applied to a smart watch.
  • FIG. 9 illustrates output of image data according to an embodiment of the present invention.
  • FIG. 10 is a diagram for describing an output of image data according to an exemplary embodiment.
  • FIG. 11 illustrates an electronic device according to various embodiments of the present disclosure.
  • FIG. 12 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • expressions such as "have", "may have", "include", or "may contain" indicate the presence of a corresponding feature (e.g., a numerical value, function, operation, or component such as a part) and do not exclude the presence of additional features.
  • expressions such as “A or B”, “at least one of A or / and B”, or “one or more of A or / and B” may include all possible combinations of items listed together.
  • "A or B", "at least one of A and B", or "at least one of A or B" may refer to (1) including at least one A, (2) including at least one B, or (3) including both at least one A and at least one B.
  • expressions such as "first" and "second" as used herein may modify various components regardless of order and/or importance, and are used to distinguish one component from another without limiting the components.
  • the first user device and the second user device may represent different user devices regardless of the order or importance.
  • the first component may be called a second component, and similarly, the second component may be renamed to the first component.
  • when one component (e.g., a first component) is referred to as being "(functionally or communicatively) coupled with/to" or "connected to" another component (e.g., a second component), it should be understood that the component may be directly connected to the other component or may be connected through yet another component (e.g., a third component).
  • the expression "configured to" used in this document may be used interchangeably with, for example, "suitable for", "having the capacity to", "designed to", "adapted to", "made to", or "capable of", depending on the situation.
  • the term "configured to" may not necessarily mean only "specifically designed to" in hardware. Instead, in some situations, the expression "a device configured to" may mean that the device is "capable of" operating together with other devices or components.
  • the phrase "a processor configured (or set) to perform A, B, and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a general-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.
  • An electronic device according to various embodiments may include, for example, at least one of a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a personal digital assistant (PDA), a portable multimedia player (PMP), an MP3 player, a mobile medical device, a camera, or a wearable device.
  • a wearable device may include at least one of an accessory type (e.g., a watch, ring, bracelet, anklet, necklace, glasses, contact lens, or head-mounted device (HMD)), a fabric- or clothing-integrated type (e.g., an electronic garment), a body-attached type (e.g., a skin pad or tattoo), or a bio-implantable type (e.g., an implantable circuit).
  • the electronic device may be a home appliance.
  • the home appliance may include at least one of, for example, a television, a digital video disk (DVD) player, an audio system, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave, a washing machine, an air purifier, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™, PlayStation™), an electronic dictionary, an electronic key, a camcorder, or an electronic picture frame.
  • the electronic device may include at least one of various medical devices (e.g., various portable medical measuring devices (such as a blood glucose meter, heart rate monitor, blood pressure monitor, or thermometer), magnetic resonance angiography (MRA), magnetic resonance imaging (MRI), computed tomography (CT), imaging equipment, or ultrasound equipment), a navigation device, a global navigation satellite system (GNSS), an event data recorder (EDR), a flight data recorder (FDR), an automotive infotainment device, ship electronic equipment (e.g., a ship navigation device, a gyro compass, etc.), avionics, a security device, a vehicle head unit, an industrial or home robot, an automatic teller machine (ATM) of a financial institution, a point of sales (POS) terminal of a store, or an Internet of Things device (e.g., a light bulb, a sensor, an electricity or gas meter, a sprinkler device, a fire alarm, a thermostat, a street light, a toaster, exercise equipment, a hot water tank, a heater, a boiler, etc.).
  • the electronic device may include at least one of a piece of furniture or part of a building/structure, an electronic board, an electronic signature receiving device, a projector, or various measuring instruments (e.g., a water, electricity, gas, or radio wave meter).
  • the electronic device may be one or a combination of the aforementioned various devices.
  • An electronic device according to an embodiment may be a flexible electronic device.
  • the electronic device according to an embodiment of the present disclosure is not limited to the above-described devices, and may include a new electronic device according to technology development.
  • the term user may refer to a person who uses an electronic device or a device (eg, an artificial intelligence electronic device) that uses an electronic device.
  • FIG. 1A illustrates a smartphone to which various embodiments of the present invention are applied.
  • the electronic device may include, for example, a smartphone 11, 12, 13, and 14.
  • the electronic device may support a wake-up mode, which is a mode in which a user can use the functions of the electronic device, and a sleep mode, which is a mode in which the electronic device waits to be used.
  • in the wake-up mode, the various hardware modules and/or software modules included in the electronic device may receive sufficient power from the battery (e.g., the power required to express colors of the preset gray scale) so as to fully perform their functions.
  • the display may receive sufficient power in the wakeup mode to provide various contents requested by a user
  • the processor may provide various functions of the electronic device based on sufficient power supply.
  • in the sleep mode, various hardware modules and/or software modules included in the electronic device may be deactivated or may be supplied with minimal power to perform only specified limited functions.
  • for example, when the electronic device is switched to the sleep mode, the photo and video capturing functions of the camera module may be deactivated, and the processor may drive only limited functions of an application program.
  • The smartphones 11, 12, 13, and 14 shown in FIG. 1A may all be operating in the sleep mode.
  • the smartphone 11 operating in the sleep mode may output a digital clock indicating the current time, a date, and weather information to a designated area of the display panel.
  • the smartphone 12 can output an analog clock indicating the current time and a calendar displaying the current date to a designated area of the display panel.
  • the smartphone 13 operating in a portrait display mode may output a news article to a designated area of the display panel (eg, the bottom of the display panel) in the sleep mode.
  • the smartphone 14 operating in a landscape display mode may output a news article to a designated area of the display panel (for example, a curved area provided on the side of the electronic device).
  • each pixel for the current time, date, weather, and news article output on the respective display panels may be displayed in a specified color, and the remaining pixels may be set to a specified color (e.g., black). For example, when the display panel corresponds to an OLED panel, the remaining pixels may be turned off.
  • FIG. 1B illustrates a smart watch to which various embodiments of the present disclosure are applied.
  • an electronic device to which various embodiments of the present disclosure are applied may include, for example, a smart watch 15 and 16.
  • the smart watches 15 and 16 may support a wake-up mode and a sleep mode like the smartphones 11, 12, 13, and 14.
  • the smart watches 15 and 16 shown in FIG. 1B may be operating in a sleep mode.
  • the smart watch 15 operating in the sleep mode may output a news article to a designated area of the display panel (eg, one area of the display area).
  • the smart watch 16 operating in the sleep mode may output a digital clock indicating the current time, a date, and weather information to a designated area of the display panel (e.g., the entire area of the display panel).
  • each pixel for the current time, date, weather, and news article output on each display panel may be displayed in a specified color, and the remaining pixels may be displayed in black. For example, when the display panel corresponds to an OLED panel, the remaining pixels may be turned off.
  • the electronic device may provide useful information (e.g., time, date, weather, news, etc.) to the user on the display panel even when operating in the sleep mode.
  • the electronic device operating in the sleep mode may be switched to the wakeup mode in response to a predetermined user input (for example, pressing a home button / power button or touching a touch panel).
  • the computational load on the processor may be significantly reduced.
  • when the display panel is an OLED panel, since only the pixels outputting the useful information are turned on, power consumption for outputting the useful information can be minimized. This can also greatly reduce overall battery power consumption.
  • the display output method as described above may be referred to as always-on display (AOD) from the viewpoint that useful information is always provided.
  • the electronic device may have a configuration as shown in FIG. 2.
  • the electronic device may include a display driving circuit having a configuration as shown in FIG. 3A or 3B.
  • FIG. 2 is a block diagram of an electronic device according to an embodiment of the present disclosure.
  • an electronic device 1000 may include a display driving circuit 100, a display panel 200, a processor (e.g., an application processor (AP)) 300, a communication processor (CP) 400, a sensor hub 500, a touch controller IC 600, and the like.
  • the display driving circuit 100 may drive the display panel 200.
  • outputting image data to a "display” may be referred to interchangeably with outputting image data to the display panel 200.
  • the display driving circuit 100 may supply an image signal corresponding to the image data received from a processor (host) (for example, the AP 300, the CP 400, the sensor hub 500, or the touch controller IC 600) to the display panel 200 at a preset frame rate.
  • the display driving circuit 100 may include at least a Graphic Random Access Memory (GRAM) 110 and a control module 120 (see FIGS. 3A and 3B for a more detailed description of the display driving circuit 100).
  • the graphics RAM 110 may store image data provided from the processor (eg, the AP 300, the CP 400, the sensor hub 500, and the touch controller IC 600).
  • the graphics RAM 110 may include a memory space corresponding to the resolution and / or number of color gradations of the display panel 200.
  • the graphic RAM 110 may be referred to as a frame buffer or a line buffer.
  • the image data may correspond to one piece of image data formed by concatenating a plurality of independent partial image data (see, for example, the graphic RAM 110a of FIG. 6A, the graphic RAM 110w of FIG. 6B, and the graphic RAM 110c of FIG. 8).
  • the image data may include at least one low resolution image data having a resolution lower than that of the display panel 200.
  • the image data may be encoded in a manner specified by the processor (eg, AP 300, CP 400, sensor hub 500, and touch controller IC 600) and then stored in the graphic RAM 110.
  • the control module 120 may be configured to select a portion of the image data stored in the graphic RAM 110 and output the selected portion to a designated area of the display panel 200. In this case, the control module 120 may output the selected part to the designated area of the display panel 200 by the operation of the display driving circuit 100 itself. Meanwhile, in this document, operations performed by the control module 120 may be understood as operations performed by the display driving circuit 100.
  • the control module 120 of the display driving circuit 100 may be configured to select at least one of the plurality of partial image data stored in the graphics RAM 110 and output the selected at least one partial image data to the display panel 200.
  • the control module 120 may use a data address on the graphics RAM 110 and / or a data size of the (partial) image data to be output in selecting the partial image data to be output.
  • the control module 120 of the display driving circuit 100 may select image data corresponding to a specified data size from the specific data address as the image data to be output.
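  • For illustration only, the selection described above can be modeled in software as reading a byte range out of the graphic RAM. The names and layout below (gram_t, select_partial_image) are hypothetical and are not part of the disclosed hardware; a real display driving circuit performs this step in logic rather than with memcpy.

      #include <stddef.h>
      #include <stdint.h>
      #include <string.h>

      /* Hypothetical model of the graphic RAM: a flat byte array. */
      typedef struct {
          const uint8_t *base;   /* start of the GRAM contents   */
          size_t         size;   /* total GRAM capacity in bytes */
      } gram_t;

      /* A partial image is identified only by its start data address on the
       * GRAM and its data size; copy that range to the buffer that feeds the
       * timing controller.  Returns 0 on success, -1 if the requested range
       * lies outside the GRAM. */
      static int select_partial_image(const gram_t *gram,
                                      size_t start_addr, size_t data_size,
                                      uint8_t *out)
      {
          if (start_addr > gram->size || data_size > gram->size - start_addr)
              return -1;
          memcpy(out, gram->base + start_addr, data_size);
          return 0;
      }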
  • the control module 120 of the display driving circuit 100 may output two or more partial image data to different regions.
  • the graphic RAM 110 may store a first group of partial image data and a second group of partial image data.
  • the control module 120 of the display driving circuit 100 may select at least one partial image data among the partial image data of the first group and at least one partial image data among the partial image data of the second group. Thereafter, the control module 120 of the display driving circuit 100 may output the at least one partial image data selected from the partial image data of the first group to the first region of the display panel 200, and the at least one partial image data selected from the partial image data of the second group to the second region of the display panel 200.
  • the control module 120 of the display driving circuit 100 may change the at least one partial image data to be output to a designated area of the display panel 200 in a predetermined order. That is, the control module 120 of the display driving circuit 100 may sequentially select one of the plurality of partial image data stored in the graphics RAM 110 according to a specified order (or a random order) and output it to the designated area of the display panel 200. In this way, a certain animation effect can be achieved.
  • the control module 120 may sequentially change the partial image data by shifting the start data address on the graphic RAM 110 of the partial image data to be read (scan-read) by a predetermined interval at a predetermined period.
  • the preset period and interval may be set based on a user's setting.
  • similarly, the control module 120 of the display driving circuit 100 may select and output the partial image data of the second group (or the partial image data of the first group) according to a specified order.
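  • A minimal sketch, under the assumption that the equally sized partial images are stored back to back in the GRAM, of how the start data address could be advanced once per preset period to obtain the animation effect described above; all identifiers are illustrative, not part of the disclosure.

      #include <stddef.h>

      /* Hypothetical cycling state for N equally sized partial images that
       * are concatenated in the GRAM starting at base_addr. */
      struct aod_cycle {
          size_t base_addr;      /* GRAM address of partial image 0      */
          size_t partial_size;   /* bytes per partial image              */
          size_t count;          /* number of partial images (N)         */
          size_t current;        /* index of the partial image on screen */
      };

      /* Called once per preset period (e.g., driven by the DDI's internal
       * oscillator): advance to the next partial image in the specified
       * order, wrapping around, and return its start data address. */
      static size_t aod_next_start_address(struct aod_cycle *c)
      {
          c->current = (c->current + 1) % c->count;
          return c->base_addr + c->current * c->partial_size;
      }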
  • the processor may generate the image data, encode the generated image data in a designated method (eg, the DSC (Display Stream Compression) method defined by the Video Electronics Standards Association), and store the image data in the graphics RAM 110.
  • the control module 120 of the display driving circuit 100 may select a portion of the encoded image data (s) stored in the graphic RAM 110, decode the selected portion, and output the decoded portion to a designated area of the display panel 200.
  • the graphic RAM 110 may contain two or more (e.g., n) images having a size corresponding to the display panel 200 in a compressed form, and thus the range of images that can be selected by the display driving circuit 100 (the control module 120) may increase n times.
  • the processor may generate low resolution image data having a resolution lower than that of the display panel 200 and store the generated low resolution image data in the graphics RAM 110. Since the low resolution image data has a smaller data size than image data corresponding to the full resolution of the display panel 200, the graphic RAM 110 may store one or more low resolution image data in a concatenated form. For example, the graphic RAM 110 may store image data whose resolution is reduced to 1/m of the resolution of the display panel 200 in the form of m concatenated images.
  • the control module 120 of the display driving circuit 100 may select a portion from the one or more low resolution image data, enlarge the selected portion at a specified magnification, and output the selected portion to the designated area of the display panel 200. Since the graphic RAM 110 may include two or more low resolution image data (eg, m pieces), more various images may be output to the display panel 200 only by the operation of the display driving circuit 100.
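  • The enlargement step can be pictured as a simple nearest-neighbour up-scaling of the selected low-resolution portion by the integer factor m. The sketch below assumes one byte per pixel purely for brevity; an actual up-scaler is a hardware block and may use a different interpolation.

      #include <stddef.h>
      #include <stdint.h>

      /* Enlarge a w x h low-resolution region (1 byte per pixel) by an
       * integer factor m using nearest-neighbour sampling.  dst must be
       * large enough to hold (w*m) x (h*m) pixels. */
      static void upscale_nearest(const uint8_t *src, size_t w, size_t h,
                                  uint8_t *dst, size_t m)
      {
          for (size_t y = 0; y < h * m; y++)
              for (size_t x = 0; x < w * m; x++)
                  dst[y * (w * m) + x] = src[(y / m) * w + (x / m)];
      }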
  • the control module 120 of the display driving circuit 100 may select and output some of the image data stored in the graphics RAM 110 based on control information received from a processor (e.g., the AP 300, the CP 400, the sensor hub 500, or the touch controller IC 600).
  • the control information may include a data address on the graphics RAM 110 and / or information about the size of the partial image data to be output.
  • the processor may provide the control module 120 of the display driving circuit 100 with the data address and/or data size corresponding to the numbers and symbols associated with the "digital clock" as the control information.
  • the control module 120 may select and output image data of numbers and symbols related to the "digital clock" stored in the graphic RAM 110 based on the data address and / or data size.
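  • The control information itself can be very small: essentially one GRAM address/size pair (plus a destination position) per element to be composed, such as the digits and colon of the "digital clock". The packet layout below is purely illustrative; this document does not define a register map for the display driving circuit.

      #include <stdint.h>

      /* Hypothetical control-information entry sent over the low speed
       * serial interface (e.g., SPI or I2C): where a glyph lives in the
       * GRAM and where it should appear on the display panel. */
      struct aod_ctrl_entry {
          uint32_t gram_addr;    /* start data address of the glyph     */
          uint32_t data_size;    /* size of the glyph image data, bytes */
          uint16_t panel_x;      /* destination position on the panel   */
          uint16_t panel_y;
      };

      /* A "digital clock" such as HH:MM could be described by five
       * entries (four digits and a colon). */
      struct aod_ctrl_info {
          uint8_t               entry_count;
          struct aod_ctrl_entry entry[8];
      };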
  • the control module 120 of the display driving circuit 100 may be configured to dynamically output a part of the selected image data (eg, the selected partial image data).
  • the control module 120 may continuously provide the selected (partial) image data to the display panel 200 by shifting it in block units using a timing controller (not shown) (so-called panel self-refresh).
  • image effects such as fade-in and fade-out can be achieved.
  • when the (partial) image data to be output is a news article, an effect similar to a news ticker that constantly scrolls in one direction can be achieved.
  • the display panel 200 may display various information (eg, multimedia data or text data) to the user.
  • the display panel 200 may include, for example, a liquid crystal display (LCD) panel or an active-matrix organic light-emitting diode (AM-OLED) panel.
  • the display panel 200 may be implemented to be, for example, flexible, transparent, or wearable.
  • the display panel 200 may be included in, for example, a cover of a case electrically coupled to the electronic device 1000.
  • the display panel 200 may receive an image signal corresponding to the image data from the display driving circuit 100 and display a screen according to the image data.
  • a plurality of data lines and a plurality of gate lines may cross each other, and a plurality of pixels may be disposed in the crossing area.
  • each of the plurality of pixels may include at least one switching element (eg, an FET) and one OLED.
  • Each pixel may receive an image signal or the like from the display driving circuit 100 at a predetermined timing to generate light.
  • the “processor” may include the AP 300, the CP 400, the sensor hub 500, and / or the touch controller IC 600. According to various embodiments of the present disclosure, the processor may be referred to as a “host”.
  • the AP 300 may receive a command from other components through, for example, an internal bus, decode the received command, and execute an operation or data generation and processing according to the decoded command.
  • the CP 400 may perform a function of managing a data link and converting a communication protocol in communication between the electronic device 1000 and other electronic devices connected through a network.
  • the CP 400 may provide communication services such as voice call, video call, text message (eg, SMS, MMS) or packet data to the user.
  • the sensor hub 500 may include a micro controller unit (MCU) and may be connected to at least one sensor 510 or 520.
  • the sensor hub 500 may collect sensing information detected by various sensors 510 and 520 and control operations of the various sensors 510 and 520.
  • the sensors 510 and 520 may include, for example, a temperature / humidity sensor, a biometric sensor, a barometric pressure sensor, a gyro sensor, and the like.
  • the touch controller IC 600 may control, for example, the touch panel 610 coupled to the display panel 200.
  • the touch controller IC 600 may process touch gesture information input from the touch panel 610 or control the operation of the touch panel 610.
  • the touch controller IC 600 may include a driver circuit, a sensor circuit, a control logic, an oscillator, a delay table, an analog-to-digital converter, an MCU, and the like.
  • the processor (e.g., the AP 300, the CP 400, the sensor hub 500, or the touch controller IC 600) may generate image data according to various embodiments and provide the generated image data to the display driving circuit 100 (the graphics RAM 110).
  • the image data may include image data in which a plurality of partial image data are concatenated, or image data in which low resolution image data having a lower resolution than that of the display panel 200 is concatenated.
  • the processor may encode the generated image data in a specified manner and provide the generated image data to the display driving circuit 100 (graphic RAM 110).
  • the processor may be configured to enter a sleep mode after providing the image data to the display driving circuit 100. That is, the processor may not intervene in the operation of the display driving circuit 100 after storing the image data in the graphics RAM 110 of the display driving circuit 100 (except for transmitting control information for selecting a part of the image data).
  • the processor may provide the image data to the graphics RAM 110 of the display driving circuit 100 through a high speed serial interface (HiSSI), for example, a mobile industry processor interface (MIPI).
  • the processor may transmit control information for selecting a part of the image data through a low speed serial interface (LoSSI), for example, a serial peripheral interface (SPI) and an inter-integrated circuit (I2C).
  • the AP 300 may generate image data to be output through the display panel 200.
  • the AP 300 may generate text images regarding the content of the news article as image data to be output through the display panel 200.
  • the AP 300 may transmit, for example, a data address on the graphic RAM 110 of the text images regarding the content of the news article to the display driving circuit 100 via the low speed serial interface as control information (CTRL Info.).
  • the display driving circuit 100 may output an image related to the content of the news article to a designated area of the display panel 200 according to the control information.
  • the CP 400 may generate image data to be output through the display panel 200 based on various communication services. For example, when the CP 400 receives the text message, the CP 400 may generate icons of the text message and text images regarding the contents of the text message as image data to be output through the display panel 200.
  • the CP 400 may transmit, for example, a data address on the graphic RAM 110 of the icon and text images to the display driving circuit 100 through the low speed serial interface as control information.
  • the display driving circuit 100 may output an icon of a text message and the text images to the display panel 200 according to the control information.
  • the sensor hub 500 may generate image data to be output through the display panel 200 based on sensing information detected by the sensors 510 and 520. For example, when the sensor hub 500 receives temperature information from the temperature sensor, the sensor hub 500 may generate a numerical image of the temperature value and an image of the temperature unit as image data to be output through the display panel 200.
  • the sensor hub 500 may transmit, for example, the data addresses on the graphics RAM 110 of the numerical image of the temperature value and the image of the temperature unit to the display driving circuit 100 via the low speed serial interface as control information.
  • the display driving circuit 100 may output the numerical image of the temperature value and the image of the temperature unit to the display panel 200 according to the control information.
  • the touch controller IC 600 may generate image data to be output through the display panel 200 based on the touch sensing information detected by the touch panel 610. For example, when the touch controller IC 600 receives the touch gesture information from the touch panel 610, the touch controller IC 600 may generate an image corresponding to the touch gesture information as image data to be output through the display panel 200.
  • the touch controller IC 600 may transmit, for example, a data address on the graphic RAM 110 of the determined image data to the display driving circuit 100 through the low speed serial interface as control information.
  • the display driving circuit 100 may output an image corresponding to the touch gesture information to the display panel 200 according to the control information.
  • the host providing the image data and the control information is not limited to the aforementioned types of processors (eg, the AP 300, the CP 400, the sensor hub 500, or the touch controller IC 600).
  • the display driving circuit 100 may be set to receive image data and / or control information, for example, from a GPS module (not shown).
  • FIG. 3A is a block diagram of a display driving circuit according to an exemplary embodiment of the present invention.
  • a display driving circuit 100a may include a graphics RAM 110a, a control module 120a, an interface module 130a, an image processing unit 140a, a multiplexer (MUX) 150a, a display timing controller (T-con) 160a, a source driver 170a, and a gate driver 180a.
  • the display driving circuit 100a may further include an oscillator, a frame control module, a pixel power supply module, or the like.
  • the source driver 170a and the gate driver 180a may not be included in the display driving circuit 100a but may be coupled to a display panel. Meanwhile, duplicate descriptions with respect to FIG. 2 may be omitted.
  • the graphics RAM 110a may store image data received through the interface module 130a from a processor (eg, the AP 300, the CP 400, the sensor hub 500, and the touch controller IC 600).
  • the graphics RAM 110a may include a memory space corresponding to the resolution and / or color tone of the display panel 200a.
  • the control module 120a may select a part of the image data stored in the graphic RAM 110a and control the display timing controller 160a to output the selected part to a designated area of the display panel 200a.
  • the control module 120a may be referred to as control logic.
  • the control module 120a may be implemented by embedding a circuit (so-called self-display generator) for performing the display driving method of the present invention.
  • the interface module 130a may receive image data and / or control information from an external device (eg, the AP 300, the CP 400, the sensor hub 500, and the touch controller IC 600).
  • the interface module 130a may include an Rx-side high speed serial interface (HiSSI) 131a capable of receiving the image data, an Rx-side low speed serial interface 132a capable of receiving the control information, and an interface controller 133a for controlling the Rx-side high speed serial interface 131a and the Rx-side low speed serial interface 132a.
  • the image processing unit 140a may improve the image quality of the image data.
  • the image processing unit 140a may include a pixel data processing circuit, a pre-processing circuit, a gating circuit, and the like.
  • the multiplexer 150a may multiplex the signal output from the image processing unit 140a and the signal output from the control module 120a and transmit the multiplexed signal to the display timing controller 160a.
  • the display timing controller 160a may receive the image data multiplexed by the multiplexer 150a under the control of the control module 120a, and may generate a data control signal for controlling the operation timing of the source driver 170a and a gate control signal for controlling the operation timing of the gate driver 180a. According to an embodiment of the present disclosure, the display timing controller 160a may be included in the control module 120a.
  • the source driver 170a and the gate driver 180a may generate signals to be supplied to the data lines and scan lines (not shown) of the display panel 200a based on the data control signal and the gate control signal received from the display timing controller 160a, respectively.
  • FIG. 3B is a block diagram of an electronic device including a display driving circuit according to another exemplary embodiment of the present disclosure.
  • an electronic device may include a display driving circuit 100b, a display, and a processor 300b.
  • the display may include a source driver 170b, a gate driver 180b, and a display panel 200b.
  • the display driving circuit 100b may include a graphics RAM 110b, a control module 120b, an interface module 130b, an image processing unit 140b, a decoder 153b, an up-scaler 157b, and a display timing controller 160b.
  • the processor 300b (e.g., the AP 300, the CP 400, the sensor hub 500, or the touch controller IC 600 illustrated in FIG. 2) may include a display controller 310b, an encoder 320b, and a Tx-side high speed serial interface 330b.
  • the display controller 310b of the processor 300b may generate image data.
  • the image data may include image data obtained by concatenating a plurality of partial image data.
  • the plurality of partial image data may include partial image data of a first group, partial image data of a second group, or more.
  • the display controller 310b may generate one low resolution image data having a resolution lower than the resolution of the display panel 200b (e.g., 1/m of the resolution of the display panel 200b), or image data D1 in which two or more (e.g., m) low resolution image data are concatenated (see 1010 of FIG. 10).
  • the encoder 320b of the processor 300b may encode the image data generated by the display controller 310b in a designated manner (e.g., a DSC method defined by VESA). As a result, the image data generated by the display controller 310b may be compressed to reduce the data size (see D2: 1020 of FIG. 10). For example, the size of the image data generated by the display controller 310b may be reduced to 1/n by the encoding. According to various embodiments of the present disclosure, the encoder 320b may be omitted. That is, the image data may be transferred to the display driving circuit 100b without encoding or compression.
  • the processor 300b may transfer the image data encoded by the encoder 320b to the display driving circuit 100b through the Tx-side high speed serial interface 330b. In addition, the processor 300b may transmit control information for selecting or controlling an image to be output to the display panel 200b to the display driving circuit 100b through a Tx-side low speed serial interface (not shown).
  • the display driving circuit 100b may receive the encoded image data and the control information from the processor 300b through the interface module 130b.
  • the encoded image data may be received through an Rx-side high speed serial interface (HiSSI) 131b under the control of the interface controller 133b, and the control information may be received through an Rx-side low speed serial interface (LoSSI) 132b under the control of the interface controller 133b.
  • the graphics RAM 110b may store at least one encoded image data received through the Rx side high speed serial interface 131b. For example, if image data is compressed to 1 / n by the encoder 320b of the processor 300b, n encoded image data may be stored in the graphic RAM 110b (see D3: 1030 of FIG. 10). According to various embodiments of the present disclosure, the encoded image data may include at least one encoded low resolution image data (eg, two or more).
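  • A rough, hypothetical sizing example (the figures are assumptions, not taken from this document): for the WQHD (1440x2560) panel mentioned below at an assumed 24 bits per pixel, one uncompressed frame occupies 1440 x 2560 x 3 = 11,059,200 bytes, so a GRAM dimensioned for one uncompressed frame could hold about n = 3 frames compressed at an assumed 1/3 DSC ratio. The snippet below simply evaluates that arithmetic.

      #include <stdio.h>

      int main(void)
      {
          const unsigned long width = 1440, height = 2560, bpp = 3; /* bytes per pixel (assumed) */
          const unsigned long frame = width * height * bpp;         /* one uncompressed frame    */
          const unsigned long n = 3;                                /* assumed DSC ratio of 1/n  */

          printf("uncompressed frame : %lu bytes\n", frame);
          printf("compressed frame   : %lu bytes\n", frame / n);
          printf("frames fitting in a one-frame GRAM: %lu\n", n);
          return 0;
      }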
  • the control module 120b may select some of the image data stored in the graphic RAM 110b. For example, when image data stored in the graphic RAM 110b is encoded, the control module 120b may select some of the encoded image data (see D3: 1031 and 1032 of FIG. 10).
  • the control module 120b may select at least one of the plurality of partial image data.
  • for example, the control module 120b may select at least one partial image data of the first group and at least one partial image data of the second group.
  • the control module 120b may also select a part of one low resolution image data.
  • the interface module 130b may receive image data and control information from the processor 300b.
  • the interface module 130b may include an Rx-side high speed serial interface (HiSSI) 131b capable of receiving the image data, an Rx-side low speed serial interface (LoSSI) 132b capable of receiving the control information, and an interface controller 133b for controlling the Rx-side high speed serial interface 131b and the Rx-side low speed serial interface 132b.
  • the image processing unit 140b may improve the image quality of the image data.
  • the image processing unit 140b may include a pixel data processing circuit, a preprocessing circuit, a gating circuit, and the like.
  • the decoder 153b may decode the selected part in a specified manner and transfer the decoded data to the display timing controller 160b (see D4: 1041 and 1042 of FIG. 10). For example, if the size of the image data was compressed to 1/n by the encoder 320b of the processor 300b, the decoder 153b may decompress the selected portion and restore the image data before compression.
  • An upscaler 157b and / or an image processing unit 140b may be disposed between the decoder 153b and the display timing controller 160b. According to various embodiments of the present disclosure, when a part selected by the control module 120b is not encoded, the decoder 153b may be omitted or bypassed.
  • the up-scaler 157b may enlarge the image at a designated magnification.
  • the up-scaler 157b may enlarge the selected portion when the portion selected by the control module 120b is a low resolution image or needs to be enlarged according to a user setting (see D5: 1051 and 1052 of FIG. 10).
  • a portion selected by the control module 120b may be enlarged to a predetermined magnification (for example, m times).
  • the image data enlarged by the up-scaler 157b may be transmitted to the display timing controller 160b.
  • the image processing unit 140b may be disposed between the up-scaler 157b and the display timing controller 160b.
  • the up-scaler 157b may be omitted or bypassed.
  • the display timing controller 160b may convert the image data received from the graphics RAM 110b into an image signal through the decoder 153b, the up-scaler 157b, and/or the image processing unit 140b, and supply the converted image signal to the display (e.g., the source driver 170b and the gate driver 180b).
  • the display timing controller 160b may supply an image signal corresponding to a portion selected by the control module 120b to the display (eg, the source driver 170b and the gate driver 180b) under the control of the control module 120b.
  • the selected partial image data may be output to the designated area of the display panel 200b (see D6: 1060 of FIG. 10).
  • the display timing controller 160b may generate an image signal corresponding to the selected at least one partial image data and supply it to the display (e.g., the source driver 170b and the gate driver 180b).
  • when the control module 120b selects at least one of the partial image data of the first group stored in the graphics RAM 110b and at least one of the partial image data of the second group, the display timing controller 160b may generate an image signal corresponding to each selection and supply the image signals to the display (e.g., the source driver 170b and the gate driver 180b).
  • the display may include a source driver 170b, a gate driver 180b, and a display panel 200b.
  • the source driver 170b and the gate driver 180b may supply electrical signals to scan lines and data lines (not shown) of the display panel 200b based on the image signals received from the display timing controller 160b, respectively.
  • the display panel 200b may provide various images to the user based on electrical signals supplied from the source driver 170b and the gate driver 180b.
  • the display panel 200b may have a resolution of, for example, WQHD (1440x2560).
  • in the embodiment described above, the encoder 320b and the corresponding decoder 153b are included in the processor 300b and the display driving circuit 100b, respectively, and the display driving circuit 100b includes the up-scaler 157b.
  • however, according to various embodiments, the encoder 320b, the decoder 153b, and the up-scaler 157b may be omitted or implemented as part of the control module 120b.
  • FIG. 4 is a flowchart illustrating a display driving method according to an exemplary embodiment.
  • the display driving method may include operations 401 to 409.
  • reference numerals of FIG. 2 will be used.
  • the processor may generate image data.
  • the processor may generate image data in which a plurality of partial image data are concatenated.
  • the processor may store the image data generated in operation 401 in the graphics RAM 110.
  • when the image data includes a plurality of partial image data, each of the plurality of partial image data may be assigned a predetermined data address.
  • when the processor has stored the image data in the graphics RAM 110, the processor may enter a sleep mode. Thereafter, the processor may not intervene in the operation of the display driving circuit 100.
  • the display driving circuit 100 may select some of the image data stored in the graphic RAM 110. According to an embodiment, the display driving circuit 100 may select at least one partial image data from among a plurality of partial image data.
  • the display driving circuit 100 may output the partial image data selected in operation 407 to a designated area of the display panel 200. That is, the display driving circuit 100 may output part of the image data (for example, partial image data) to the designated area of the display panel 200 by the operation of the display driving circuit 100 itself.
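  • In purely illustrative software terms, the division of labour in operations 401 to 409 might be sketched as follows; the in-memory "GRAM", the sizes, and the function names are all invented for the sketch, and the real host-to-DDI transfer is the MIPI/SPI traffic described elsewhere in this document.

      #include <stdio.h>

      #define GRAM_BYTES   64            /* toy GRAM, stands in for the real one */
      #define PARTIAL_SIZE 16            /* toy partial-image size               */
      static unsigned char gram[GRAM_BYTES];

      /* Host (processor) side, operations 401-405: generate the image data,
       * store it in the GRAM, then enter the sleep mode. */
      static void host_store_image_data(void)
      {
          for (int i = 0; i < GRAM_BYTES; i++)
              gram[i] = (unsigned char)i;          /* pretend image data */
          /* operation 405: the processor sleeps from here on */
      }

      /* Display driving circuit side, operations 407-409: select one partial
       * image by itself and output it to the designated area. */
      static void ddi_output_partial(int index)
      {
          const unsigned char *p = gram + (size_t)index * PARTIAL_SIZE;
          printf("output partial image %d (first byte 0x%02x)\n", index, p[0]);
      }

      int main(void)
      {
          host_store_image_data();
          for (int frame = 0; frame < 4; frame++)  /* DDI refreshes on its own */
              ddi_output_partial(frame % (GRAM_BYTES / PARTIAL_SIZE));
          return 0;
      }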
  • control module 120 may dynamically output the selected image data in operation 409.
  • control module 120 may provide the corresponding image data to the user in a continuous operation by shifting the selected (partial) image data in units of blocks.
  • with the display driving circuit 100 according to various embodiments of the present disclosure, only some of the image data (partial image data) stored in the graphic RAM 110 may be selected and displayed. According to an embodiment of the present disclosure, the partial image data may be shifted and output at a predetermined clock period based on an oscillator embedded in the display driving circuit 100, thereby adding a dynamic effect to the output of the partial image data.
  • since the processor does not participate in the operation of the display driving circuit 100 after providing the image data, the electronic device 1000 and the processor may maintain the sleep mode. As a result, power consumption by the processor and the high speed serial interface 131 may be minimized.
  • in addition, since the partial image data may be dynamically output on the display panel 200, the afterimage effect caused by driving a specific pixel for a long time may be mitigated.
  • FIG. 5 illustrates that image data is provided in a specified order according to an embodiment of the present invention.
  • the display driving method may include operations 501 to 513.
  • the display driving method illustrated in FIG. 5 is described with reference to FIGS. 6A and 6B.
  • duplicate descriptions with respect to FIG. 4 may be omitted.
  • the processor may generate image data in which a plurality of partial image data are concatenated.
  • the processor of the smartphone 1000a may generate one image data 111a concatenated with N partial image data 111a-1, 111a-2, and 111a-N.
  • the processor of the smart watch 1000w may generate one image data 111w concatenated with M partial image data 111w-1, 111w-2, and 111w-M.
  • the processor may store the image data generated in operation 501 in the graphics RAM 110.
  • when the image data includes a plurality of partial image data, each of the plurality of partial image data may be assigned a predetermined data address.
  • the N partial image data 111a-1, 111a-2, and 111a-N may be assigned a data address on the graphics RAM 110a, respectively.
  • the N partial image data 111a-1, 111a-2, and 111a-N may be assigned data addresses at intervals of the size of the partial image data.
  • M partial image data 111w-1, 111w-2, and 111w-M may be assigned a data address on the graphics RAM 110w, respectively.
  • the M partial image data 111w-1, 111w-2, and 111w-M may be assigned data addresses at intervals of the size of the partial image data.
  • the electronic device 1000 and the processor may enter a sleep mode.
  • control module 120 of the display driving circuit 100 may select at least one partial image data among the image data stored in the graphics RAM 110.
  • the display driving circuit 100 may select the partial image data 111a-1 among the image data 111a stored in the graphic RAM 110a by using a data address and / or data size.
  • the display driving circuit 100 may select the partial image data 111w-1 from among the image data 111w stored in the graphics RAM 110w using a data address and a data size.
  • the display driving circuit 100 may output the partial image data selected in operation 507 to a designated area of the display panel 200. According to an embodiment, the display driving circuit 100 may dynamically output the selected image data.
  • the display driving circuit 100 may scan-read the partial image data 111a-1 selected in operation 507 and output the scanned image to the designated area 210a.
  • the smartphone 1000a illustrated in FIG. 6A may include a display panel 200a, and the display panel 200a may include a main display panel area 201a (a flat display panel area provided on the upper surface of the smartphone 1000a) and a sub display panel area 202a (a curved display panel area provided on the side of the smartphone 1000a).
  • the designated area 210a may correspond to at least a portion of the sub display panel area 202a.
  • the display driving circuit 100 may dynamically output the partial image data 111a-1 to the designated area 210a.
  • the information corresponding to the partial image data 111a-1 may be output while moving from the right side to the left side of the sub display area 202a.
  • the main display panel area 201a of the display panel 200a may be displayed in black.
  • when the display panel 200a is an OLED panel, the pixels of the main display panel area 201a may be turned off.
  • the display driving circuit 100 may output the partial image data 111w-1 to the designated area 210w of the display panel 200w.
  • the designated area 210w may be disposed in a lower area of the display panel 200w.
  • the display driving circuit 100 may dynamically output the partial image data 111w-1 to the designated area 210w.
  • the information corresponding to the partial image data 111w-1 (i.e., an icon representing a message and at least a part of the message content) may be output to the designated area 210w.
  • the remaining areas of the display panel 200w except for the designated area 210w may be displayed in black.
  • when the display panel 200w is an OLED panel, pixels of the remaining areas of the display panel 200w except for the designated area 210w may be turned off.
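  • the dynamic output described above (the selected information moving across the designated area while the rest of the panel stays black) can be pictured with the sketch below; the frame-composition function and its parameters are illustrative assumptions, not the patented implementation.

# Hypothetical sketch: each refresh, the partial image is placed one step
# further to the left inside the designated area; pixels outside that area
# stay 0 (black / turned off on an OLED panel).

def compose_frame(panel_w, panel_h, area, partial, offset_x):
    """Return a panel-sized frame with `partial` placed at offset_x inside `area`."""
    frame = [[0] * panel_w for _ in range(panel_h)]
    ax, ay, aw, ah = area
    for y, row in enumerate(partial):
        for x, value in enumerate(row):
            px, py = ax + offset_x + x, ay + y
            if ax <= px < ax + aw and ay <= py < ay + ah:
                frame[py][px] = value
    return frame

# Example: scroll a 4x2 partial image from right to left across a 12-wide area.
partial = [[1, 2, 3, 4], [5, 6, 7, 8]]
for offset in range(8, -1, -1):
    frame = compose_frame(16, 2, (0, 0, 12, 2), partial, offset)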
  • the electronic device 1000 may determine whether a user input for activating the processor is received. That is, the electronic device 1000 may determine whether a user input for switching from the sleep mode to the wakeup mode is received.
  • the user input may include pressing a home button, pressing a power button, or touching a touch panel.
  • when the user input is received, the electronic device may terminate the display driving method according to an embodiment of the present disclosure and switch to the wakeup mode. In contrast, when the user input is not received, the electronic device may proceed to operation 513.
  • the display driving circuit 100 of the electronic device may change the selected partial image data in a predetermined order.
  • the display driving circuit 100 may output the partial image data in the order of the partial image data 111a-1, 111a-2, ..., and 111a-N.
  • the display driving circuit 100 may output the partial image data 111a-1 to the designated area 210a after the partial image data 111a-N is output.
  • a plurality of partial image data 111a-1, 111a-2, ..., and 111a-N can be sequentially displayed in the designated area 210a, so that an effect similar to a news ticker scrolling in one direction can be achieved.
  • similarly, the display driving circuit 100 may output the partial image data in the order of the partial image data 111w-1, 111w-2, ..., and 111w-M.
  • the display driving circuit 100 may output the partial image data 111w-1 to the designated area 210w again.
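  • a minimal sketch of this cyclic selection (returning to the first partial image after the last one has been output) is shown below; the generator and the address values are illustrative assumptions.

import itertools

# Hypothetical sketch: cycle through the partial-image addresses in a
# designated order, restarting after the last one, which produces the
# ticker-like effect described above.

def ticker(partial_addresses):
    """Yield partial-image addresses in order, wrapping around indefinitely."""
    return itertools.cycle(partial_addresses)

addresses = [0x0000, 0x7F80, 0xFF00]      # assumed addresses of three partial images
selector = ticker(addresses)
for _ in range(7):                        # seven refresh periods
    next_address = next(selector)         # address of the partial image to output next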
  • the smartphone 1000a is described as operating in a landscape mode.
  • the smartphone 1000a may also operate in a portrait mode as in the smartphone 13 of FIG. 1A.
  • although the smart watch 1000w of FIG. 6B has been described as operating in a portrait mode, the smart watch 1000w may also operate in a landscape mode as in the smart watch 15 of FIG. 1B, and may also be implemented as a circular smart watch such as the smart watch 16 of FIG. 1B.
  • useful information may be provided to the user by the operation of the display driving circuit 100 itself.
  • the partial image data is sequentially changed and provided according to a designated order, the user may be effectively provided with useful information.
  • the partial image data is dynamically provided by the operation of the display driving circuit 100, an effect similar to a moving image may be achieved without driving the processor.
  • the afterimage effect that may occur when the operating pixel is fixed for a long time may be suppressed.
  • FIG. 7 illustrates that image data is provided based on control information according to an embodiment of the present invention.
  • the display driving method according to an exemplary embodiment may include operations 701 to 713.
  • the display driving method illustrated in FIG. 7 is described with reference to FIGS. 8A and 8B.
  • duplicate descriptions with respect to FIGS. 4 and 5 may be omitted.
  • the processor may generate image data in which a plurality of partial image data are concatenated.
  • the processor of the smartphone 1000c illustrated in FIG. 8A may generate image data 800.
  • the image data 800 may include a plurality of partial image data including partial image data 801 to 808.
  • the plurality of partial image data may include predefined numbers, texts, alphabets, weather symbols, other symbols, and the like.
  • the processor may store the image data generated in operation 701 in the graphics RAM 110.
  • the processor of the smartphone 1000c illustrated in FIG. 8A may store the image data 800 generated in operation 701 in the graphics RAM 110c.
  • the plurality of partial image data included in the image data 800 may be assigned a predetermined data address, respectively.
  • the electronic device (and a processor included in the electronic device) may enter a sleep mode.
  • the processor of the smartphone 1000c shown in FIG. 8A may enter a sleep mode.
  • the display driving circuit 100 may receive control information from an external device (eg, the AP 300, the CP 400, the sensor hub 500, and the touch controller IC 600).
  • the display driving circuit 100 may receive control information from various types of processors (e.g., AP 300, CP 400, sensor hub 500, and touch controller IC 600) disposed outside the display driving circuit 100.
  • in order for the display driving circuit 100 to receive control information from the various types of processors, at least a part of the processors may be temporarily switched to a wakeup mode to transmit the control information to the control module. At least a part of the processors may enter the sleep mode again after transmitting the control information.
  • the control information may include time information such as hour, minute, second, and AM/PM; calendar information such as year, month, day, and solar/lunar calendar; weather information such as weather conditions and temperature; call information such as the caller and the number of missed calls; message information such as the message sender and message content; pre-registered user schedule information; and other information available from various types of processors (e.g., the AP 300, the CP 400, the sensor hub 500, and the touch controller IC 600).
  • the various types of control information may include information about a data address and / or data size on the graphics RAM 110c.
  • control information is not limited to the above-described example.
  • the control information may include various information obtained from inside or outside of the electronic device (eg, the smartphone 1000c).
  • the display driving circuit 100 of the smartphone 1000c may receive control information related to a current time from the CP 400 and control information related to weather from the AP 300.
  • the display driving circuit 100 may receive control information related to temperature from the sensor hub 500, and receive control information related to touch gestures from the touch controller IC 600.
  • the control information may also be obtained from other electronic devices (e.g., a server, another smartphone, a wearable device, a wireless input device, etc.) electrically connected to the electronic device (e.g., the smartphone 1000c). That is, the device capable of providing the control information is not limited to a module mounted in the electronic device.
  • the display driving circuit 100 may select at least one partial image data stored in the graphics RAM 110c based on the control information received in operation 707 using a data address and / or data size.
  • the display driving circuit 100 may select partial image data 801 to 805 stored in the graphics RAM 110c according to control information related to the current time.
  • the display driving circuit 100 may select the partial image data 806 according to the control information related to the weather, and select the partial image data 807 and 808 according to the control information related to the temperature.
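  • one way to picture the mapping from received control information to the partial images selected above is the sketch below; the control-information fields and the glyph/symbol table are hypothetical examples consistent with the description (clock digits, a weather symbol, temperature digits), not a defined format.

# Hypothetical sketch: map control information (current time, weather,
# temperature) to the graphics-RAM addresses of pre-stored partial images
# (digit glyphs, a colon, and weather symbols), as in the selection of
# partial image data 801 to 808.

GLYPHS = {str(d): 0x100 * d for d in range(10)}     # digit glyphs at assumed addresses
GLYPHS[":"] = 0xA00
WEATHER = {"sunny": 0xB00, "cloudy": 0xB80, "rain": 0xC00}

def select_partials(hour, minute, weather, temperature):
    """Return the partial-image addresses to output for the given control information."""
    time_text = f"{hour:02d}:{minute:02d}"
    selected = [GLYPHS[ch] for ch in time_text]          # e.g., 801 to 805
    selected.append(WEATHER[weather])                    # e.g., 806
    selected += [GLYPHS[d] for d in str(temperature)]    # e.g., 807 to 808
    return selected

print(select_partials(hour=9, minute=41, weather="cloudy", temperature=23))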
  • the display driving circuit 100 may output the partial image data selected in operation 709 to a designated area of the display panel 200.
  • the display driving circuit 100 of the smartphone 1000c may output partial image data 801 to 808 selected in operation 709 to a designated area of the display panel 200c.
  • the display panel 200c may include a main display panel area 201c and a curved display panel area 202c.
  • the partial image 211 to 218 corresponding to the selected partial image data 801 to 808 may be output to the curved display panel region 202c, respectively.
  • the partial image data may be dynamically output.
  • the electronic device 1000 may determine whether a user input for activating the processor is received. That is, the electronic device 1000 may determine whether a user input for switching from the sleep mode to the wakeup mode is received. When the user input is received, the electronic device may terminate the display driving method according to an embodiment of the present disclosure and switch to the wakeup mode. In contrast, when the user input is not received, the electronic device returns to operation 707 and repeats operations 707 to 711.
  • the display driving method illustrated in FIG. 7 may be applied to a smart watch.
  • the display driving circuit 100 of the smart watch 1000d may receive predetermined control information from an AP 300, a CP 400, a sensor hub 500, or a touch controller IC 600 built in the smart watch 1000d, and may output partial image data corresponding to the control information to a designated area of the display panel 200d.
  • the display panel 200d of the smart watch 1000d may output an image regarding a missed call, an incoming message, a user's schedule, weather, temperature, current time, date, and the like.
  • useful information may be provided to the user by the operation of the display driving circuit 100 itself. Since the partial image data representing the useful information may be selected based on control information received from other modules (e.g., AP 300, CP 400, sensor hub 500, and touch controller IC 600), the electronic device 1000 may more actively provide useful information to the user.
  • the display driving method may include operations 901 to 919.
  • in the description of each operation of FIG. 9, reference is made to FIG. 10 as needed, and the reference numerals of FIG. 3B are used to describe FIG. 9.
  • the processor 300b may generate image data obtained by concatenating a plurality of partial image data.
  • the image data 1010 generated by the processor 300b may include low-resolution image data 1011 to 1014 having a resolution lower than the resolution of the display panel 200b (WQHD, 1440x2560).
  • the low resolution image data 1011 to 1014 may each have a resolution of HD (720x1280).
  • the image data 1011 may include partial image data 1011-1 of a first group including a digital clock image, and partial image data 1011-2 of a second group for animation.
  • similarly, each of the image data 1012, 1013, and 1014 may include partial image data 1012-1, 1013-1, and 1014-1 of the first group including a digital clock image, and partial image data 1012-2, 1013-2, and 1014-2 of the second group for animation.
  • the processor 300b may encode the image data generated in operation 901 by a specified method (e.g., a DSC method defined by VESA). For example, referring to FIG. 10, the processor 300b may generate encoded image data 1020 by encoding the image data 1010. The data size of the image data 1020 encoded by the processor 300b may be reduced to 1/4 of that of the image data 1010.
  • the processor 300b may store the image data encoded in operation 903 in the graphic RAM 110b of the display driving circuit 100b.
  • the processor may store image data 1020 encoded in operation 903 in the graphics RAM 110b of the display driving circuit 100b.
  • four pieces of encoded image data, including the encoded image data 1020, may be stored in the data space 1030 of the graphics RAM 110b.
  • each of the four pieces of encoded image data in the data space 1030, and the partial image data included therein, may be assigned a predetermined data address.
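  • assuming (for illustration only) that each encoded image occupies a contiguous region of the data space 1030 and that its first-group and second-group parts keep fixed offsets within that region, the address bookkeeping might look like the following sketch; the sizes, the 4:1 ratio, and the function name are assumptions.

# Hypothetical sketch of the data space 1030: four encoded low-resolution
# images stored back to back, each holding a first-group (clock) part and a
# second-group (animation) part at fixed offsets.

HD_FRAME_BYTES = 720 * 1280 * 3                 # one HD frame, 3 bytes per pixel (assumed)
ENCODED_FRAME_BYTES = HD_FRAME_BYTES // 4       # assumed ~4:1 DSC-style reduction

def encoded_image_regions(base, count, first_group_bytes):
    """Return (image_start, first_group_start, second_group_start) for each image."""
    regions = []
    for k in range(count):
        start = base + k * ENCODED_FRAME_BYTES
        regions.append((start, start, start + first_group_bytes))
    return regions

for start, first, second in encoded_image_regions(base=0, count=4, first_group_bytes=64 * 1024):
    print(hex(start), hex(first), hex(second))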
  • the processor 300b may enter a sleep mode. That is, after storing the encoded image data 1020 in the graphics RAM 110b of the display driving circuit 100b, the processor 300b may not participate in the operation of the display driving circuit 100b, except for the transmission of control information for selecting a portion of the image data.
  • the display driving circuit 100b may select a part of the encoded image data stored in the graphic RAM 110b.
  • the display driving circuit 100b may select a part of the encoded image data stored in the graphics RAM 110b based on the control information received from the processor 300b or the designated order.
  • the display driving circuit 100b may select portions 1031 and 1032 of encoded image data stored in the data space 1030 of the graphics RAM 110b.
  • the selected portion 1031 may correspond to data obtained by encoding the partial image data 1011-1 of the first group, and the selected portion 1032 may correspond to data obtained by encoding any one of the partial image data 1011-2 of the second group.
  • the display driving circuit 100b may decode a portion of the image data selected in operation 909. For example, referring to FIG. 10, the display driving circuit 100b may decode portions 1031 and 1032 of the image data selected in operation 909, respectively.
  • the partial image data 1041 and the partial image data 1042 may be generated by the decoding.
  • the display driving circuit 100b may enlarge the image data decoded in operation 911 to a predetermined magnification. For example, referring to FIG. 10, the display driving circuit 100b may enlarge the partial image data 1041 and the partial image data 1042 decoded in operation 911 by four times (two times horizontally and two times vertically), respectively. Through the enlargement, the partial image data 1051 and the partial image data 1052 may be generated.
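  • the four-fold enlargement (two times horizontally and two times vertically) can be illustrated with a simple nearest-neighbor upscale; this sketch is only one way such scaling could be realized and is not taken from the disclosure.

# Hypothetical sketch of operation 913: every decoded pixel is replicated
# into a 2x2 block, enlarging the image 2x horizontally and 2x vertically.

def upscale_2x(pixels):
    """Return the 2D pixel list enlarged 2x in both directions (nearest neighbor)."""
    out = []
    for row in pixels:
        wide = [p for p in row for _ in (0, 1)]   # duplicate each pixel horizontally
        out.append(wide)
        out.append(list(wide))                    # duplicate the row vertically
    return out

decoded = [[1, 2],
           [3, 4]]
print(upscale_2x(decoded))   # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]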
  • the display driving circuit 100b may output the enlarged image data in operation 913 to a designated area on the display panel 200b.
  • the display driving circuit 100b may output the partial image data 1051 enlarged in operation 913 to the first region 1061 of the output area 1060 of the display panel 200b, and may output the enlarged partial image data 1052 to the second region 1062 of the output area 1060 of the display panel 200b.
  • the processor 300b may determine whether a user input for activating the processor 300b is received. That is, the electronic device 1000 may determine whether a user input for switching from the sleep mode to the wakeup mode is received. When the user input is received, the electronic device may terminate the display driving method according to an embodiment of the present disclosure and switch to the wakeup mode. When switching to the wakeup mode, the processor 300b may output, for example, a lock screen or a home screen to the display panel 200b. On the other hand, when the user input is not received, the processor 300b may perform operation 919.
  • the display driving circuit 100b may select the next partial image data. For example, the display driving circuit 100b may select the next partial image data in a predetermined order, based on control information received from the processor 300b, or in a random order. Since operation 919 may be performed at a designated period (e.g., a period set by a user), a constant animation effect may be achieved on the display panel 200b.
  • since the encoded image data is stored in the graphics RAM 110b, n times more image data (4 times in FIG. 10) may be stored in the graphics RAM 110b than in the case in which the encoding is not performed.
  • in addition, the processor 300b may combine (or concatenate) m pieces of image data each having a resolution of 1/m of the resolution of the display panel 200b. Accordingly, compared to the case in which an image having the same resolution as the display panel 200b is generated (e.g., FIGS. 6A and 8A), m times more images (4 times in FIG. 10) can be stored in the graphics RAM 110b.
  • consequently, the image storage density of the graphics RAM 110b may be increased by 16 times (4 times * 4 times) compared with the cases of FIGS. 6A and 8A. Since the graphics RAM 110b may store 16 times as many images at a time, the processor 300b may maintain the sleep mode about 16 times as long as in the cases of FIGS. 6A and 8A. As a result, the frequency at which the processor 300b needs to switch to the wakeup mode in order to write image data into the graphics RAM 110b may be reduced, and battery power consumption may be further reduced.
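  • the 16-fold figure follows from multiplying the two independent savings, as the short calculation below makes explicit (the 4:1 encoding ratio is the assumed DSC-style ratio used in FIG. 10).

# Worked example of the storage-density estimate: ~4x from encoding and 4x
# from storing quarter-resolution (HD) images instead of WQHD images.

wqhd_pixels = 1440 * 2560            # pixels per full-resolution frame
hd_pixels = 720 * 1280               # pixels per quarter-resolution frame
encoding_ratio = 4                   # assumed compression ratio

resolution_gain = wqhd_pixels // hd_pixels        # = 4
total_gain = encoding_ratio * resolution_gain     # = 16
print(resolution_gain, total_gain)                # prints: 4 16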
  • in FIG. 10, the image data 1010 includes low-resolution image data 1011 to 1014 having a resolution lower than the resolution of the display panel 200b (WQHD, 1440x2560), but the present disclosure is not limited thereto.
  • the image data 1010 may have a resolution corresponding to the resolution of the display panel 200b as illustrated in FIG. 6A or 8A.
  • for example, the image data 1011 including (or concatenating) the partial image data 1011-1 of the first group and the partial image data 1011-2 of the second group may be generated to have the same resolution as the display panel 200b (WQHD, 1440x2560). In this case, the image enlargement (operation 913) by the display driving circuit 100b may be omitted.
  • the image data 1010 is described to be encoded by the processor 300b, but is not limited thereto.
  • the encoding (operation 903) and the corresponding decoding (operation 911) may be omitted.
  • the image data 1010 may be stored in the graphics RAM 110b without being encoded.
  • FIG. 11 illustrates an electronic device according to an embodiment of the present disclosure.
  • an electronic device 1101 may include a bus 1110, a processor 1120, a memory 1130, an input / output interface 1150, a display 1160, and a communication interface 1170. According to an embodiment of the present disclosure, the electronic device 1101 may omit at least one of the components or additionally include other components.
  • the bus 1110 may include, for example, circuitry that connects the components 1110-1170 to each other and delivers communication (eg, control messages and / or data) between the components.
  • the processor 1120 may be one or more of a central processing unit (CPU), an application processor (AP) (eg, AP 300 of FIG. 2), or a communication processor (CP) (eg, CP 400 of FIG. 2). It may contain the above.
  • the processor 1120 may execute an operation or data processing related to control and / or communication of at least one other element of the electronic device 1101.
  • the memory 1130 may include volatile and / or nonvolatile memory.
  • the memory 1130 may store, for example, commands or data related to at least one other element of the electronic device 1101.
  • the memory 1130 may store software and / or a program 1140.
  • the program 1140 may include, for example, a kernel 1141, middleware 1143, an application programming interface (API) 1145, and / or an application program (or “application”) 1147. At least a portion of the kernel 1141, middleware 1143, or API 1145 may be referred to as an operating system (OS).
  • the input / output interface 1150 may serve as, for example, an interface capable of transferring a command or data input from a user or another external device to other component (s) of the electronic device 1101.
  • the display 1160 may display, for example, various types of content (eg, text, images, videos, icons, or symbols) to the user.
  • the display 1160 may include, for example, the display driving circuit 100, the display panel 200, the touch controller IC 600, and the touch panel 610 illustrated in FIG. 2.
  • the display 1160 may receive a touch, gesture, proximity, or hovering input using, for example, an electronic pen or a part of a user's body.
  • the communication interface 1170 may establish communication between the electronic device 1101 and an external device (eg, the first external electronic device 1102, the second external electronic device 1104, or the server 1106).
  • the communication interface 1170 may be connected to the network 1162 through wireless or wired communication to communicate with an external device (eg, the second external electronic device 1104 or the server 1106).
  • the communication interface 1170 may communicate with an external device (eg, the first external electronic device 1102) through the local area network 1164.
  • Each of the first and second external electronic devices 1102 and 1104 may be the same or different type of device as the electronic device 1101.
  • the server 1106 may include a group of one or more servers.
  • FIG. 12 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 1201 may include, for example, all or part of the electronic device illustrated in FIGS. 2 and 11.
  • the electronic device 1201 may include one or more processors (e.g., the AP 300 and the CP 400 of FIG. 2) 1210, a communication module 1220, a subscriber identification module 1224, a memory 1230, a sensor module 1240, an input device 1250, a display 1260, an interface 1270, an audio module 1280, a camera module 1291, a power management module 1295, a battery 1296, an indicator 1297, and a motor 1298.
  • the processor 1210 may control a plurality of hardware or software components connected to the processor 1210 by, for example, running an operating system or an application program, and may perform various kinds of data processing and operations.
  • the processor 1210 may be implemented with, for example, a system on chip (SoC).
  • the processor 1210 may further include a graphic processing unit (GPU) and / or an image signal processor.
  • the processor 1210 may load instructions or data received from at least one of the other components (e.g., nonvolatile memory) into volatile memory (e.g., the graphics RAM 110 of FIG. 2), process them, and store the resulting data in nonvolatile memory.
  • the communication module 1220 may have a configuration that is the same as or similar to that of the communication interface 1170 of FIG. 11.
  • the communication module 1220 may include, for example, a cellular module 1221, a Wi-Fi module 1223, a Bluetooth module 1225, a GNSS module 1227 (e.g., a GPS module, a Glonass module, a Beidou module, or a Galileo module), an NFC module 1228, and an RF (radio frequency) module 1229.
  • the cellular module 1221 may provide, for example, a voice call, a video call, a text service, or an internet service through a communication network. According to an embodiment of the present disclosure, the cellular module 1221 may perform at least some of the functions that the processor 1210 may provide. According to an embodiment of the present disclosure, the cellular module 1221 may include a communication processor (CP) (eg, CP 400 of FIG. 2).
  • Each of the Wi-Fi module 1223, the Bluetooth module 1225, the GNSS module 1227, or the NFC module 1228 may include, for example, a processor for processing data transmitted and received through a corresponding module.
  • at least some (e.g., two or more) of the cellular module 1221, the Wi-Fi module 1223, the Bluetooth module 1225, the GNSS module 1227, or the NFC module 1228 may be included in one integrated chip (IC) or IC package.
  • the subscriber identification module 1224 may include, for example, a card including a subscriber identification module and/or an embedded SIM, and may contain unique identification information (e.g., an integrated circuit card identifier (ICCID)) or subscriber information (e.g., an international mobile subscriber identity (IMSI)).
  • the memory 1230 may include, for example, an internal memory 1232 or an external memory 1234.
  • the external memory 1234 may be functionally and / or physically connected to the electronic device 1201 through various interfaces.
  • the sensor module 1240 may measure a physical quantity or detect an operation state of the electronic device 1201 to convert the measured or detected information into an electrical signal.
  • the sensor module 1240 (e.g., the sensors 510 and 520 of FIG. 2) may include at least one of, for example, a gesture sensor 1240A, a gyro sensor 1240B, an air pressure sensor 1240C, a magnetic sensor 1240D, an acceleration sensor 1240E, a grip sensor 1240F, a proximity sensor 1240G, a color sensor 1240H (e.g., an RGB (red, green, blue) sensor), a biometric sensor 1240I, a temperature/humidity sensor 1240J, an illumination sensor 1240K, or a UV (ultraviolet) sensor.
  • the sensor module 1240 may further include a control circuit (eg, the sensor hub 500 of FIG. 2) for controlling at least one or more sensors belonging thereto.
  • the electronic device 1201 may further include a processor configured to control the sensor module 1240, as a part of or separately from the processor 1210, so that the sensor module 1240 can be controlled while the processor 1210 is in a sleep state.
  • the input device 1250 may include, for example, a touch panel 1252 (e.g., the touch panel 610 of FIG. 2), a (digital) pen sensor 1254, a key 1256, or an ultrasonic input device 1258.
  • the touch panel 1252 may use at least one of capacitive, resistive, infrared, or ultrasonic methods, for example.
  • the touch panel 1252 may further include a control circuit (eg, the touch controller IC 600 of FIG. 2).
  • the touch panel 1252 may further include a tactile layer to provide a tactile response to the user.
  • the display 1260 may include a panel 1262 (eg, the display panel 200 of FIG. 2), the hologram device 1264, or the projector 1266.
  • the panel 1262 (eg, the display panel 200 of FIG. 2) may include a configuration that is the same as or similar to that of the display 1160 of FIG. 11.
  • the panel 1262 may be implemented to be, for example, flexible, transparent, or wearable.
  • the panel 1262 (eg, the display panel 200 of FIG. 2) may be configured as a single module together with the touch panel 1252 (eg, the touch panel 610 of FIG. 2).
  • the hologram device 1264 may show a stereoscopic image in the air by using interference of light.
  • the projector 1266 may display an image by projecting light onto a screen.
  • the screen may be located inside or outside the electronic device 1201.
  • the display 1260 may further include a control circuit (eg, the display driving circuit 100 of FIG. 2 or 3) for controlling the panel 1262, the hologram device 1264, or the projector 1266.
  • the interface 1270 may include, for example, a high-definition multimedia interface (HDMI) 1272, a universal serial bus (USB) 1274, an optical interface 1276, or a D-subminiature 1278.
  • the interface 1270 may be included in, for example, the communication interface 1170 illustrated in FIG. 11.
  • the audio module 1280 may bidirectionally convert, for example, a sound and an electrical signal. At least some components of the audio module 1280 may be included in, for example, the input / output interface 1150 illustrated in FIG. 11.
  • the audio module 1280 may process sound information input or output through, for example, a speaker 1282, a receiver 1284, an earphone 1286, a microphone 1288, or the like.
  • the camera module 1291 is, for example, a device capable of capturing still images and moving images.
  • the camera module 1291 may include one or more image sensors (eg, a front sensor or a rear sensor), a lens, an image signal processor (ISP), or Flash (eg, LED or xenon lamp, etc.).
  • the power management module 1295 may manage, for example, power of the electronic device 1201.
  • the power management module 1295 may include, for example, a power management integrated circuit (PMIC), a charger integrated circuit (charger IC), or a battery or fuel gauge.
  • the battery 1296 may include, for example, a rechargeable battery and / or a solar battery.
  • the indicator 1297 may display a specific state of the electronic device 1201 or a portion thereof (for example, the processor 1210), for example, a booting state, a message state, or a charging state.
  • the motor 1298 may convert an electrical signal into mechanical vibration, and may generate a vibration or haptic effect.
  • the electronic device 1201 may include a processing device (eg, a GPU) for supporting mobile TV.
  • the processing device for supporting mobile TV may process media data according to a standard such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), or mediaFlo (TM).
  • each of the components described in this document may be composed of one or more components, and the names of the corresponding components may vary depending on the type of electronic device.
  • the electronic device may be configured to include at least one of the components described in this document, and some components may be omitted or further include other additional components.
  • some of the components of the electronic device according to various embodiments of the present disclosure may be combined to form one entity, and thus may perform the same functions of the corresponding components before being combined.
  • module may refer to a unit that includes one or a combination of two or more of hardware, software, or firmware.
  • a “module” may be interchangeably used with terms such as, for example, unit, logic, logical block, component, or circuit.
  • the module may be a minimum unit or part of an integrally constructed part.
  • the module may be a minimum unit or part of performing one or more functions.
  • the “module” can be implemented mechanically or electronically.
  • a “module” may include at least one of an application-specific integrated circuit (ASIC) chip, field-programmable gate arrays (FPGAs), or programmable-logic devices that perform certain operations, known or to be developed in the future.
  • at least a portion of an apparatus (e.g., modules or functions thereof) or a method (e.g., operations) according to various embodiments may be implemented as instructions stored in computer-readable storage media in the form of a program module.
  • when the instruction is executed by a processor (e.g., the AP 300 of FIG. 2 or the processor 1120 of FIG. 11), the one or more processors may perform a function corresponding to the instruction.
  • the computer-readable storage medium may be, for example, the memory 1230.
  • computer-readable recording media include hard disks, floppy disks, magnetic media (e.g., magnetic tape), optical media (e.g., compact disc read-only memory (CD-ROM) and digital versatile discs (DVDs)), magneto-optical media (e.g., floptical disks), and hardware devices (e.g., read-only memory (ROM), random access memory (RAM), or flash memory).
  • the program instructions may include not only machine code generated by a compiler, but also high-level language code executable by a computer using an interpreter, etc.
  • the hardware devices described above may be configured to operate as one or more software modules in order to perform the operations of the various embodiments, and vice versa.
  • modules or program modules according to various embodiments of the present disclosure may include at least one of the above components, some of them may be omitted, or additional components may be further included. Operations performed by a module, program module, or other component according to various embodiments of the present disclosure may be executed sequentially, in parallel, repetitively, or heuristically. In addition, some operations may be executed in a different order or omitted, or other operations may be added. The embodiments disclosed in this document are presented for the purpose of explanation and understanding of the disclosed technical content, and do not limit the scope of the technology described in this document. Accordingly, the scope of this document should be construed as including all changes or various other embodiments based on the technical spirit of this document.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

According to one embodiment, the invention relates to an electronic device that may include a display, a processor for generating image data, a graphics RAM (GRAM) for storing the image data, and a display driving circuit for driving the display. The display driving circuit may be configured to select a part of the image data and output the selected part to a predetermined area of the display.
PCT/KR2016/007658 2015-07-14 2016-07-14 Circuit d'attaque d'affichage, procédé d'attaque d'affichage et dispositif électronique WO2017010822A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP16824739.3A EP3324388B1 (fr) 2015-07-14 2016-07-14 Circuit d'attaque d'affichage, procédé d'attaque d'affichage et dispositif électronique
CN201680041421.6A CN107851415B (zh) 2015-07-14 2016-07-14 显示驱动集成电路、显示驱动方法和电子设备
US15/743,899 US10672097B2 (en) 2015-07-14 2016-07-14 Display driving circuit and method of partial image data
US16/889,179 US11017496B2 (en) 2015-07-14 2020-06-01 Display driving circuit and method of partial image data

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2015-0099645 2015-07-14
KR20150099645 2015-07-14
KR10-2016-0089039 2016-07-14
KR1020160089039A KR102576961B1 (ko) 2015-07-14 2016-07-14 디스플레이 구동 회로, 디스플레이 구동 방법, 및 전자 장치

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US15/743,899 A-371-Of-International US10672097B2 (en) 2015-07-14 2016-07-14 Display driving circuit and method of partial image data
US16/889,179 Continuation US11017496B2 (en) 2015-07-14 2020-06-01 Display driving circuit and method of partial image data

Publications (1)

Publication Number Publication Date
WO2017010822A1 true WO2017010822A1 (fr) 2017-01-19

Family

ID=57757470

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/007658 WO2017010822A1 (fr) 2015-07-14 2016-07-14 Circuit d'attaque d'affichage, procédé d'attaque d'affichage et dispositif électronique

Country Status (1)

Country Link
WO (1) WO2017010822A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108519807A (zh) * 2018-03-23 2018-09-11 维沃移动通信有限公司 一种应用处理器及移动终端
EP3425620A1 (fr) * 2017-07-04 2019-01-09 Beijing Xiaomi Mobile Software Co., Ltd. Procédé et appareil d'affichage continu et support d'informations lisible sur ordinateur
CN111512357A (zh) * 2017-12-20 2020-08-07 三星电子株式会社 基于显示驱动电路中存储的坐标信息移动内容显示位置的电子装置和方法
CN111937063A (zh) * 2017-12-20 2020-11-13 三星电子株式会社 用于控制信号的输出定时的电子设备和方法
WO2023043518A1 (fr) * 2021-09-16 2023-03-23 Apple Inc. Générateur de signal d'affichage permanent

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050215274A1 (en) * 2004-03-26 2005-09-29 Broadcom Corporation MAC controlled sleep mode/wake-up mode with staged wake-up for power management
US20100110106A1 (en) * 1998-11-09 2010-05-06 Macinnis Alexander G Video and graphics system with parallel processing of graphics windows
US20140071477A1 (en) * 2012-09-10 2014-03-13 Canon Kabushiki Kaisha Image forming apparatus and control method thereof
EP2717567A1 (fr) * 2012-10-04 2014-04-09 BlackBerry Limited Sélection de résolutions vidéo basée sur une comparaison dans un appel vidéo
US20150042572A1 (en) * 2012-10-30 2015-02-12 Motorola Mobility Llc Method and apparatus for user interaction data storage

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100110106A1 (en) * 1998-11-09 2010-05-06 Macinnis Alexander G Video and graphics system with parallel processing of graphics windows
US20050215274A1 (en) * 2004-03-26 2005-09-29 Broadcom Corporation MAC controlled sleep mode/wake-up mode with staged wake-up for power management
US20140071477A1 (en) * 2012-09-10 2014-03-13 Canon Kabushiki Kaisha Image forming apparatus and control method thereof
EP2717567A1 (fr) * 2012-10-04 2014-04-09 BlackBerry Limited Sélection de résolutions vidéo basée sur une comparaison dans un appel vidéo
US20150042572A1 (en) * 2012-10-30 2015-02-12 Motorola Mobility Llc Method and apparatus for user interaction data storage

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3324388A4 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3425620A1 (fr) * 2017-07-04 2019-01-09 Beijing Xiaomi Mobile Software Co., Ltd. Procédé et appareil d'affichage continu et support d'informations lisible sur ordinateur
US10698442B2 (en) 2017-07-04 2020-06-30 Beijing Xiaomi Mobile Software Co., Ltd. Method and apparatus for always-on display applied in a display driver integrated circuit
CN111512357A (zh) * 2017-12-20 2020-08-07 三星电子株式会社 基于显示驱动电路中存储的坐标信息移动内容显示位置的电子装置和方法
CN111937063A (zh) * 2017-12-20 2020-11-13 三星电子株式会社 用于控制信号的输出定时的电子设备和方法
CN111937063B (zh) * 2017-12-20 2023-09-29 三星电子株式会社 用于控制信号的输出定时的电子设备和方法
CN111512357B (zh) * 2017-12-20 2024-06-14 三星电子株式会社 包括显示驱动电路的电子装置
CN108519807A (zh) * 2018-03-23 2018-09-11 维沃移动通信有限公司 一种应用处理器及移动终端
US11860708B2 (en) 2018-03-23 2024-01-02 Vivo Mobile Communication Co., Ltd. Application processor and mobile terminal
WO2023043518A1 (fr) * 2021-09-16 2023-03-23 Apple Inc. Générateur de signal d'affichage permanent
US11893925B2 (en) 2021-09-16 2024-02-06 Apple Inc. Always-on display signal generator

Similar Documents

Publication Publication Date Title
AU2017254304B2 (en) Display driving integrated circuit and electronic device having the same
WO2018182287A1 (fr) Procédé de commande à faible puissance d'un dispositif d'affichage et dispositif électronique pour la mise en oeuvre de ce procédé
WO2018169311A1 (fr) Procédé de fonctionnement utilisant une tension gamma correspondant à une configuration d'affichage et dispositif électronique prenant en charge ledit procédé
WO2017078366A1 (fr) Dispositif électronique comprenant une pluralité de dispositifs d'affichage et son procédé de fonctionnement
AU2017266815B2 (en) Operating method for display corresponding to luminance, driving circuit, and electronic device supporting the same
AU2015350680B2 (en) Power control method and apparatus for reducing power consumption
WO2018044071A1 (fr) Procédé de traitement d'image et dispositif électronique le prenant en charge
WO2016137187A1 (fr) Appareil et procédé permettant de fournir un service de duplication d'écran
WO2018182296A1 (fr) Dispositif électronique et procédé de partage d'ecran de dispositif électronique
WO2018151505A1 (fr) Dispositif électronique et procédé d'affichage de son écran
WO2016175480A1 (fr) Dispositif électronique, dispositif adaptateur et leur procédé de traitement de données vidéo
WO2017116024A1 (fr) Dispositif électronique ayant un dispositif d'affichage souple, et procédé de fonctionnement du dispositif électronique
WO2017082685A1 (fr) Procédé de commande d'affichage, panneau d'affichage le mettant en œuvre, dispositif d'affichage et dispositif électronique
WO2018143624A1 (fr) Procédé de commande d'affichage, support de mémoire et dispositif électronique
WO2017155326A1 (fr) Dispositif électronique et procédé de commande d'affichage correspondant
WO2017010822A1 (fr) Circuit d'attaque d'affichage, procédé d'attaque d'affichage et dispositif électronique
WO2018004238A1 (fr) Appareil et procédé de traitement d'image
WO2017026709A1 (fr) Procédé et dispositif d'ajustement de la résolution d'un dispositif électronique
WO2016006851A1 (fr) Dispositif électronique, procédé de fourniture d'une interface dudit dispositif, et accessoire pour ledit dispositif
WO2016114609A1 (fr) Dispositif électronique et procédé de traitement d'informations dans un dispositif électronique
WO2017022971A1 (fr) Procédé et appareil d'affichage pour dispositif électronique
WO2018038482A1 (fr) Dispositif électronique comprenant une pluralité de dispositifs d'affichage tactiles et procédé de changement d'état de celui-ci
WO2018044052A1 (fr) Procédé d'affichage d'informations temporelles et dispositif électronique prenant en charge ledit procédé d'affichage d'informations temporelles
WO2015102464A1 (fr) Dispositif électronique et procédé d'affichage d'événement en mode de réalité virtuelle
WO2018044051A1 (fr) Procédé de pilotage d'affichage comprenant une zone d'affichage incurvée, circuit de commande d'affichage supportant celui-ci, et dispositif électronique comprenant celui-ci

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16824739

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15743899

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE