WO2017030380A1 - Digital device and method of processing data therein

Digital device and method of processing data therein

Info

Publication number
WO2017030380A1
Authority
WO
WIPO (PCT)
Prior art keywords
image frame
data
tiles
tile
rendered
Application number
PCT/KR2016/009075
Other languages
English (en)
Inventor
Jinhong Park
Sunho KI
Minkyu Kim
Youngduke SEO
Original Assignee
LG Electronics Inc.
Priority claimed from KR1020150117162A (KR20170022333A)
Priority claimed from KR1020150117163A (KR20170022334A)
Application filed by LG Electronics Inc.
Publication of WO2017030380A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/005: General purpose rendering architectures
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/4104: Peripherals receiving signals from specially adapted client devices
    • H04N21/4126: The peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/41265: The peripheral being portable, e.g. PDAs or mobile phones, having a remote control device for bidirectional communication between the remote control device and client device
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434: Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440263: Reformatting by altering the spatial resolution, e.g. for displaying on a connected PDA

Definitions

  • the present invention relates to a digital device, and more particularly, to a digital device and method of processing data therein.
  • embodiments of the present invention are directed to a digital device and method of processing data therein that substantially obviate one or more problems due to limitations and disadvantages of the related art.
  • One object of the present invention is to provide a digital device and method of processing data therein, by which QoS (quality of service) of application data and the like can be maintained or improved without increasing the hardware capacity of, or adding components to, the processing unit that processes the application data or the link device(s) related to that processing unit.
  • Another object of the present invention is to provide a digital device and method of processing data therein, by which performance such as QoS can be maintained or improved while the overall device environment (e.g., hardware capacity, power, temperature, etc.) is also improved by processing application data in software.
  • A further object of the present invention is to provide a digital device and method of processing data therein, by which the cost burden of additional hardware can be minimized by maintaining or improving application-data processing performance in software, without hardware capacity improvement or added components, and by which a user's desire to purchase the device can be increased by improving the user's satisfaction with it.
  • a digital device may include a receiving unit configured to receive application data, a controller configured to render tiles of a first image frame of the received application data, render some tiles among all tiles of a second image frame of the received application data, and generate an interpolated image frame using a synthesis based on both the rendered tiles of the first image frame and the rendered some tiles among all tiles of the second image frame, and a display configured to display the first image frame, the generated interpolated image frame and the second image frame of the application data on a display screen.
  • a method of processing data in a digital device may include receiving application data, rendering tiles of a first image frame of the received application data, rendering some tiles among all tiles of a second image frame of the received application data, generating an interpolated image frame using a synthesis based on both the rendered tiles of the first image frame and the rendered some tiles among all tiles of the second image frame, and displaying the first image frame, the generated interpolated image frame and the second image frame of the application data on a display screen.
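  • The following is a minimal illustrative sketch, not the claimed implementation, of the synthesis described above: all tiles of the first image frame (Frame N) are available, only some tiles of the second image frame (Frame N + 1) have been rendered, and an interpolated frame is produced by overwriting the corresponding regions of Frame N with the newly rendered tiles. The tile size, grid layout, and numpy representation are assumptions chosen for illustration.

```python
import numpy as np

TILE = 64  # assumed square tile size in pixels

def synthesize_interpolated_frame(frame_n, partial_tiles_n1, cols):
    """frame_n: fully rendered previous frame (H x W x 3 array).
    partial_tiles_n1: dict mapping tile index -> rendered TILE x TILE x 3 tile of frame N+1.
    cols: number of tile columns in the frame."""
    interp = frame_n.copy()                    # start from the fully rendered previous frame
    for idx, tile in partial_tiles_n1.items():
        r, c = divmod(idx, cols)               # tile position in the tile grid
        y, x = r * TILE, c * TILE
        interp[y:y + TILE, x:x + TILE] = tile  # overwrite with the newer, already-rendered tile
    return interp

# Example: a 4 x 4 tile frame where only tile 0 of frame N+1 is ready
frame_n = np.zeros((256, 256, 3), dtype=np.uint8)
tiles_n1 = {0: np.full((TILE, TILE, 3), 255, dtype=np.uint8)}
interpolated = synthesize_interpolated_frame(frame_n, tiles_n1, cols=4)
```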
  • embodiments of the present invention provide various effects and/or features.
  • performance such as QoS can be maintained or improved while the overall device environment (e.g., hardware capacity, power, temperature, etc.) is also improved by processing application data in software.
  • a cost increase due to hardware can be minimized by maintaining or improving application-data processing performance in software, without hardware capacity improvement or added components, and a user's desire to purchase the device can be increased by improving the user's satisfaction with it.
  • FIG. 1 illustrates a schematic diagram of a service system according to one embodiment of the present invention
  • FIG. 2 illustrates a block diagram of a digital device according to one embodiment of the present invention
  • FIG. 3 illustrates block diagrams of a different configuration and a detailed configuration of the digital device shown in FIG. 2;
  • FIG. 4 illustrates a block diagram of an external device according to one embodiment of the present invention
  • FIG. 5 illustrates a perspective view of one example of a watch-type mobile terminal 300 according to an embodiment of the present disclosure
  • FIG. 6 illustrates a diagram of a control means for controlling a digital device according to one embodiment of the present invention
  • FIG. 7 illustrates a diagram to describe a method of processing application data in a digital device according to one embodiment of the present invention
  • FIG. 8 illustrates a diagram to describe a method of processing application data in a digital device according to another embodiment of the present invention.
  • FIG. 9 illustrates a block diagram for configuration of a digital device for processing an image frame according to one embodiment of the present invention.
  • FIG. 10 illustrates a block diagram for details of configuration of an image processing unit shown in FIG. 9;
  • FIG. 11 and FIG. 12 illustrate diagrams to describe a method of calculating a rendering priority for tiles of a next image frame based on motion prediction data according to one embodiment of the present invention
  • FIG. 13 and FIG. 14 illustrate diagrams to describe a method of synthesizing an interpolated image frame using tile(s) preferentially processed in a corresponding image frame (Frame N + 1) and a previous image frame (Frame N) in the course of rendering according to one embodiment of the present invention
  • FIG. 15 illustrates a diagram to describe an image frame interpolating method according to one embodiment of the present invention
  • FIG. 16 illustrates a diagram to describe an image frame interpolating method according to another embodiment of the present invention.
  • FIG. 17 illustrates a diagram to describe a scheme of processing a real image according to the present invention
  • FIG. 18 illustrates a diagram to describe the determination of the number of image frames to be interpolated between an image frame (Frame N) and an image frame (Frame N + 1) on the basis of motion prediction data according to one embodiment of the present invention
  • FIG. 19 illustrates a flowchart for a method of processing data in a digital device according to one embodiment of the present invention
  • FIG. 20 illustrates a block diagram for image processing configuration of a digital device for processing application data according to one embodiment of the present invention
  • FIG. 21 illustrates a diagram to describe details of a processing procedure between a control unit 2020 and a tile order selecting unit 2030 shown in FIG. 20;
  • FIG. 22 illustrates a diagram to describe a tile rendering sequence according to one embodiment of the present invention
  • FIG. 23 illustrates a diagram to describe a method of synthesizing an intermediate image frame according to one embodiment of the present invention
  • FIG. 24 illustrates a block diagram for image processing configuration of a digital device for processing application data according to another embodiment of the present invention.
  • FIG. 25 illustrates a diagram to describe a method of processing image frames in accordance with a tile rendering sequence according to one embodiment of the present invention
  • FIG. 26 illustrates a diagram to describe a method of synthesizing an interpolated image frame using tile(s) preferentially processed in a corresponding image frame (Frame N + 1) and a previous image frame (Frame N) in the course of rendering according to one embodiment of the present invention
  • FIG. 27 illustrates a diagram to describe an image frame interpolating method according to one embodiment of the present invention
  • FIG. 28 illustrates a diagram to describe an image frame interpolating method according to another embodiment of the present invention.
  • FIG. 29 illustrates a diagram to describe the determination of the number of image frames to be interpolated between an image frame (Frame N) and an image frame (Frame N + 1) on the basis of motion prediction data according to one embodiment of the present invention.
  • FIG. 30 illustrates a flowchart for a method of processing data in a digital device according to one embodiment of the present invention.
  • a suffix such as “module” and “unit” may be used to refer to elements or components. Use of such a suffix herein is merely intended to facilitate description of the disclosure, and the suffix itself is not intended to give any special meaning or function. Meanwhile, ordinal terms such as ‘first’ and ‘second’ may carry the meaning of an order, yet they can also be used simply to distinguish one component from another component where the two might otherwise be confused.
  • a digital device may be any device that can handle any one of transmitting, receiving, handling and outputting data, content, service, application, and so forth.
  • the digital device may be connected to other digital devices through wired network or wireless network, paired or connected to an external server, and through the connections, the digital device may transmit and receive the prescribed data.
  • Examples of the digital device may include standing devices such as a network television (TV), a Hybrid Broadcast Broadband TV (HBBTV), a smart TV, an Internet Protocol TV (IPTV), and a personal computer (PC), or mobile (or handheld) devices such as a Personal Digital Assistant (PDA), a smart phone, a tablet PC, or a notebook computer.
  • In this disclosure, a digital TV is depicted in FIGs. 2-3 and a mobile device in FIGs. 4-5 as examples of the digital device. Further, the digital device in this disclosure may refer to a configuration comprising only a panel, to a set-top box (STB), or to a SET including the entire system.
  • the wired or wireless network described in this disclosure may refer to various pairing methods and standard telecommunication network protocols supported for transmitting and receiving data between digital devices, or between a digital device and an external server.
  • the wired or wireless network also includes telecommunication network protocols supported at present as well as those to be supported in the future.
  • wired or wireless network examples include wired networks supported by various telecommunication standards such as Universal Serial Bus (USB), Composite Video Banking Sync (CVBS), Component, S-Video (analog), Digital Visual Interface (DVI), High Definition Multimedia Interface (HDMI), RGB, D-SUB and so forth, and wireless networks supported by various standards including Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Digital Living Network Alliance (DLNA), Wireless LAN (WLAN)(Wi-Fi), Wireless broadband (Wibro), World Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA), Long Term Evolution/LTE-Advanced (LTE/LTE-A), Wi-Fi Direct, and so forth.
  • In this disclosure, a reference simply to the digital device can mean a standing device or a mobile device depending on the context; when no specific device is indicated, the digital device refers to both standing and mobile devices.
  • the digital device may perform intelligent functions such as receiving a broadcast program, performing computer functions, and supporting at least one external input, and by being connected through the network, wired or wirelessly, the digital device may support e-mail functions, web browsing functions, banking, gaming, and executing applications.
  • the digital device may further include an interface for any one of input or control means supporting a handwriting input, a touch-screen, and a spatial remote control.
  • the digital device may use a standard operating system (OS); however, the digital device described in this disclosure and its embodiments uses a Web OS. Therefore, the digital device can add, delete, amend, and update various services and applications on a standard universal OS kernel or a Linux kernel in order to construct a more user-friendly environment.
  • the external input includes the external input devices described above, meaning all input mechanisms or digital devices capable of transmitting and receiving data to and from the digital device through a wired or wireless network connection.
  • the external input includes High Definition Multimedia Interface (HDMI), game devices such as PlayStation or X-Box, smart phones, tablet PCs, printing devices such as pocket photo printers, and digital devices such as smart TVs and Blu-ray devices.
  • the “server” referred to as in this disclosure includes a digital device or a system capable of transmitting and receiving data to and from a client, and may also be referred to as a processor.
  • the server may be servers providing services such as a portal server providing a web page, a web content or a web service, an advertising server providing advertising data, a content server, a Social Network Service (SNS) server providing a SNS service, a service server providing a service by a manufacturer, a Multichannel Video Programming Distributor (MVPD) providing a Video on Demand (VoD) or a streaming service, and a service server providing pay services.
  • FIG. 1 illustrates a schematic diagram of a service system according to one embodiment of the present disclosure.
  • the service system may basically include a server 105 and a DTV 110.
  • the DTV 110 may be replaced with a mobile device (e.g., a smart phone) 120 or a wearable device 130.
  • the service system further includes the mobile device 120 or the wearable device 130.
  • the DTV 110 can be controlled by a controlling means including a dedicated remote controller 115, and the like.
  • FIG. 2 illustrates a block diagram of the digital device according to one embodiment of the present disclosure.
  • the digital device 200 can correspond to the DTV 110 shown in FIG. 1.
  • the digital device 200 can include a network interface unit 201, a TCP/IP (Transfer Control Protocol/Internet Protocol) manager 202, a service delivery manager 203, an SI (System Information, Service Information or Signaling Information) decoder 204, a demultiplexer 205, an audio decoder 206, a video decoder 207, a display A/V (Audio/Video) and OSD (On Screen Display) module 208, a service control manager 209, a service discovery manager 210, a SI&metadata database (DB) 211, a metadata manager 212, a service manager 213, a UI (User Interface) manager 214, etc.
  • the network interface unit (or a receiving unit) 201 can receive or transmit IP packets or IP datagrams (hereinafter, referred as IP packets) through an accessed network.
  • the network interface unit 201 can receive service, application, content, etc., from a service provider through the network.
  • the TCP/IP manager 202 is involved in a packet delivery of IP packets transmitted to the digital device 200 and IP packets transmitted from the digital device 200 between a source and a destination.
  • the TCP/IP manager 202 may classify received packets according to an appropriate protocol and output the classified packets to the service delivery manager 203, the service discovery manager 210, the service control manager 209, the metadata manager 212, etc.
  • the service delivery manager 203 can control received service data.
  • the service delivery manager 203 can use a Real-Time Protocol/Real-Time Control Protocol (RTP/RTCP) to control real-time streaming data.
  • the service delivery manager 203 can parse a received real-time streaming data packet, transmitted based on the RTP, and transmit the parsed data packet to the demultiplexer 205 or store the parsed data packet in the SI & metadata DB 211 under the control of the service manager 213.
  • the service delivery manager 203 can provide feedback of the network reception information to the server based on the RTCP.
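  • As a hedged sketch of the real-time streaming handling mentioned above, the fragment below parses the fixed 12-byte RTP header (RFC 3550) before handing the payload on, roughly the step a service delivery manager performs on a received RTP packet; the function name and return format are assumptions, and header extensions are ignored.

```python
import struct

def parse_rtp_packet(packet: bytes):
    """Parse the fixed RTP header and return (header_fields, payload)."""
    if len(packet) < 12:
        raise ValueError("too short for an RTP header")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    header = {
        "version": b0 >> 6,
        "padding": (b0 >> 5) & 1,
        "extension": (b0 >> 4) & 1,
        "csrc_count": b0 & 0x0F,
        "marker": b1 >> 7,
        "payload_type": b1 & 0x7F,
        "sequence": seq,
        "timestamp": ts,
        "ssrc": ssrc,
    }
    payload = packet[12 + 4 * header["csrc_count"]:]  # skip the CSRC list (extension ignored)
    return header, payload
```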
  • the demultiplexer 205 can demultiplex audio data, video data, SI data from a received packet and transmit the demultiplexed data to each of the audio/video decoder 206/207 and the SI decoder 204.
  • the SI decoder 204 can decode the demultiplexed SI data such as program specific information (PSI), program and system information protocol (PSIP), digital video broadcast-service information (DVB-SI), digital television terrestrial multimedia broadcasting/coding mobile multimedia broadcasting (DTMB/CMMB), etc.
  • the SI decoder 204 can store the decoded SI data in the SI & metadata DB 211.
  • the SI data stored in the SI & metadata DB 211 can be read and extracted by a component which requires the SI data according to user request, for example.
  • the audio decoder 206 and the video decoder 207 can decode the demultiplexed audio and video data, respectively.
  • the decoded audio data and video data can be displayed on a display screen of the display unit 208.
  • the application manager can include the service manager 213 and the UI manager 214, for example.
  • the application manager can perform a function of the controller of the digital device 200. In other words, the application manager can administrate the overall state of the digital device 200, provide a UI, and manage other managers.
  • the UI manager 214 can provide a graphic user interface (GUI)/UI for the user using OSD, etc. And, the UI manager 214 can receive a key input from the user and perform an operation of the device in response to the received key input. For example, the UI manager 214 can transmit a key input signal to the service manager 213 if the key input signal of selecting a channel is received from the user.
  • the service manager 213 can control service-related managers such as the service delivery manager 203, the service discovery manager 210, the service control manager 209, and the metadata manager 212.
  • the service manager 213 can generate a channel map and control a channel selection using the generated channel map according to the received key input from the UI manager 214.
  • the service manager 213 can receive service information from the SI decoder 204 and set audio/video PID (packet identifier) of a selected channel to the demultiplexer 205.
  • the set audio/video PID can be used for the demultiplexing procedure.
  • the demultiplexer 205 can filter the audio data, video data and SI data using the PID (PID filtering or section filtering).
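  • A minimal sketch of the PID filtering just mentioned: scan fixed 188-byte MPEG-2 transport-stream packets and keep only those whose 13-bit PID matches the audio/video PIDs set by the service manager. Function and variable names are assumptions for illustration.

```python
TS_PACKET_SIZE = 188

def filter_by_pid(ts_data: bytes, wanted_pids: set):
    """Return the TS packets whose PID is in wanted_pids."""
    packets = []
    for off in range(0, len(ts_data) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = ts_data[off:off + TS_PACKET_SIZE]
        if pkt[0] != 0x47:                       # 0x47 sync byte marks a valid TS packet
            continue
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]    # 13-bit PID from bytes 1-2
        if pid in wanted_pids:
            packets.append(pkt)
    return packets

# Usage: filter_by_pid(stream_bytes, {0x0100, 0x0101})  # hypothetical video/audio PIDs
```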
  • the service discovery manager 210 can provide information required to select a service provider that provides a service. Upon receipt of a signal for selecting a channel from the service manager 213, the service discovery manager 210 discovers or searches a service based on the received signal.
  • the service control manager 209 can select and control a service.
  • the service control manager 209 can perform service selection and control using IGMP or real time streaming protocol (RTSP) when the user selects a live broadcast service, and using RTSP when the user selects a VOD service.
  • the RTSP can provide a trick mode for the real-time streaming.
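  • As a rough illustration of RTSP-based service control for a VOD service, the sketch below sends DESCRIBE, SETUP and PLAY requests over TCP (default port 554). The URL, transport parameters and response handling are simplified assumptions; a real client must parse each reply and carry the Session header returned by SETUP into subsequent requests.

```python
import socket

def rtsp_request(sock, method, url, cseq, extra=""):
    """Send one RTSP request and return the raw reply text."""
    msg = f"{method} {url} RTSP/1.0\r\nCSeq: {cseq}\r\n{extra}\r\n"
    sock.sendall(msg.encode("ascii"))
    return sock.recv(4096).decode("ascii", "replace")

def play_vod(url="rtsp://example.com/vod/movie", host="example.com", port=554):
    with socket.create_connection((host, port), timeout=5) as sock:
        print(rtsp_request(sock, "DESCRIBE", url, 2, "Accept: application/sdp\r\n"))
        print(rtsp_request(sock, "SETUP", url + "/trackID=1", 3,
                           "Transport: RTP/AVP;unicast;client_port=5000-5001\r\n"))
        print(rtsp_request(sock, "PLAY", url, 4, "Range: npt=0-\r\n"))  # needs the Session header in practice
```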
  • the service manager 213 can initialize and manage a session through the IMS (IP Multimedia Subsystem) gateway 250 using IMS and SIP (Session Initiation Protocol).
  • the metadata manager 212 can manage metadata regarding services and store metadata in the SI & metadata DB 211.
  • the SI & metadata DB 211 can store SI data decoded by the SI decoder 204, metadata managed by the metadata manager 212, and information required to select a service provider, which is provided by the service discovery manager 210.
  • the SI & metadata DB 211 can store system set-up data, etc.
  • the SI & metadata DB 211 can be implemented using a Non-Volatile RAM (NVRAM) or a Flash memory, and the like.
  • An IMS gateway 250 can be a gateway that collects functions required to access IPTV services based on an IMS.
  • FIG. 3a illustrates a block diagram of a digital device according to another embodiment of the present disclosure.
  • the digital device can include a broadcast receiving unit 305, an external device interface 316, a storage unit 318, a user input interface 320, a controller 325, a display unit 330, an audio output unit 335, a power supply unit 340, and a photographing unit (not shown).
  • the broadcast receiving unit 305 can include at least one of tuner 310 and a demodulator 312, and a network interface 314.
  • the broadcast receiving unit 305 can include the tuner 310 and the demodulator 312 without the network interface 314, or can include the network interface 314 without the tuner 310 and the demodulator 312.
  • the broadcast receiving unit 305 can include a multiplexer (not shown) to multiplex a signal, which is demodulated by the demodulator 312 via the tuner 310, and a signal received through the network interface 314.
  • the broadcast receiving unit 305 can include a demultiplexer (not shown) and demultiplex a multiplexed signal, a demodulated signal, or a signal received through the network interface 314.
  • the tuner 310 can receive a radio frequency (RF) broadcast signal, through an antenna, by tuning to a channel selected by the user or all previously stored channels. Also, the tuner 310 can convert the received RF broadcast signal into an IF (Intermediate Frequency) signal or a baseband signal.
  • the tuner 310 can process both the digital broadcast signal and the analog broadcast signal.
  • the analog baseband image or a voice signal output from the tuner 310 can be directly input to the controller 325.
  • the tuner 310 can receive a RF broadcast signal of single carrier or multiple carriers.
  • the tuner 310 can sequentially tune to and receive the RF broadcast signals of all broadcast channels stored by a channel memory function among the RF broadcast signals received through an antenna. And, the tuner 310 can convert the received RF broadcast signal into a DIF (digital intermediate frequency) signal.
  • the demodulator 312 receives the DIF signal, demodulates the received DIF signal, and performs a channel decoding, etc.
  • the demodulator 312 includes a trellis decoder, a de-interleaver, a Reed-Solomon decoder, etc., or includes a convolution decoder, the de-interleaver, the Reed-Solomon decoder, etc.
  • the demodulator 312 can output a transport stream (TS) after performing demodulation and channel decoding.
  • the TS signal can be a signal in which a video signal, an audio signal or a data signal is multiplexed.
  • the TS signal can be an MPEG-2 TS in which an MPEG-2 standard video signal, a Dolby (AC-3 standard) audio signal, etc. are multiplexed.
  • a TS signal output from the demodulator 312 can be input to the controller 325.
  • the controller 325 can control demultiplexing, processing audio/video signal, etc. Furthermore, the controller 325 can control outputting video through the display unit 330 and outputting audio through the audio output unit 335.
  • the external device interface 316 can provide an environment for interfacing external devices with the digital device.
  • the external device interface 316 can include an A/V input/output unit (not shown) or an RF communication unit (not shown).
  • the external device interface 316 can be connected with external devices such as a digital versatile disk (DVD), a Blu-ray player, a game device, a camera, a camcorder, a computer (including a notebook computer), a tablet PC, a smart phone, a Bluetooth device, a Cloud server and the like in a wired/wireless manner.
  • the external device interface 316 transfers, to the controller 325 of the digital device, a signal including image data, video data and audio data input through an external device connected to the digital device.
  • the controller 325 can control to output the signal including the processed image data, the processed video data and the processed audio data to the connected external device.
  • the external device interface 316 can further include an A/V input/output unit or a wireless communication unit (not shown).
  • the A/V input/output unit may include a USB terminal, a CVBS terminal, a component terminal, an S-video terminal (analog), a DVI terminal, a HDMI terminal, an RGB terminal, a D-SUB terminal, etc.
  • the RF communication unit can perform near field communication.
  • the digital device can be networked with other electronic apparatuses according to communication protocols such as Bluetooth, RFID, IrDA, UWB, ZigBee, and DLNA, for example.
  • the external device interface 316 can connect to an STB via at least one of the interfaces described above, and perform input/output operations with the connected STB.
  • the external device interface 316 can receive an application or an application list stored in a nearby external device, and can transfer the application or the application list to the controller 325 or the storage unit 318.
  • the network interface 314 may provide an interface for connecting the digital device to wired/wireless networks.
  • the digital receiver can transmit/receive data to/from other users or other electronic apparatuses or access a predetermined web page through a network connected thereto or another network linked to the connected network.
  • the network interface 314 can selectively receive a desired application from among publicly open applications through a network.
  • the network interface 314 can select a desired application among publicly open applications and receive the selected application via a network.
  • the storage unit 318 may store programs for signal processing and control and store a processed video, audio or data signal.
  • the storage unit 318 may execute a function of temporarily storing a video, audio or data signal input from the external device interface 316 or the network interface 314.
  • the storage unit 318 may store information about a predetermined broadcast channel through a channel memory function.
  • the storage unit 318 can store an application or a list of applications input from the external device interface 316 or the network interface 314.
  • the storage unit 318 may store various platforms which will be described later.
  • the storage unit 318 can include storage media of one or more types, such as a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), RAM, EEPROM, etc.
  • the digital device may reproduce content files (e.g., a video file, a still image file, a music file, a text file, an application file, etc.) and provide them to the user.
  • FIG. 3a illustrates an embodiment in which the storage unit 318 is separated from the controller 325
  • the configuration of the digital device is not limited thereto and the storage unit 318 may be included in the controller 325.
  • the user input interface 320 may transmit a signal input by the user to the controller 325 or deliver a signal output from the controller 325 to the user.
  • the user input interface 320 can receive control signals such as a power on/off signal, a channel selection signal, an image setting signal, etc. from the remote controller or transmit control signals of the controller 325 to the remote controller according to various communication schemes such as RF communication, IR communication, and the like.
  • control signals such as a power on/off signal, a channel selection signal, an image setting signal, etc. from the remote controller or transmit control signals of the controller 325 to the remote controller according to various communication schemes such as RF communication, IR communication, and the like.
  • the user input interface 320 can transmit control signals input through a power key, a channel key, a volume key, and a local key (not shown) of a set value to the controller 325.
  • the user input interface 320 can transmit a control signal input from a sensing unit (not shown) which senses a gesture of the user or deliver a signal of the controller 325 to the sensing unit.
  • the sensing unit may include a touch sensor, a voice sensor, a position sensor, an action sensor, an acceleration sensor, a gyro sensor, a speed sensor, a tilt sensor, a temperature sensor, a pressure or back-pressure sensor, etc.
  • the controller 325 can generate and output a signal for video or audio output by demultiplexing streams input through the tuner 310, the demodulator 312 or the external device interface 316 or processing demultiplexed signals.
  • a video signal processed by the controller 325 can be input to the display unit 330 and displayed as an image through the display unit 330.
  • the video signal processed by the controller 325 can be input to an external output device through the external device interface 316.
  • An audio signal processed by the controller 325 can be applied to the audio output unit 335. Otherwise, the audio signal processed by the controller 325 can be applied to an external output device through the external device interface 316.
  • the controller 325 may include a demultiplexer and an image processor, which are not shown in FIG. 3a.
  • the controller 325 can control the overall operation of the digital device. For example, the controller 325 can control the tuner 310 to tune to an RF broadcast corresponding to a channel selected by the user or a previously stored channel.
  • the controller 325 can control the digital device according to a user command input through the user input interface 320 or an internal program. Particularly, the controller 325 can control the digital device to be linked to a network to download an application or application list that the user desires to the digital device.
  • the controller 325 may control the tuner 310 to receive a signal of a channel selected in response to a predetermined channel selection command received through the user input interface 320.
  • the controller 325 may process a video, audio or data signal corresponding to the selected channel.
  • the controller 325 may control information on a channel selected by the user to be output with a processed video or audio signal through the display unit 330 or the audio output unit 335.
  • the controller 325 may control a video signal or an audio signal received from an external apparatus, for example, a camera or a camcorder through the external device interface 316 to be output through the display unit 330 or the audio output unit 335 according to an external device image reproduction command received through the user input interface 316.
  • the controller 325 can control the display unit 330 to display images.
  • the controller 325 can control a broadcast image input through the tuner 310, an external input image received through the external device interface 316, an image input through the network interface 314, or an image stored in the storage unit 318 to be displayed on the display unit 330.
  • an image displayed on the display unit 330 can be a still image or video, and it can be a 2D or 3D image.
  • the controller 325 can control reproduction of content.
  • the content may be content stored in the digital device, received broadcast content, or content input from an external device.
  • the content may include at least one of a broadcast image, an external input image, an audio file, a still image, an image of a linked web, and a text file.
  • the controller 325 can control display of applications or an application list, downloadable from the digital device or an external network, when an application view menu is selected.
  • the controller 325 can control installation and execution of applications downloaded from an external network in addition to various UIs. Furthermore, the controller 325 can control an image relating to an application executed by user selection to be displayed on the display unit 330.
  • the digital device may further include a channel browsing processor (not shown) which generates a thumbnail image corresponding to a channel signal or an external input signal.
  • the channel browsing processor can receive a stream signal (e.g., TS) output from the demodulator 312 or a stream signal output from the external device interface 316 and extract an image from the received stream signal to generate a thumbnail image.
  • the generated thumbnail image can be directly input to the controller 325 or can be encoded and then input to the controller 325.
  • the thumbnail image can be coded into a stream and then applied to the controller 325.
  • the controller 325 can display a thumbnail list including a plurality of thumbnail images on the display unit 330 using thumbnail images input thereto.
  • the thumbnail images included in the thumbnail list can be updated sequentially or simultaneously. Accordingly, the user can conveniently check content of a plurality of broadcast channels.
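  • A hedged sketch of the thumbnail generation described above: once a frame has been extracted and decoded from the stream (decoding itself is outside this sketch), it can be reduced to a thumbnail by simple block averaging. The scale factor and numpy representation are assumptions.

```python
import numpy as np

def make_thumbnail(frame: np.ndarray, factor: int = 8) -> np.ndarray:
    """Downscale an H x W x C frame by averaging factor x factor blocks."""
    h, w, c = frame.shape
    h, w = h - h % factor, w - w % factor                 # crop to a multiple of the factor
    blocks = frame[:h, :w].reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3)).astype(np.uint8)      # one averaged pixel per block
```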
  • the display unit 330 may convert a video signal, a data signal, and an OSD signal processed by the controller 325 and a video signal and a data signal received from the external device interface 316 into RGB signals to generate driving signals.
  • the display unit 330 may be a PDP, an LCD, an OLED, a flexible display, a 3D display or the like.
  • the display unit 330 may be configured as a touch-screen and used as an input device rather than an output device.
  • the audio output unit 335 receives a signal audio-processed by the controller 325, for example, a stereo signal, a 3.1 channel signal or a 5.1 channel signal, and outputs the received signal as audio.
  • the audio output unit 335 can be configured as one of various speakers.
  • the digital device may further include the sensing unit for sensing a gesture of the user, which includes at least one of a touch sensor, a voice sensor, a position sensor, and an action sensor, as described above.
  • a signal sensed by the sensing unit can be delivered to the controller 325 through the user input interface 320.
  • the digital device may further include the photographing unit for photographing the user.
  • Image information acquired by the photographing unit can be supplied to the controller 325.
  • the controller 325 may sense a gesture of the user from an image captured by the photographing unit or a signal sensed by the sensing unit, or by combining the image and the signal.
  • the power supply unit 340 may supply power to the digital device.
  • the power supply unit 340 can supply power to the controller 325 which can be implemented as a system-on-chip (SoC), the display unit 330 for displaying images, and the audio output unit 335 for audio output.
  • the power supply unit 340 can include a converter (not shown) converting an alternating current (AC) source into a direct current (DC) source.
  • the power supply unit 340 can include an inverter (not shown) which is capable of performing a Pulse Width Modulation (PWM) for changing or dimming a luminance.
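  • As a simple illustration of PWM-based dimming, the sketch below maps a target luminance to a duty cycle and produces one on/off period of the drive signal; the resolution and function name are assumptions for illustration, not the inverter's actual control logic.

```python
def pwm_period(luminance: float, steps: int = 100):
    """Return one PWM period as a list of 1 (on) and 0 (off) samples."""
    duty = max(0.0, min(1.0, luminance))      # clamp the requested luminance to [0, 1]
    on_steps = round(duty * steps)
    return [1] * on_steps + [0] * (steps - on_steps)

# Usage: pwm_period(0.3) -> 30 "on" samples followed by 70 "off" samples (30% brightness)
```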
  • the remote control device 345 may transmit user input to the user input interface 320.
  • the remote controller can use Bluetooth, RF communication, IR communication, UWB, ZigBee, etc.
  • the remote control device 345 can receive audio, video or data signal output from the user input interface 320 and display the received signal or output the same as audio or vibration.
  • the above-mentioned digital device can be a digital broadcast receiver which is capable of processing a digital broadcast signal of a fixed or mobile ATSC method, or a digital broadcast signal of a DVB method.
  • FIG. 3b illustrates a block diagram of a detailed configuration of the controller shown in FIG. 2 to FIG. 3a according to one embodiment of the present disclosure.
  • the digital receiver may include a demultiplexer 350, an image processor, an OSD generator 366, a mixer 370, a frame rate converter (FRC) 380, and an output formatter (or a 3D formatter) 390.
  • the demultiplexer 350 can demultiplex an input stream signal into an MPEG-2 TS image, an audio signal and a data signal, for example.
  • the image processor can process a demultiplexed image signal using a video decoder 362 and a scaler 364.
  • the video decoder 362 can decode the demultiplexed image signal and the scaler 364 can scale the resolution of the decoded image signal such that the image signal can be displayed.
  • the video decoder 362 can support various standards. For example, the video decoder 362 can perform a function as an MPEG-2 decoder when the video signal is coded in the MPEG-2 standard. The video decoder 362 can perform a function as an H.264 decoder when the video signal is coded in a digital multimedia broadcasting (DMB) method or the H.264/H.265 standard.
  • the image signal decoded by the image processor may be input to the mixer 370.
  • the OSD generator 366 may generate OSD data automatically or according to user input. For example, the OSD generator 366 may generate data to be displayed on the screen of an output unit in the form of an image or text on the basis of a control signal of a user input interface.
  • OSD data generated by the OSD generator 366 may include various data such as a UI image of the digital receiver, various menu screens, widget, icons, and information on ratings.
  • the OSD generator 366 can generate a caption of a broadcast image or data for displaying EPG based broadcast information.
  • the mixer 370 may mix the OSD data generated by the OSD generator 366 and the image signal processed by the image processor.
  • the mixer 370 may provide the mixed signal to the output formatter 390.
  • OSD may be overlaid on a broadcast image or external input image.
  • the FRC 380 may convert a frame rate of input video.
  • the frame rate converter 380 can convert the frame rate of an input 60Hz video to a frame rate of 120Hz or 240Hz, according to an output frequency of the output unit.
  • the FRC 380 may be bypassed when frame conversion is not executed.
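  • A minimal sketch of the frame-rate conversion described above: repeat (or, in more elaborate FRCs, interpolate) frames so that 60 Hz input becomes 120 Hz or 240 Hz output. Plain frame repetition is shown as the simplest assumption; motion-compensated interpolation is not modeled here.

```python
def convert_frame_rate(frames, in_hz=60, out_hz=120):
    """Raise the frame rate by repeating each input frame an integer number of times."""
    if out_hz % in_hz != 0:
        raise ValueError("this sketch supports integer rate multiples only")
    repeat = out_hz // in_hz
    out = []
    for frame in frames:
        out.extend([frame] * repeat)   # each input frame is shown 'repeat' times
    return out

# Usage: convert_frame_rate(["f0", "f1"], 60, 240) -> ["f0", "f0", "f0", "f0", "f1", ...]
```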
  • the output formatter 390 may change the output of the FRC 380, which is input thereto, into a form suitable for the output format of the output unit.
  • the output formatter 390 can output an RGB data signal.
  • this RGB data signal can be output according to low voltage differential signaling (LVDS) or mini-LVDS.
  • the output formatter 390 can format the 3D image signal such that the 3D image signal is matched to the output format of the output unit, to thereby support a 3D service.
  • An audio processor may audio-process a demultiplexed audio signal.
  • the audio processor can support various audio formats. For example, when audio signals are encoded in MPEG-2, MPEG-4, advanced audio coding (AAC), high efficiency-AAC (HE-AAC), AC-3 and bit sliced audio coding (BSAC) formats, the audio processor can include decoders corresponding to the formats to process the audio signals.
  • the audio processor can control bass, treble and volume.
  • a data processor (not shown) can process a demultiplexed data signal.
  • the data processor can decode the encoded demultiplexed data signal.
  • the encoded data signal may be EPG information including broadcast information such as the start time and end time (or duration) of a broadcast program which is broadcast through each channel.
  • each component can be integrated, added or omitted according to the capability of the digital device as actually implemented. That is, if necessary, at least two components can be united into a single component, or a single component can be divided into at least two components. Also, the function performed by each block is described to explain an embodiment of the present disclosure, and its specific operation or device does not limit the scope of the present disclosure.
  • the digital device can be an image signal processing device for performing signal processing on an input image or an image stored in the device.
  • the image signal processing device can be an STB which does not include the display unit and the audio output unit shown above, a DVD player, a Blu-ray player, a game device, a computer, etc.
  • FIG. 4 illustrates a block diagram illustrating a digital device according to another embodiment of the present disclosure.
  • FIGs. 2 through 3 explained above refer to a standing device as an embodiment of the digital device, while FIGs. 4 through 5 refer to a mobile device as another embodiment of the digital device.
  • the mobile terminal 400 can include a wireless communication unit 410, an A/V input unit 420, a user input unit 430, a sensing unit 440, an output unit 450, a memory 460, an interface unit 470, a controller 480, and a power supply unit 490.
  • the wireless communication unit 410 typically includes one or more components which permit wireless communication between the mobile terminal 400 and a wireless communication system or network within which the mobile terminal 400 is located.
  • the wireless communication unit 410 can include a broadcast receiving module 411, a mobile communication module 412, a wireless Internet module 413, a short-range communication module 414, and a position-location module 415.
  • the broadcast receiving module 411 receives a broadcast signal and/or broadcast associated information from an external broadcast managing server via a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • At least two broadcast receiving modules 411 can be provided in the mobile terminal 400 to facilitate simultaneous reception of at least two broadcast channels or broadcast channel switching.
  • the broadcast associated information includes information associated with a broadcast channel, a broadcast program, or a broadcast service provider. Furthermore, the broadcast associated information can be provided via a mobile communication network. In this case, the broadcast associated information can be received by the mobile communication module 412.
  • broadcast associated information can be implemented in various forms.
  • broadcast associated information may include an electronic program guide (EPG) and an electronic service guide (ESG).
  • the broadcast receiving module 411 may be configured to receive broadcast signals transmitted from various types of broadcast systems.
  • broadcasting systems may include digital video broadcasting-Terrestrial (DVB-T), DVB-Satellite (DVB-S), DVB-Handheld (DVB-H), DVB-Convergence of Broadcasting and Mobile Services (DVB-CBMS), Open Mobile Alliance Broadcast (OMA-BCAST), the data broadcasting system known as media forward link only (MediaFLOTM) and integrated services digital broadcast-terrestrial (ISDB-T).
  • the broadcast receiving module 411 can be configured to be suitable for other broadcasting systems as well as the above-noted digital broadcasting systems.
  • the broadcast signal and/or broadcast associated information received by the broadcast receiving module 411 may be stored in a suitable device, such as the memory 460.
  • the mobile communication module 412 transmits/receives wireless signals to/from one or more network entities (e.g., a base station, an external terminal, and/or a server) via a mobile network such as GSM (Global System for Mobile communications), CDMA (Code Division Multiple Access), or WCDMA (Wideband CDMA).
  • the wireless Internet module 413 supports Internet access for the mobile terminal 400. This module may be internally or externally coupled to the mobile terminal 400.
  • the wireless Internet technology can include Wi-Fi, Wibro, Wimax, or HSDPA.
  • the short-range communication module 414 facilitates relatively short-range communications. Suitable technologies for implementing this module include RFID, IrDA, UWB, as well as the networking technologies commonly referred to as Bluetooth™ and ZigBee™, to name a few.
  • the position-location module 415 identifies or otherwise obtains the location of the mobile terminal 400. According to one embodiment, this module may be implemented with a global positioning system (GPS) module.
  • the GPS module 415 can precisely calculate current 3-dimensional (3D) position information based on at least longitude, latitude or altitude and direction (or orientation) by calculating distance information and precise time information from at least three satellites and then applying triangulation to the calculated information. Location information and time information are calculated using three satellites, and errors of the calculated location and time information are then corrected using another satellite.
  • the GPS module 415 can calculate speed information by continuously calculating a real-time current location.
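  • As an illustrative sketch of how the speed information mentioned above can be derived, the fragment below computes the great-circle (haversine) distance between two consecutive GPS fixes and divides it by the elapsed time; the Earth-radius constant and function name are assumptions for illustration.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres

def speed_mps(lat1, lon1, t1, lat2, lon2, t2):
    """Average speed between two timestamped latitude/longitude fixes, in m/s."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    distance = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))   # haversine distance
    return distance / (t2 - t1)

# Usage: speed_mps(37.0, 127.0, 0.0, 37.0005, 127.0, 10.0) is roughly 5.6 m/s
```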
  • the A/V input unit 420 is configured to provide audio or video signal input to the mobile terminal 400.
  • the A/V input unit 420 includes a camera 421 and a microphone 422.
  • the camera 421 receives and processes image frames of still pictures or video, which are obtained by an image sensor in a video call mode or a photographing mode. Furthermore, the processed image frames can be displayed on the display 451.
  • the image frames processed by the camera 421 can be stored in the memory 460 or can be transmitted to an external recipient via the wireless communication unit 410.
  • at least two cameras 421 can be provided in the mobile terminal 400 according to the environment of usage.
  • the microphone 422 receives an external audio signal while the portable device is in a particular mode, such as phone call mode, recording mode and voice recognition. This audio signal is processed and converted into electronic audio data. The processed audio data is transformed into a format transmittable to a mobile communication base station via the mobile communication module 412 in a call mode.
  • the microphone 422 typically includes assorted noise removing algorithms to remove noise generated in the course of receiving the external audio signal.
  • the user input unit 430 generates input data responsive to user manipulation of an associated input device or devices.
  • Examples of such devices include a keypad, a dome switch, a touchpad (e.g., static pressure/capacitance), a jog wheel, and a jog switch.
  • the sensing unit 440 provides sensing signals for controlling operations of the mobile terminal 400 using status measurements of various aspects of the mobile terminal. For instance, the sensing unit 440 may detect an open/closed status of the mobile terminal 400, the relative positioning of components (e.g., a display and keypad) of the mobile terminal 400, a change of position (or location) of the mobile terminal 400 or a component of the mobile terminal 400, a presence or absence of user contact with the mobile terminal 400, and an orientation or acceleration/deceleration of the mobile terminal 400. As an example, a mobile terminal 400 configured as a slide-type mobile terminal is considered. In this configuration, the sensing unit 440 may sense whether a sliding portion of the mobile terminal is open or closed.
  • the sensing unit 440 senses the presence or absence of power provided by the power supply unit 490, and the presence or absence of a coupling or other connection between the interface unit 470 and an external device.
  • the sensing unit 440 can include a proximity sensor 441.
  • the output unit 450 generates output relevant to the senses of sight, hearing, and touch. Furthermore, the output unit 450 includes the display 451, an audio output module 452, an alarm unit 453, a haptic module 454, and a projector module 455.
  • the display 451 is typically implemented to visually display (output) information associated with the mobile terminal 400. For instance, if the mobile terminal is operating in a phone call mode, the display will generally provide a UI or GUI which includes information associated with placing, conducting, and terminating a phone call. As another example, if the mobile terminal 400 is in a video call mode or a photographing mode, the display 451 may additionally or alternatively display images which are associated with these modes, the UI or the GUI.
  • the display module 451 may be implemented using known display technologies. These technologies include, for example, a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode display (OLED), a flexible display and a three-dimensional display.
  • the mobile terminal 400 may include one or more of such displays.
  • Some of the displays can be implemented in a transparent or optical transmittive type, i.e., a transparent display.
  • a representative example of the transparent display is the TOLED (transparent OLED).
  • a rear configuration of the display 451 can be implemented as the optical transmittive type as well. In this configuration, a user can see an object located at the rear of a terminal body on a portion of the display 451 of the terminal body.
  • At least two displays 451 can be provided in the mobile terminal 400 in accordance with one embodiment of the mobile terminal 400.
  • a plurality of displays can be arranged to be spaced apart from each other or to form a single body on a single face of the mobile terminal 400.
  • a plurality of displays can be arranged on different faces of the mobile terminal 400.
  • if the display 451 and a sensor for detecting a touch action are configured as a mutual layer structure (hereinafter called a ‘touch screen’), the display 451 is usable as an input device as well as an output device.
  • the touch sensor can be configured as a touch film, a touch sheet, or a touchpad.
  • the touch sensor can be configured to convert pressure applied to a specific portion of the display 451 or a variation of capacitance generated from a specific portion of the display 451 to an electronic input signal. Moreover, the touch sensor is configurable to detect pressure of a touch as well as a touched position or size.
  • if a touch input is made to the touch sensor, a signal(s) corresponding to the touch input is transferred to a touch controller.
  • the touch controller processes the signal(s) and then transfers the processed signal(s) to the controller 480. Therefore, the controller 480 is made aware when a prescribed portion of the display 451 is touched.
  • a proximity sensor 441 can be provided at an internal area of the mobile terminal 400 enclosed by the touch screen or around the touch screen.
  • the proximity sensor 441 is a sensor that detects a presence or non-presence of an object approaching a prescribed detecting surface or an object existing (or located) around the proximity sensor 441 using an electromagnetic field strength or infrared ray without mechanical contact.
  • the proximity sensor 441 is more durable than a contact type sensor and also has utility broader than the contact type sensor.
  • the proximity sensor 441 can include one of a transmittive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a radio frequency oscillation proximity sensor, an electrostatic capacity proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. If the touch screen includes the electrostatic capacity proximity sensor, it is configured to detect the proximity of a pointer using a variation of an electric field according to the proximity of the pointer. In this configuration, the touch screen (touch sensor) can be considered as the proximity sensor.
  • an action for enabling the pointer approaching the touch screen to be recognized as placed on the touch screen may be named ‘proximity touch’ and an action of enabling the pointer to actually come into contact with the touch screen may be named ‘contact touch’.
  • a position at which the proximity touch is made to the touch screen using the pointer means the position of the pointer vertically corresponding to the touch screen when the pointer makes the proximity touch.
  • the proximity sensor detects a proximity touch and a proximity touch pattern (e.g. , a proximity touch distance, a proximity touch duration, a proximity touch position, a proximity touch shift state).
  • Information corresponding to the detected proximity touch action and the detected proximity touch pattern can be output to the touch screen.
  • the audio output module 452 functions in various modes including a call-receiving mode, a call-placing mode, a recording mode, a voice recognition mode, and a broadcast reception mode to output audio data which is received from the wireless communication unit 410 or is stored in the memory 460. During operation, the audio output module 452 outputs audio relating to a particular function (e.g. , call received, message received).
  • the audio output module 452 may be implemented using one or more speakers, buzzers, other audio producing devices, and combinations of these devices.
  • the alarm unit 453 outputs a signal for announcing the occurrence of a particular event associated with the mobile terminal 400. Typical events include a call received, a message received and a touch input received.
  • the alarm unit 453 can output a signal for announcing the event occurrence by way of vibration as well as video or audio signal.
  • the video or audio signal can be output via the display 451 or the audio output module 452.
  • the display 451 or the audio output module 452 can be regarded as a part of the alarm unit 453.
  • the haptic module 454 generates various tactile effects that can be sensed by a user. Vibration is a representative one of the tactile effects generated by the haptic module 454.
  • the strength and pattern of the vibration generated by the haptic module 454 are controllable. For instance, different vibrations can be output by being synthesized (or composited) together or can be output in sequence.
  • the haptic module 454 can generate various tactile effects as well as the vibration.
  • the haptic module 454 may generate an effect attributed to the arrangement of pins vertically moving against a contact skin surface, an effect attributed to the injection/suction power of air though an injection/suction hole, an effect attributed to the skim over a skin surface, an effect attributed to a contact with an electrode, an effect attributed to an electrostatic force, and an effect attributed to the representation of a hot/cold sense using an endothermic or exothermic device.
  • the haptic module 454 can be implemented to enable a user to sense the tactile effect through a muscle sense of a finger or an arm as well as to transfer the tactile effect through direct contact.
  • at least two haptic modules 454 can be provided in the mobile terminal 400 in accordance with an embodiment of the mobile terminal 400.
  • the memory 460 is generally used to store various types of data to support the processing, control, and storage requirements of the mobile terminal 400. Examples of such data include program instructions for applications operating on the mobile terminal 400, contact data, phonebook data, messages, audio, still pictures (or photo), and moving pictures. Furthermore, a recent use history or a cumulative use frequency of each data (e.g. , use frequency for each phonebook, each message or each multimedia file) can be stored in the memory 460. Moreover, data for various patterns of vibration and/or sound output in response to a touch input to the touch screen can be stored in the memory 460.
  • the memory 460 may be implemented using any type or combination of suitable volatile and non-volatile memory or storage devices including hard disk, random access memory (RAM), static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk, multimedia card micro type memory, card-type memory (e.g. , SD memory or XD memory), or other similar memory or data storage device.
  • the mobile terminal 400 can operate in association with a web storage for performing a storage function of the memory 460 on the Internet.
  • the interface unit 470 may be implemented to couple the mobile terminal 400 with external devices.
  • the interface unit 470 receives data from the external devices or is supplied with power and then transfers the data or power to the respective elements of the mobile terminal 400 or enables data within the mobile terminal 400 to be transferred to the external devices.
  • the interface unit 470 may be configured using a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for coupling to a device having an identity module, audio input/output ports, video input/output ports, and/or an earphone port.
  • the identity module is a chip for storing various kinds of information for authenticating a usage authority of the mobile terminal 400 and can include a User Identify Module (UIM), a Subscriber Identity Module (SIM), and/or a Universal Subscriber Identity Module (USIM).
  • a device having the identity module (hereinafter called ‘identity device’) can be manufactured as a smart card. Therefore, the identity device is connectible to the mobile terminal 400 via the corresponding port.
  • When the mobile terminal 400 is connected to an external cradle, the interface unit 470 becomes a passage for supplying the mobile terminal 400 with power from the cradle or a passage for delivering various command signals input from the cradle by a user to the mobile terminal 400.
  • Each of the various command signals input from the cradle or the power can operate as a signal enabling the mobile terminal 400 to recognize that it is correctly loaded in the cradle.
  • the controller 480 typically controls the overall operations of the mobile terminal 400.
  • the controller 480 performs the control and processing associated with voice calls, data communications, and video calls.
  • the controller 480 may include a multimedia module 481 that provides multimedia playback.
  • the multimedia module 481 may be configured as part of the controller 480, or implemented as a separate component.
  • the controller 480 can perform a pattern (or image) recognizing process for recognizing a writing input and a picture drawing input performed on the touch screen as characters or images, respectively.
  • the power supply unit 490 provides power required by various components of the mobile terminal 400.
  • the power may be internal power, external power, or combinations of internal and external power.
  • Various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or some combination of computer software and hardware.
  • the embodiments described herein may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a selective combination thereof.
  • Such embodiments may also be implemented by the controller 480.
  • the embodiments described herein may be implemented with separate software modules, such as procedures and functions, each of which performs one or more of the functions and operations described herein.
  • the software codes can be implemented with a software application written in any suitable programming language and may be stored in memory such as the memory 460, and executed by a controller or processor, such as the controller 480.
  • FIG. 5 illustrates a perspective view of one example of a watch-type mobile terminal 500 according to an embodiment of the present disclosure.
  • the watch-type mobile terminal 500 includes a main body 501 with a display unit 551 and a band 502 connected to the main body 501 to be wearable on a wrist.
  • mobile terminal 500 may be configured to include features that are the same as or similar to those of mobile terminal 400 of FIG. 4.
  • the main body 501 may include a case having a certain appearance. As illustrated, the case may include a first case 501a and a second case 501b cooperatively defining an inner space for accommodating various electronic components. Other configurations are possible. For instance, a single case may alternatively be implemented, with such a case being configured to define the inner space, thereby implementing a mobile terminal 500 with a uni-body.
  • the watch-type mobile terminal 500 can perform wireless communication, and an antenna for the wireless communication can be installed in the main body 501.
  • the antenna may extend its function using the case.
  • a case including a conductive material may be electrically connected to the antenna to extend a ground area or a radiation area.
  • the display unit 551 is shown located at the front side of the main body 501 so that displayed information is viewable to a user.
  • the display unit 551 includes a touch sensor so that the display unit can function as a touch screen.
  • window 551a is positioned on the first case 501a to form a front surface of the terminal body together with the first case 501a.
  • the illustrated embodiment includes audio output module 552, a camera 521, a microphone 522, and a user input unit 523 positioned on the main body 501.
  • When the display unit 551 is implemented as a touch screen, additional function keys may be minimized or eliminated.
  • the user input unit 523 may be omitted.
  • the band 502 is commonly worn on the user's wrist and may be made of a flexible material for facilitating wearing of the device.
  • the band 502 may be made of fur, rubber, silicon, synthetic resin, or the like.
  • the band 502 may also be configured to be detachable from the main body 501. Accordingly, the band 502 may be replaceable with various types of bands according to a user's preference.
  • the band 502 may be used for extending the performance of the antenna.
  • the band may include therein a ground extending portion (not shown) electrically connected to the antenna to extend a ground area.
  • the band 502 may include fastener 502a.
  • the fastener 502a may be implemented into a buckle type, a snap-fit hook structure, a Velcro® type, or the like, and include a flexible section or material.
  • the drawing illustrates an example that the fastener 502a is implemented using a buckle.
  • FIG. 6 illustrates a diagram illustrating a controlling means of a digital device according to one embodiment of the present disclosure.
  • various user interface devices which can communicate with a digital receiver 600 in a wired/wireless manner can be used as remote controllers.
  • UIDs can include a mobile device (e.g. , a smart phone, a tablet PC, and the like), a magic remote controller 620 and a remote controller 630 equipped with a keyboard and a touch pad in addition to a general remote controller 610.
  • the remote controllers can use various communication protocols such as Bluetooth, RFID, IrDA, UWB, ZigBee, DLNA, etc.
  • the magic remote controller 620 may include a gyro sensor mounted therein to sense vibration of a user’s hand or rotation. That is, the magic remote controller 620 can move a pointer according to up, down, left and right motions of the user such that the user can easily execute a desired action, for example, easily control a channel or a menu.
  • the remote controller 630 including the keyboard and touch pad can facilitate text input through the keyboard and control of movement of a pointer and magnification and reduction of a picture or video through the touch pad.
  • the remote controller 630 equipped with the keyboard has been implemented so that it is similar to a PC keyboard for convenient text input, because the traditional remote controller 610 is no longer sufficient to control the digital device 600: the digital device 600 offers more than broadcast programs, having advanced into an intelligent integrated digital device providing a web browser, applications, social network services (SNS), and the like.
  • control means such as the remote controller 610, the pointing device 620, and the keyboard 630 can, if necessary, include a touchpad to more conveniently control functions such as text input, pointer movement, and enlargement/reduction of pictures and video clips.
  • a digital device may include a receiving unit configured to receive application data, a controller configured to render tiles of a first image frame of the received application data, render some tiles among all tiles of a second image frame of the received application data, and generate an interpolated image frame using a synthesis based on both the rendered tiles of the first image frame and the rendered some tiles among all tiles of the second image frame, and a display configured to display the first image frame, the generated interpolated image frame and the second image frame of the application data on a display screen.
  • the present disclosure is described by taking an FRUC (frame rate up-conversion) method as one example.
  • the FRUC method is usable for GPU (graphic processing unit), CPU (central processing unit), video decoding, image signal processing, and the like to process or display image frames of a digital device.
  • the FRUC method according to the present disclosure can use at least one of a rendering scheme using a motion vector and a tile rendering scheme.
  • the former, the rendering scheme using a motion vector, is a scheme of estimating a motion of an object in an image frame and then generating a new image frame using it.
  • the latter, the tile rendering scheme, is a scheme of processing an image frame by tile units and then using the processed tile data.
  • FIG. 7 illustrates a diagram to describe a method of processing application data in a digital device according to one embodiment of the present disclosure
  • FIG. 8 is a diagram to describe a method of processing application data in a digital device according to another embodiment of the present disclosure.
  • power consumption of an image processing unit varies depending on a workload increment/decrement of the image processing unit according to a frame rate for processing an image frame of an application. In other words, if a frame rate increases, a power consumption quantity of the image processing unit increases as well.
  • this is a burden on a power-consumption-sensitive device such as a mobile device and becomes a limitation when the capacity, power consumption, temperature, and the like of the hardware are considered.
  • a flickering or lagging effect occurs due to the insufficient hardware capacity.
  • if a device can provide a visual quality sufficient to satisfy a user while the frame rate is lowered, the workload on an image processing unit can be reduced. Through this, power consumption can be saved.
  • improving the hardware capacity of an image processing unit brings accompanying problems or considerations such as increased power consumption. Therefore, the present disclosure intends to improve the user's perceived performance through the FRUC method in software, i.e., a tile rendering scheme, leaving aside considerations for the hardware part.
  • the present disclosure uses a tile rendering scheme.
  • an image frame is processed in a manner of dividing the image frame by a unit of a predetermined number of tiles and processing the image frame by tile units instead of rendering a whole image frame at a time.
  • efficiency of memory bandwidth, data caching or the like can be raised.
  • an image frame is rendered within an image processing unit up to a final pixel by tile unit and is then saved to a frame buffer assigned to an external memory (e.g. , DRAM).
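  • As a minimal illustration of the tile rendering scheme described above, the following Python sketch divides a frame into a grid of tiles, renders each tile into a tile buffer, and flushes the completed tiles to a frame buffer. The tile size, frame dimensions, and the render_tile body are assumptions chosen only for the example; they are not values from the disclosure.

```python
# Minimal sketch of tile-based processing: the frame is divided into a fixed
# grid of tiles and each tile is rendered on its own, so partially rendered
# results are available per tile before the whole frame is finished.
import numpy as np

FRAME_W, FRAME_H = 96, 64
TILE_W, TILE_H = 32, 32          # assumed tile size

def tile_grid(frame_w, frame_h, tile_w, tile_h):
    """Yield (index, x, y, w, h) for every tile of the frame."""
    idx = 0
    for y in range(0, frame_h, tile_h):
        for x in range(0, frame_w, tile_w):
            yield idx, x, y, min(tile_w, frame_w - x), min(tile_h, frame_h - y)
            idx += 1

def render_tile(idx, w, h):
    """Placeholder per-tile rendering: returns final pixel data for one tile."""
    return np.full((h, w, 3), idx * 10 % 255, dtype=np.uint8)

frame_buffer = np.zeros((FRAME_H, FRAME_W, 3), dtype=np.uint8)
tile_buffer = {}                  # tile index -> (x, y, rendered pixel block)

for idx, x, y, w, h in tile_grid(FRAME_W, FRAME_H, TILE_W, TILE_H):
    tile_buffer[idx] = (x, y, render_tile(idx, w, h))

# Once every tile is rendered, the tile buffer is flushed to the frame buffer.
for idx, (x, y, pixels) in tile_buffer.items():
    h, w = pixels.shape[:2]
    frame_buffer[y:y + h, x:x + w] = pixels
print("tiles rendered:", len(tile_buffer))
```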
  • FIG. 7a shows a first object 710 and a second object 720
  • FIG. 7b shows tile data 730 including the first object 710 and the second object 720
  • the tile data 730 shown in FIG. 7b is saved to a tile buffer.
  • such tile data is the same data as tile data 730-1 in a whole image frame 740 saved to a frame buffer shown in FIG. 7c.
  • pixel data included in one or more tiles completely processed for a single image frame is final data.
  • since such tile data or pixel data has already been processed, it can be used before the processing of the whole image data is completed. Therefore, according to the present disclosure, using such tile or pixel data, it is possible to generate a new image frame between an already-rendered image frame N and a currently rendered image frame N+1 without an effect such as lagging or the like.
  • the tile rendering scheme is a scheme of interpolating or inserting (hereinafter named ‘interpolation’) one or more newly generated image frames between an image frame (Frame N) and an image frame (Frame N+1) based on tile-rendered data.
  • in the image frame interpolating process, there is a method of using a motion vector or/and an edge detection in the image frame tile rendering processing.
  • a method using edge detection is taken as one example in the present disclosure.
  • the edge detection scheme is used for an object in an image frame, and more particularly, for a dynamic object. In other words, the property that edges are mainly detected in a dynamic object region is used.
  • a presence or non-presence of a motion of an object in a frame is predicted.
  • motion prediction data for an object in a frame is generated.
  • a digital device can further generate type data of the object.
  • a type of the object may include one of a solid figure (e.g., a cubic diagram), a complex figure, a text, and the like.
  • the digital device can determine a rendering priority for tiles of a next image frame.
  • the digital device updates or transmits the determined tile rendering priority data for the corresponding image frame to an image processing unit.
  • the image processing unit Based on the transmitted or updated tile rendering priority data, the image processing unit renders tiles of the corresponding image frame.
  • the rendered tile data can be saved to a tile buffer. If the rendering of all tiles in the image frame is completed, the corresponding data can be saved to the frame buffer. In particular, the tile data rendered by tile units is temporarily saved to the tile buffer and the image frame data rendered by image frame units is temporarily saved to the frame buffer.
  • the digital device can set a predetermined threshold at the tile buffer.
  • a threshold may vary by an image frame unit for example. In doing so, in the setting process, motion prediction data or/and type data for a corresponding image frame can be referred to.
  • the digital device determines whether the tile data of the corresponding image frame saved to the tile buffer exceeds the threshold. As a result of the determination, if the number of tile data saved to the tile buffer exceeds the threshold, the digital device generates a new image frame by synthesizing (or compositing) a previous image frame, which is already saved to the frame buffer after completing the rendering of all tiles, with the tile data saved to the tile buffer. In doing so, since the location within the image frame of each tile data saved to the tile buffer is known, when the digital device synthesizes the tile data with the previous image frame, the digital device can generate the new image frame by overwriting the corresponding tile data extracted from the tile buffer on tiles of the previous image frame.
  • the new image frame generation can be performed when it is time to display an image frame or when the number of tile data saved to the tile buffer exceeds the threshold.
  • one or more new image frames can be generated according to the above-described synthesis (or composition).
  • the generated new image frame(s) can be displayed between the image frame (Frame N) and the image frame (Frame N+1).
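  • The synthesis step above can be sketched as follows: once the tiles already rendered for the next image frame exceed a threshold, they are overwritten by tile location onto a copy of the fully rendered previous frame to form the interpolated frame. The threshold value and tile geometry below are illustrative assumptions.

```python
# Hedged sketch of the interpolation step: tiles of Frame N+1 that are already
# rendered are overwritten onto a copy of the complete Frame N to form the
# intermediate Frame N+0.5. Threshold and tile geometry are assumed values.
import numpy as np

TILE = 32
THRESHOLD = 3                      # assumed: minimum rendered tiles before synthesis

def synthesize_intermediate(frame_n, rendered_tiles):
    """rendered_tiles: {tile_index: (x, y, pixel_block)} from the tile buffer."""
    if len(rendered_tiles) <= THRESHOLD:
        return None                # not enough Frame N+1 data yet
    frame_mid = frame_n.copy()     # start from the previous, complete frame
    for _, (x, y, pixels) in rendered_tiles.items():
        h, w = pixels.shape[:2]
        frame_mid[y:y + h, x:x + w] = pixels   # overwrite by tile location
    return frame_mid

frame_n = np.zeros((64, 96, 3), dtype=np.uint8)
tiles_n1 = {i: (i * TILE, 0, np.full((TILE, TILE, 3), 200, np.uint8)) for i in range(3)}
print(synthesize_intermediate(frame_n, tiles_n1))   # None: threshold not exceeded
tiles_n1[3] = (0, TILE, np.full((TILE, TILE, 3), 120, np.uint8))
frame_mid = synthesize_intermediate(frame_n, tiles_n1)
print(frame_mid.shape)             # (64, 96, 3): interpolated frame generated
```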
  • a digital device performs an edge detection on the image frame 810 using an edge detector. From the edge detection data 812, the digital device calculates a tile rendering priority 822 for tiles of a next image frame (Frame N+1). In doing so, the calculated tile rendering priority 822 can be set in a manner that tile(s) containing an object in the edge detection data has a priority. Thus, once the tile rendering priority 822 is determined, the digital device renders tiles of the corresponding image frame 820 based on the tile rendering priority 822.
  • the digital device saves the tile-rendered tile data to a tile buffer. If the number of the tile data saved to the tile buffer exceeds the aforementioned threshold, the digital device synthesizes tiles of a previous image frame and the tile data saved to the tile buffer together using a tile compositor (FIG. 8b) and then generates a new image frame (Frame N+0.5) between the image frame (Frame N) and an image frame (Frame N+1) (FIG. 8c).
  • the edge detection is performed and a tile rendering priority for a next image frame is then determined.
  • the tile rendering priority of the next image frame (Frame N+1), determined after rendering the image frame (Frame N), and the tile rendering priority of the next image frame (Frame N+2), determined after rendering the image frame (Frame N+1), may differ from each other. This change is caused by the motion prediction data, the type data, or the like of an object in each image frame.
  • FIG. 9 illustrates a block diagram for configuration of a digital device for processing an image frame according to one embodiment of the present disclosure
  • FIG. 10 illustrates a block diagram for details of configuration of an image processing unit shown in FIG. 9.
  • FIG. 9 and FIG. 10 show the processing configurations for rendering image frames particularly in association with the present disclosure.
  • a digital device is configured to include an image processing unit 910, an edge data processing unit 920, a tile data processing unit 930, a synthesizing unit 940, a control unit 950 and the like.
  • the image processing unit 910 includes an application, a graphics library, a driver, a GPU (hardware), and the like, and performs image processing, i.e., a rendering of an image frame by tile units.
  • the edge data processing unit 920 predicts a variation between image frames, a configuration, and the like using edge data on an image frame (Frame N) rendered by the image processing unit 910, generates data of the prediction, and then forwards the generated data to the tile data processing unit 930.
  • the edge data processing unit 920 is configured to include an edge detector 922, a motion predicting unit 924, an object sorting unit 926 and the like.
  • the edge detector 922 generates edge data through edge detecting from the image frame (Frame N) rendered by the image processing unit 910.
  • a single image frame can be divided into 9 tiles (Tile #1 to Tile #9). This is for clarity of the description only and does not limit the number of tiles.
  • the edge detector 922 performs a pixel based edge detection by sequentially receiving an input of pixel data for each tile by starting with Tile #1 of the rendered image frame.
  • the image frame is rendered by tile units.
  • the edge detector 922 can perform the edge detection without delay by first processing a first rendered tile result data (starting with Tile #1) (FIFO: first input first output). Yet, it may be able to extract an edge by a plurality of tile units using a buffer according to a system environment.
  • the edge detector 922 saves the edge data per tile index to a motion histogram database (DB) 1010.
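  • The per-tile edge extraction and the motion histogram database can be illustrated roughly as follows; a simple gradient magnitude stands in for the actual edge detector 922, and the histogram binning and record layout are assumptions made for the sketch.

```python
# Rough sketch of per-tile edge extraction feeding a per-tile-index record
# store (the "motion histogram database"). A plain gradient magnitude stands
# in for the real edge detector; binning and record layout are assumptions.
import numpy as np

def tile_edge_record(tile_pixels, bins=8):
    gray = tile_pixels.mean(axis=2)
    h_energy = float(np.abs(np.diff(gray, axis=1)).sum())  # horizontal gradients
    v_energy = float(np.abs(np.diff(gray, axis=0)).sum())  # vertical gradients
    hist, _ = np.histogram(np.abs(np.diff(gray, axis=1)), bins=bins, range=(0, 255))
    return {"h_energy": h_energy, "v_energy": v_energy, "hist": hist}

motion_histogram_db = {}        # tile index -> list of (frame index, edge record)

def record_edges(frame_index, tile_index, tile_pixels):
    motion_histogram_db.setdefault(tile_index, []).append(
        (frame_index, tile_edge_record(tile_pixels)))

tile = np.random.randint(0, 255, (32, 32, 3)).astype(np.uint8)
record_edges(0, 1, tile)        # record edge data for Tile #1 of Frame 0
print(motion_histogram_db[1][0][1]["h_energy"])
```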
  • the motion predicting unit 924 generates motion prediction data by predicting a variation between image frames based on the generated edge data.
  • the motion prediction data is represented as a normalized value, which can be represented as a value between 0 and 1 according to a presence or non-presence of a motion. This is for clarity of the description only, and the motion prediction data is not limited to such a numerical value or scheme.
  • the motion predicting unit 924 predicts a motion variation by tile units using a histogram of an edge recorded per tile index in the motion histogram database (DB) 1010.
  • the motion predicting unit 924 predicts the extent of variation per frame region in each image frame by comparing the relative difference between the edge variation detected from a tile of the Nth frame and that of the tile having the same index in the previous (N-1)th image frame with the variation histogram of the tile of the corresponding index.
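  • A hedged sketch of the per-tile motion prediction: the edge record of a tile in the current frame is compared with the record of the tile having the same index in the previous frame, and the relative difference is normalized to a value between 0 and 1. The normalization rule is an assumption; the disclosure only states that a value between 0 and 1 is produced.

```python
# Illustrative per-tile motion prediction: compare the edge record of a tile in
# the current frame with the record of the same tile index in the previous
# frame and normalize the relative difference to [0, 1].
def predict_motion(prev_record, curr_record):
    prev = prev_record["h_energy"] + prev_record["v_energy"]
    curr = curr_record["h_energy"] + curr_record["v_energy"]
    denom = max(prev, curr, 1e-6)
    return min(abs(curr - prev) / denom, 1.0)   # 0: static tile, 1: strong change

prev = {"h_energy": 1000.0, "v_energy": 400.0}
curr = {"h_energy": 1600.0, "v_energy": 500.0}
print(round(predict_motion(prev, curr), 3))      # 0.333 for these sample values
```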
  • the object sorting unit 926 generates type information by sorting configuration information of an object in an image frame, i.e. , a type of the object based on the generated edge data and/or the generated motion prediction data.
  • the object sorting unit 926 sorts a result of a rendered tile into a type (solid figure, a complex figure, a text, etc.) by analyzing a vertical and/or horizontal edge information quantity in a tile.
  • Such type data may be generated by a tile unit. Meanwhile, the sorted information is saved to a database (DB), which can be referred to for the tile priority determination.
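  • The tile type sorting can be illustrated as below, where the amounts of horizontal and vertical edge information in a tile decide whether it is labeled a solid figure, a complex figure, or text. The thresholds and the mixing criterion are assumptions chosen only for illustration.

```python
# Rough sketch of tile type sorting from horizontal/vertical edge quantities.
# Threshold values and the "heavily mixed edges means text" criterion are
# illustrative assumptions, not values defined by the disclosure.
def sort_tile_type(h_energy, v_energy, low=500.0, high=5000.0):
    total = h_energy + v_energy
    if total < low:
        return "solid figure"        # few edges: mostly flat content
    mix = min(h_energy, v_energy) / max(h_energy, v_energy, 1e-6)
    if total > high and mix > 0.6:
        return "text"                # dense, heavily mixed H/V edges
    return "complex figure"

print(sort_tile_type(100, 50))       # solid figure
print(sort_tile_type(4000, 3800))    # text (under the assumed thresholds)
```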
  • the tile data processing unit 930 determines priorities of tiles, which are to be rendered, in an image frame (Frame N+1) by receiving the motion prediction data generated by the motion predicting unit 924 and/or the type information generated by the object sorting unit 926. To this end, the tile data processing unit 930 includes a tile rendering priority calculating unit 934, a tile rendering priority updating unit 936 and the like.
  • the tile rendering priority calculating unit 934 calculates rendering priorities of tiles in the image frame (Frame N+1) based on the received motion prediction data, the received type information and the like.
  • the tile rendering priority updating unit 936 compares the tile rendering priority data of the image frame (Frame N+1) calculated by the tile rendering priority calculating unit 934 to a pre-generated tile rendering priority data. If there is a difference, the tile rendering priority updating unit 936 updates the tile rendering priority data. The updated tile rendering priority data is forwarded to the GPU so as to perform a rendering on the tiles of the image frame (Frame N+1).
  • the GPU keeps saving the tile data, which is rendered according to the tile rendering priority, to the tile buffer. If the rendering of all tiles in the image frame (Frame N+1) is completed, the tile buffer saves the corresponding data to the frame buffer by forwarding it to the frame buffer.
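  • A minimal sketch of the priority calculation and update around the units 934 and 936, assuming that a higher predicted motion means a higher rendering priority; the scoring rule is an assumption, since the disclosure only requires that rendering priorities be derived from the motion prediction data and compared against the previously generated priorities.

```python
# Minimal sketch: rendering priorities for the tiles of Frame N+1 are derived
# from per-tile motion prediction (higher predicted motion is rendered first)
# and compared with the previously generated priority list; the new order is
# kept only when it differs. The scoring rule is an illustrative assumption.
def calculate_priorities(motion_by_tile):
    # Tile indices sorted by descending predicted motion.
    return sorted(motion_by_tile, key=lambda t: motion_by_tile[t], reverse=True)

def update_priorities(previous_order, new_order):
    # Update only when the newly calculated order differs from the stored one.
    return new_order if new_order != previous_order else previous_order

motion = {0: 0.1, 1: 0.9, 2: 0.4, 3: 0.0}
prev_order = [0, 1, 2, 3]
print(update_priorities(prev_order, calculate_priorities(motion)))  # [1, 2, 0, 3]
```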
  • the synthesizing unit 940 generates a new image frame (Frame N+0.5) by synthesizing the data of the previous image frame (Frame N) saved to the frame buffer with the tile data of the image frame (Frame N+1) saved to the tile buffer and then saves the generated frame to the frame buffer.
  • control unit 950 may include a CPU of the digital device or an image processing control component configured separately.
  • FIG. 11 and FIG. 12 illustrate diagrams to describe a method of calculating a rendering priority for tiles of a next image frame based on motion prediction data according to one embodiment of the present disclosure.
  • the tile data processing unit 930 determines the priorities of the tiles, which are to be rendered in the image frame (Frame N+1), by receiving the motion prediction data and/or the type information.
  • the tile data processing unit 930 includes the tile rendering priority calculating unit 934, the tile rendering priority updating unit 936 and the like.
  • the tile rendering priority calculating unit 934 calculates rendering priorities of tiles in the image frame (Frame N+1) based on the received motion prediction data, the received type information and the like.
  • the tile rendering priority updating unit 936 compares the tile rendering priority data of the image frame (Frame N+1) calculated by the tile rendering priority calculating unit 934 to a pre-generated tile rendering priority data. If there is a difference, the tile rendering priority updating unit 936 updates the tile rendering priority data. So to speak, the tile rendering priority updating unit 936 excludes currently rendered tile(s) and re-designates rendering priorities for the rest of tile(s). In doing so, the image processing unit 910 does not stop or delay the rendering.
  • the updated tile rendering priority data is forwarded to the GPU and the rendering of tiles of the image frame (Frame N+1) is performed.
  • referring to FIG. 11a, in the case of a horizontal component edge, when an object 1110 makes a horizontal movement, it is highly probable that the variation of a tile 1120 is considerable in comparison with a vertical movement.
  • an update for left and right tiles 1130 and 1140 of the tile including the corresponding edge is necessary.
  • a left-to-right range to be designated can be variably designated in accordance with rigidity of an edge.
  • since the probability of movement is lowered in proportion to the distance from the original location, a priority can be lowered correspondingly.
  • each tile can be prioritized based on various data such as a distance or location from a reference tile among such tiles, a motion prediction of an object and the like.
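  • The neighbouring-tile prioritization just described can be sketched as follows: the tile containing the moving edge is taken as the reference, and tiles to its left and right receive priorities that decay with distance. The decay rule and grid layout are assumptions made for the example.

```python
# Illustrative priority assignment: the tile with the moving horizontal edge is
# the reference, and its left/right neighbours get priorities that decay with
# distance. The decay rule and the reach are assumptions for this sketch.
def neighbour_priorities(ref_col, num_cols, reach=2):
    priorities = {}
    for col in range(num_cols):
        dist = abs(col - ref_col)
        if dist <= reach:
            priorities[col] = 1.0 / (1 + dist)   # farther tiles get lower priority
    return priorities

print(neighbour_priorities(ref_col=3, num_cols=6))
# {1: 0.333..., 2: 0.5, 3: 1.0, 4: 0.5, 5: 0.333...}
```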
  • if a horizontal component edge and a vertical component edge are mixed in a complicated manner in a tile, the tile can be determined to have a text pattern.
  • in the case of a text pattern, if only some tile(s) are updated, it may be difficult to obtain the corresponding meaning.
  • therefore, all the tiles including the text pattern can be designated.
  • a priority of each of the designated tiles may be identical.
  • a priority of each tile can be defined in various ways as well as by the aforementioned priority calculation scheme, which is non-limited by the disclosure or description in the present disclosure.
  • a specific tile may have a top priority by user’s setting or the like.
  • priorities of tiles may refer to relative locations or distances from other high-priority tiles, by which the present disclosure is non-limited. For instance, it is not necessary to assign a high priority to a tile neighboring a reference tile. Hence, a priority may not be affected by the priority calculation despite the tile failing to neighbor or touch a reference tile in an image frame.
  • FIG. 13 and FIG. 14 illustrate diagrams to describe a method of synthesizing an interpolated image frame using tile(s) preferentially processed in a corresponding image frame (Frame N+1) and a previous image frame (Frame N) in the course of rendering according to one embodiment of the present disclosure.
  • the synthesizing unit 940 generates a new image frame (Frame N+0.5) by synthesizing the data of the previous image frame (Frame N) saved to the frame buffer with the tile data of the image frame (Frame N+1) saved to the tile buffer and then saves the generated frame to the frame buffer.
  • the synthesizing unit 940 synthesizes data of a previously processed Nth image frame with a result of tiles of a currently rendered N+1th image frame.
  • All image frames are processed by being subdivided into a plurality of tile units. And, each processing-completed tile is saved to a frame buffer. If the total number of tiles of the processed N+1th image reaches a threshold defined in a system, it is synthesized with the data of the Nth image frame without a rendering delay of the N+1th image.
  • seam artifacts 1410 such as a cutting plane due to a rapid difference (e.g., color, motion, etc.) on a boundary interface of a tile may occur.
  • various kinds of interpolation functions can be supported and selection or release is possible according to adaptive information.
  • each tile is not synthesized unconditionally by referring to a tile index. Instead, whether to skip or update a synthesis of a corresponding tile is determined based on an image frame variation between tiles of a same index and the made determination is followed.
  • a digital device can enable/disable FRUC function in a manner of determining prediction data of an image frame variation between image frames, frame configuration (type) data, a presence or non-presence (i.e., activation/deactivation) of optimization of FRUC function for an image frame outputted on the basis of hardware system environment information, an interpolation frame rate, and the like, or in a manner that a user directly configures global settings.
  • FIG. 15 illustrates a diagram to describe an image frame interpolating method according to one embodiment of the present disclosure.
  • referring to FIG. 15, it is possible to reduce power consumption through an image frame interpolation according to the present disclosure. For instance, if the frame rate of an image processing unit (GPU) exceeds 60 fps, as shown in FIG. 15a, a digital device can reduce power consumption as mentioned in the foregoing description.
  • FIG. 16 illustrates a diagram to describe an image frame interpolating method according to another embodiment of the present disclosure.
  • FIG. 16 shows that QoS can be improved through an image frame interpolation according to the present disclosure. For instance, under the v-sync restriction, if a frame rate of an image processing unit is smaller than 60fps, as shown in FIG. 16a, a digital device can improve QoS.
  • FIG. 17 illustrates a diagram to describe a scheme of processing a real image according to the present disclosure.
  • FIG. 17 shows one example of images for generating an interpolated image frame (Frame N+0.5) through a synthesis based on motion prediction data, tile configuration data and the like using a tile rendering scheme between an image frame (Frame N) and an image frame (Frame N+1) according to the present disclosure mentioned in the foregoing description.
  • FIG. 18 illustrates a diagram to describe the determination of the number of image frames to be interpolated between an image frame (Frame N) and an image frame (Frame N+1) on the basis of motion prediction data according to one embodiment of the present disclosure.
  • FIG. 18a illustrates a case of a single image frame 1830 to be interpolated between an image frame (Frame N) 1810 and an image frame (Frame N+1) 1820.
  • FIG. 18b shows a case of two image frames 1840 and 1850 to be interpolated between an image frame (Frame N) 1810 and an image frame (Frame N+1) 1820.
  • the number of image frame(s) to be interpolated may be set to be always constant in advance or may be randomly changed between every image frames, for example.
  • the number of the image frames to be interpolated can be adaptively determined. This may be determined on the basis of motion prediction data calculated by a motion predicting unit. Besides, data of an edge detector and an object sorting unit can be referred to.
  • as for the number of image frames to be interpolated, or the ratio of normal image frames to interpolated image frames: if the change between predicted frames becomes smaller, if the configuration of an image frame becomes more complicated, or if the text content decreases, the number can be lowered or the ratio can be raised, for example.
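  • One possible, purely illustrative policy for adaptively choosing the number of interpolated frames from the motion prediction data is sketched below; the disclosure only states that the count is determined adaptively, so the specific cut-off values are assumptions.

```python
# One possible, purely illustrative policy for choosing how many frames to
# interpolate between Frame N and Frame N+1 from per-tile motion prediction
# data. The cut-off values and the rule itself are assumptions for this sketch.
def frames_to_interpolate(motion_by_tile, max_frames=2):
    avg = sum(motion_by_tile.values()) / max(len(motion_by_tile), 1)
    if avg < 0.05:
        return 0            # almost static: interpolation adds little
    if avg < 0.4:
        return max_frames   # moderate motion: smooth with extra frames
    return 1                # large motion: interpolate conservatively

print(frames_to_interpolate({0: 0.02, 1: 0.03}))   # 0
print(frames_to_interpolate({0: 0.2, 1: 0.3}))     # 2, as in FIG. 18b
print(frames_to_interpolate({0: 0.8, 1: 0.9}))     # 1, as in FIG. 18a
```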
  • FIG. 19 illustrates a flowchart for a method of processing data in a digital device according to one embodiment of the present disclosure.
  • a digital device receives application data (S1902) and performs a tile rendering on a first image frame of the received application data (S1904).
  • the digital device generates motion prediction data of an object included in the first image frame from edge data of the tile-rendered first image frame (S1906) and determines a rendering priority for tiles of a second image frame based on the generated motion prediction data of the object (S1908). Subsequently, the digital device renders the tiles of the second image frame based on the determined tile rendering priority (S1910).
  • the digital device generates a third image frame by synthesizing some of the rendered tiles among the tiles of the second image frame and the tiles of the rendering completed first image frame together (S1912). And, the digital device displays the image frames (S1914).
  • the third image frame may be displayed ahead of the second image frame.
  • the third image frame may be displayed between the first image frame and the second image frame. If the number of the tiles of the rendered second image frame exceeds a threshold, the third image frame can be generated by a synthesis with the tiles of the rendered first image frame on the basis of a tile location.
  • the edge data can be obtained using pixel data of the rendered first image frame.
  • the motion prediction data of the object can be obtained using edge histogram data recorded by tile index unit of each image frame.
  • the method may further include the step of generating type data of an object, which includes a type and attribute of the object included in the first image frame, from the edge data of the rendered first image frame.
  • the type data of the object can be obtained from a result of the rendered tile by analyzing vertical edge data and horizontal edge data in a tile.
  • the type data of the object may include at least one of a solid figure, a complex figure and a text.
  • the threshold is determined, and according to the determined threshold, the number of third frames to render between the first image frame and the second image frame can be determined.
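  • The flow of FIG. 19 (steps S1902 to S1914) can be made concrete with the runnable sketch below. Every helper is a trivial stand-in for the units described earlier (edge detector, priority calculator, GPU, compositor); the names and bodies are not taken from the disclosure and only serve to show the control flow.

```python
# Compact, runnable sketch of the FIG. 19 flow. All helpers are placeholder
# stand-ins; tiles are modeled as dicts {tile index: pixel value} for brevity.

def render_all_tiles(frame):                 # S1904 / completion of S1910
    return {i: v for i, v in enumerate(frame)}

def motion_prediction(rendered):             # S1906: stand-in edge/motion analysis
    return {i: (v % 10) / 10.0 for i, v in rendered.items()}

def tile_priority(motion):                   # S1908: higher motion rendered first
    return sorted(motion, key=lambda i: motion[i], reverse=True)

def render_partial(frame, order, count=2):   # S1910: render only the first tiles
    return {i: frame[i] for i in order[:count]}

def synthesize(prev_rendered, partial):      # S1912: overwrite tiles onto Frame N
    out = dict(prev_rendered)
    out.update(partial)
    return out

frames = [[1, 2, 3, 4], [5, 6, 7, 8]]        # S1902: received application data
first = render_all_tiles(frames[0])
order = tile_priority(motion_prediction(first))
partial = render_partial(frames[1], order)
third = synthesize(first, partial)           # interpolated (third) image frame
second = render_all_tiles(frames[1])
for f in (first, third, second):             # S1914: display first, third, second
    print(f)
```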
  • FIG. 20 illustrates a block diagram for image processing configuration of a digital device for processing application data according to one embodiment of the present disclosure
  • FIG. 21 illustrates a diagram to describe details of a processing procedure between a control unit 2020 and a tile order selecting unit 2030 shown in FIG. 20
  • FIG. 22 illustrates a diagram to describe a tile rendering sequence according to one embodiment of the present disclosure
  • FIG. 23 illustrates a diagram to describe a method of synthesizing an intermediate image frame according to one embodiment of the present disclosure.
  • power consumption of an image processing unit varies in response to a workload increment/decrement of the image processing unit according to a frame rate for processing an image frame of an application.
  • a frame rate increases, a power consumption amount of the image processing unit increases.
  • the capacity, power consumption, temperature, and the like of the hardware should be considered, and these considerations become limitations.
  • Insufficient hardware capacity causes a flickering or lagging effect.
  • the hardware capacity improvement of a device is not the only solution and the above problem may not be fully solved due to other causes.
  • if a digital device can provide a visual quality sufficient to satisfy a user while the frame rate is lowered, the workload on an image processing unit can be reduced. Through this, power consumption can be saved.
  • improving the hardware capacity of an image processing unit brings accompanying problems or considerations such as increased power consumption. Therefore, the present disclosure intends to improve the user's perceived performance through the FRUC method in software, i.e., a tile rendering scheme, leaving aside considerations for the hardware part.
  • the present disclosure uses a tile rendering scheme.
  • an image frame is processed in a manner of dividing the image frame by a unit of a predetermined number of tiles and processing the image frame by tile units instead of rendering a whole image frame at a time.
  • efficiency of memory bandwidth, data caching or the like can be raised.
  • an image frame is rendered within an image processing unit up to a final pixel by tile unit and is then saved to a frame buffer assigned to an external memory (e.g. , DRAM).
  • An FRUC method, i.e., a tile rendering method, is basically a method of interpolating or inserting (hereinafter called ‘interpolating’) at least one image frame newly generated on the basis of tile-rendered data between an image frame (Frame N) and an image frame (Frame N+1).
  • for the interpolating, there is a method of using a motion vector, an edge detection, application profile data, FRUC management data, or the like in the course of performing the image frame tile rendering processing.
  • the present disclosure takes a method of using the application profile data or the FRUC management data for example.
  • FIG. 20 and FIG. 21 show the processing components necessary for the rendering of image frames in a digital device with respect to the present disclosure. Some of the components shown in the drawings may be omitted or modularized by being merged with other components, or component(s) failing to be shown in the drawings may be further included.
  • an image processing unit of a digital device is configured to include a graphic processing unit 2010, a control unit 2020, a tile order selecting unit 2030, a synthesizing unit 2040 and the like.
  • the graphic processing unit 2010 includes an application, a graphics library, a driver, a GPU (hardware) and the like, thereby image-processing, and more particularly, rendering image frames of the application by tile units.
  • after rendering an image frame (Frame N), the control unit 2020 transmits, to the tile order selecting unit 2030, a control command for a tile order selection necessary for rendering a next image frame (Frame N+1).
  • the control unit 2020 transmits preset FRUC management setup data to the tile order selecting unit 2030 in a manner that the preset FRUC management setup data is included in the control command.
  • the tile order selecting unit 2030 reads the FRUC management setup data by parsing the received control command and also reads application profile data from an application profile database based on currently run application information.
  • the tile order selecting unit 2030 selects the order of tiles to render from the image frame (Frame N+1) based on at least one of the read FRUC management setup data, the currently run application information, the application profile data, and the like, and then sets it at the GPU of the graphic processing unit.
  • the GPU renders the tiles of the image frame (Frame N+1) under the control of the tile order selecting unit 2030.
  • the tile order selecting unit 2030 determines at least one tile rendering order 2150 among a scanline-order, a Z-order, a spiral-sequence, an eye-tracking-sequence, and the like, and then sets the determined at least one tile rendering order 2150 at the graphic processing unit 2010.
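  • A hedged sketch of the tile order selection: the control command carries the FRUC management setup data, the application profile database is consulted for the currently run application, and one of the supported tile rendering orders is chosen and handed to the graphic processing unit. The field names, the profile contents, and the selection rule are assumptions made only for illustration.

```python
# Illustrative tile order selection. The control command fields, the profile
# database contents, and the selection rule are assumptions for this sketch;
# the disclosure only states that the order is chosen from such inputs.
APP_PROFILE_DB = {                      # assumed application profile database
    "game_app": {"preferred_order": "z-order"},
    "reader_app": {"preferred_order": "scanline"},
}

def select_tile_order(control_command, app_name):
    setup = control_command.get("fruc_setup", {})
    if not setup.get("enabled", True):
        return "scanline"               # FRUC disabled: plain scanline order
    if setup.get("eye_tracking_available"):
        return "eye-tracking"
    profile = APP_PROFILE_DB.get(app_name, {})
    return profile.get("preferred_order", "spiral")

cmd = {"fruc_setup": {"enabled": True, "eye_tracking_available": False}}
print(select_tile_order(cmd, "game_app"))   # z-order under the assumed profile
```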
  • FIG. 22a shows the scanline-order as one example of the tile rendering sequence
  • FIG. 22b shows the Z-order as one example of the tile rendering sequence
  • FIG. 22c shows the spiral-sequence as one example of the tile rendering sequence
  • FIG. 22d shows the eye-tracking-sequence as one example of the tile rendering sequence, by which the present disclosure is non-limited. For example, other sequences can be used for the tile rendering.
  • all image frames in a single application can be image-processed according to a prescribed one of tile rendering sequences shown in FIGs. 22a to 22d, or a tile rendering sequence different from that applied in an existing image frame may be applied in each image frame or a specific image frame. So to speak, in processing image frames of a single application, one or more tile rendering sequences are applicable.
  • a tile rendering sequence can be determined to match a scanline-order of an image frame.
  • tiles of an image frame are rendered from an upper row to a lower row by a row unit, and more particularly, the tiles can be rendered from a left column to a right column within a row.
  • a tile rendering sequence can render tiles of an image frame in Z-order.
  • tiles belonging to a first column and a second column and a first row and a second row (1st column - 2nd column, 1st row - 2nd row) are rendered to match Z-order
  • tiles are rendered in order of (3rd column - 4th column, 1st row - 2nd row), (1st column - 2nd column, 3rd row - 4th row), (3rd column - 4th column, 3rd row - 4th row), and (5th column - 6th column, 1st row - 2nd row), and tiles for (5th column - 6th column, 3rd row - 4th row) can be finally rendered.
  • the order of the columns and rows may be determined to be different from the aforementioned order for example. For instance, a rendering can be first performed on (1st column - 2nd column, 3rd row - 4th row) instead of (3rd column - 4th column, 1st row - 2nd row). Meanwhile, referring to FIG. 22b, the Z-order always starts with a left upper part and ends at a right lower part in two columns and two rows, and vice versa.
  • a tile rendering sequence can render tiles in the spiral-order, as shown in FIG. 22c, with reference to a specific point or tile of an image frame for example.
  • a tile rendering sequence can render tiles according to user’s indication points or tiles sensed by an eye-tracking sensor, as shown in FIG. 22d, for example.
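  • Of the sequences above, the Z-order can be generated with the standard Morton-code construction, as in the sketch below; the 6x4 grid size is an assumption matching the FIG. 22b style layout.

```python
# Runnable sketch of the Z-order (Morton order) tile sequence: 2x2 blocks are
# visited from the upper-left, as described for FIG. 22b. The grid size is an
# assumption; the bit interleaving is the standard Morton-code construction.
def morton_key(col, row):
    key = 0
    for bit in range(16):
        key |= ((col >> bit) & 1) << (2 * bit)       # column bits on even positions
        key |= ((row >> bit) & 1) << (2 * bit + 1)   # row bits on odd positions
    return key

def z_order_tiles(cols, rows):
    tiles = [(c, r) for r in range(rows) for c in range(cols)]
    return sorted(tiles, key=lambda t: morton_key(t[0], t[1]))

# 6 columns x 4 rows of tiles; print the first eight (column, row) pairs.
for col, row in z_order_tiles(6, 4)[:8]:
    print(col, row)
```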
  • the GPU renders an image frame according to the set tile rendering sequence.
  • the tiles rendered according to the rendering order or the rendering sequence are saved to a tile buffer temporarily and then saved to the frame buffer.
  • the synthesizing unit 2040 generates an intermediate image frame, i.e., a new image frame (Frame N+0.5), by receiving, from the GPU, the tile data of an image frame (Frame N) for which rendering has been completed for all tiles and some tile data among all tile data of a currently rendered image frame (Frame N+1), and then synthesizing the received data under the control of the control unit 2020.
  • corresponding tiles are overwritten on the previous image frame (Frame N) 2310 based on the tile indexes of the tiles 2320 rendered in the currently rendered image frame (Frame N+1).
  • the control unit 2020 determines whether the number of tiles rendered for a currently rendered image frame (Frame N+1) exceeds a predetermined threshold. As a result of the determination, if the number exceeds the threshold, the control unit 2020 controls the synthesizing unit 2040 to perform the corresponding synthesis.
  • FIG. 24 illustrates a block diagram for image processing configuration of a digital device for processing application data according to another embodiment of the present disclosure.
  • FIG. 24 illustrates a diagram of a configuration of a digital device for rendering an image frame, obtained by adding an edge detecting method to the digital device shown in FIG. 20.
  • in order to render an image frame, the digital device further includes an edge data processing unit 2410 and a tile data processing unit 2420.
  • the edge data processing unit 2410 includes an edge detector 2412, a motion predicting unit 2414, an object sorting unit 2416 and the like.
  • the tile data processing unit 2420 includes a tile priority calculating unit 2422, a tile rendering priority updating unit 2424, and the like, which are related to the edge data processing unit 2410, as well as the tile order selecting unit 2030 shown in FIG. 21.
  • the edge data processing unit 2410 predicts a variation between image frames of an application, configuration information and the like using edge data of the image frame (Frame N) rendered by the graphic processing unit 2010, generates data of the prediction, and then forwards it to the control unit 2020 or/and the tile data processing unit 2420.
  • the edge detector 2412 generates the edge data through an edge detecting from the image frame (Frame N) rendered by the graphic processing unit 2010. For instance, a single image frame can be divided into 9 tiles Tile #1 to Tile #9 for clarity of the description, by which the number of the tiles is non-limited.
  • the edge detector 2412 performs a pixel-based edge detection by sequentially receiving an input of pixel data for each tile, starting with Tile #1 of the rendered image frame. In this case, the image frame is rendered by tile units.
  • the edge detector 2412 can perform an edge detection (FIFO: first input first output) without delay by first processing first rendered tile result data (from Tile #1). Yet, it may be able to extract an edge by a plurality of tile units using a buffer in accordance with a system environment.
  • the edge detector 2412 saves the edge data to a motion histogram database per tile index.
  • the motion predicting unit 2414 generates motion prediction data by predicting a variation between image frames based on the generated edge data.
  • the motion prediction data is represented as a normalized value, which can be expressed as a value between 0 and 1 depending on a presence or non-presence of a motion. This is for clarity of the description and is non-limited by such a numerical value or scheme.
  • the motion predicting unit 2414 predicts a motion variation by tile units using a histogram of an edge recorded per tile index in the motion histogram storage unit.
  • An extent of a change per frame region is predicted in a manner of comparing a relative difference between an edge variation detected from a tile of an Nth frame and a tile having the same index of a previous N-1th image frame and a variation histogram of a tile of the corresponding index in each image frame.
  • the object sorting unit 2416 generates type information by sorting configuration information of an object in an image frame, i.e., a type of the object based on the generated edge data and/or the generated motion prediction data.
  • the object sorting unit 2416 sorts a result of a rendered tile into a type (solid figure, a complex figure, a text, etc.) by analyzing a vertical and/or horizontal edge information quantity in a tile.
  • Such type data may be generated by a tile unit. Meanwhile, the sorted information is saved to a database (DB) (not shown), which can be referred to for the tile priority determination.
  • the tile data processing unit 2420 determines priorities of tiles, which are to be rendered, in an image frame (Frame N+1) by receiving the motion prediction data generated by the motion predicting unit 2414 and/or the type information generated by the object sorting unit 2416. To this end, the tile data processing unit 2420 includes a tile rendering priority calculating unit 2422, a tile rendering priority updating unit 2424 and the like.
  • the tile rendering priority calculating unit 2422 calculates rendering priorities of tiles in the image frame (Frame N+1) based on the received motion prediction data, the received type information and the like.
  • the tile rendering priority updating unit 2424 compares the tile rendering priority data of the image frame (Frame N+1) calculated by the tile rendering priority calculating unit 2422 to pre-generated tile rendering priority data. If there is a difference, the tile rendering priority updating unit 2424 updates the tile rendering priority data. The updated tile rendering priority data is forwarded to the GPU so as to perform a rendering on the tiles of the image frame (Frame N+1).
  • the GPU keeps saving the tile data, which is rendered according to the tile rendering priority, to the tile buffer. If the rendering of all tiles in the image frame (Frame N+1) is completed, the tile buffer saves the corresponding data to the frame buffer by forwarding it to the frame buffer.
  • the synthesizing unit 2040 generates a new image frame (Frame N+0.5) by synthesizing the data of the previous image frame (Frame N) saved to the frame buffer with the tile data of the image frame (Frame N+1) saved to the tile buffer and then saves the generated frame to the frame buffer.
  • control unit 2020 may include a CPU of the digital device or an image processing control component configured separately.
  • control unit 2020 can control the related processing of the corresponding component by selecting at least one of an edge detecting scheme and a tile order selecting scheme. For instance, the same schemes may be applied to all frames in an application, or different schemes may be applied to specific image frames, under the control of the control unit 2020.
  • FIG. 25 illustrates a diagram to describe a method of processing image frames in accordance with a tile rendering sequence according to one embodiment of the present disclosure.
  • FIG. 25a shows image frames (30 fps) rendered by the GPU of the image processing unit.
  • FIG. 25b shows a case of a normal display (30 fps).
  • FIG. 25c shows a case of an HFF display (60 fps) according to the present disclosure.
  • the Z-order tile rendering sequence shown in FIG. 21b is taken as one example for the description.
  • tiles of an image frame (Frame N) are rendered.
  • the tiles of the corresponding image frame are rendered by the Z-order scheme.
  • the control unit determines whether the number of the tile data of the image frame (Frame N+1) exceeds a predetermined threshold by accessing the tile buffer. As a result of the determination, if the number of the tile data saved to the tile buffer exceeds the threshold, the control unit controls the synthesizing unit to synthesize a new image frame (Frame N+0.5).
  • the synthesizing unit synthesizes the image frame (Frame N+0.5).
  • the synthesized image frame (Frame N+0.5) is displayed between the image frame (Frame N) and the image frame (Frame N+1).
  • the threshold may or may not be equal in each image frame.
  • an image frame synthesis may be performed according to a threshold different from that of another image frame.
  • the threshold may be set by the control unit or the like for example.
  • not only the threshold but also the tile rendering sequence may be changed per image frame.
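  • The threshold-based triggering described in the preceding items can be sketched as a simple control-unit callback; the callback interface below is an assumption introduced only for illustration.

```python
def on_tile_rendered(tile_buffer, threshold, synthesize):
    """Control-unit check performed whenever a rendered tile of frame N+1 is
    stored in the tile buffer: once the number of stored tiles exceeds the
    threshold, the synthesizing unit is asked to produce frame N+0.5.
    Both the threshold and the tile rendering sequence may be chosen per
    image frame (illustrative interface)."""
    if len(tile_buffer) > threshold:
        return synthesize(tile_buffer)   # new interpolated frame (Frame N+0.5)
    return None                          # keep rendering frame N+1
```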
  • FIG. 26 illustrates a diagram to describe a method of synthesizing an interpolated image frame using tile(s) preferentially processed in a corresponding image frame (Frame N+1) and a previous image frame (Frame N) in the course of rendering according to one embodiment of the present disclosure.
  • Every image frame is processed by being subdivided into a plurality of tile units. And, each processing-completed tile is saved to a frame buffer. If the total number of processed tiles of the (N+1)th image frame reaches a threshold defined in the system, these tiles are synthesized with data of the Nth image frame without a rendering delay of the (N+1)th image frame.
  • If a frame variation between tiles of the same index is equal to or greater than a predetermined rate, the synthesis for the corresponding tile may be skipped. For instance, this reduces artifacts due to rapid lighting or motion changes.
  • Seam artifacts, such as a visible cutting plane caused by an abrupt difference (e.g., in color or motion) at a tile boundary interface, may occur.
  • To mitigate this, various kinds of interpolation functions can be applied, e.g., the saw-tooth composition method shown in FIG. 26b, the blending method shown in FIG. 26c, and the like.
  • Such a function can be selected or released according to adaptive information.
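  • As one possible sketch of boundary blending, a linear ramp across a narrow strip at the tile boundary is shown below; it merely stands in for the blending method of FIG. 26c, and the band width is an illustrative assumption.

```python
import numpy as np

def blend_tile_boundary(old_region, new_region, band=8, axis=1):
    """Blend the frame-N data into a newly rendered frame-(N+1) tile over a
    `band`-pixel strip along one boundary so that seams are softened. A simple
    linear ramp stands in for the blending method of FIG. 26c (illustrative)."""
    out = new_region.astype(np.float32)
    ramp = np.linspace(0.0, 1.0, band, dtype=np.float32)  # 0 = old data, 1 = new data
    if axis == 1:   # vertical seam along the left edge of the tile
        out[:, :band] = (1.0 - ramp) * old_region[:, :band] + ramp * new_region[:, :band]
    else:           # horizontal seam along the top edge of the tile
        r = ramp[:, None]
        out[:band, :] = (1.0 - r) * old_region[:band, :] + r * new_region[:band, :]
    return out
```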
  • FIG. 27 illustrates a diagram to describe an image frame interpolating method according to one embodiment of the present disclosure.
  • Referring to FIG. 27, power consumption can be reduced through an image frame interpolation according to the present disclosure. For instance, under the V-sync restriction, if the frame rate of the image processing unit (GPU) exceeds 60 fps, as shown in FIG. 27a, the digital device can reduce power consumption.
  • FIG. 28 illustrates a diagram to describe an image frame interpolating method according to another embodiment of the present disclosure.
  • FIG. 28 shows that QoS can be improved through an image frame interpolation according to the present disclosure. For instance, under the V-sync restriction, if the frame rate of the image processing unit is lower than 60 fps, as shown in FIG. 28a, the digital device can improve QoS.
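  • The two situations of FIG. 27 and FIG. 28 can be summarized by a simple illustrative policy; the specific rendered frame rates returned below are assumptions, not values prescribed by the disclosure.

```python
def interpolation_policy(gpu_fps, display_fps=60):
    """Illustrative policy under the V-sync restriction: when the GPU can
    exceed the display rate, render fewer real frames and interpolate the rest
    to save power (FIG. 27); when it falls short, interpolate to fill the gap
    and improve QoS (FIG. 28)."""
    if gpu_fps >= display_fps:
        return {"rendered_fps": display_fps // 2, "interpolate": True, "goal": "reduce_power"}
    return {"rendered_fps": gpu_fps, "interpolate": True, "goal": "improve_qos"}

print(interpolation_policy(75))   # render 30 real frames, interpolate the rest
print(interpolation_policy(45))   # render 45 real frames, interpolate toward 60
```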
  • FIG. 29 illustrates a diagram to describe the determination of the number of image frames to be interpolated between an image frame (Frame N) and an image frame (Frame N+1) on the basis of motion prediction data according to one embodiment of the present disclosure.
  • FIG. 29a shows a case where a single image frame 2930 is interpolated between an image frame (Frame N) 2910 and an image frame (Frame N+1) 2920.
  • FIG. 29b shows a case where two image frames 2940 and 2950 are interpolated between an image frame (Frame N) 2910 and an image frame (Frame N+1) 2920.
  • the number of image frames to be interpolated may be set constant in advance or may be changed between image frames, for example.
  • the number of the image frames to be interpolated can be adaptively determined.
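  • One possible adaptive rule is sketched below; the linear mapping from the motion prediction value to the number of interpolated frames and the cap of three frames are illustrative assumptions.

```python
def frames_to_interpolate(motion_score, max_frames=3):
    """Map the normalized motion prediction value (0..1) for the region between
    frame N and frame N+1 to a number of interpolated frames; the linear
    mapping and the cap of three frames are illustrative assumptions."""
    return max(1, min(max_frames, round(motion_score * max_frames)))

print(frames_to_interpolate(0.2))  # 1 frame, as in FIG. 29a
print(frames_to_interpolate(0.7))  # 2 frames, as in FIG. 29b
```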
  • FIG. 30 illustrates a flowchart for a method of processing data in a digital device according to one embodiment of the present disclosure.
  • a digital device receives application data (S3002) and sets a tile rendering sequence for image frames of the application data (S3004).
  • the digital device renders each tile of a first image frame according to the set tile rendering sequence (S3006) and then generates an interpolated image frame by synthesizing, based on a tile index, tile data of some rendered tiles among all the tiles of the first image frame with a previously rendering-completed image frame (S3008).
  • the digital device displays the previously rendering-completed image frame, the interpolated image frame and the first image frame (S3010).
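  • The overall flow of FIG. 30 can be sketched as follows; all function arguments and the `tile_order` method are assumed interfaces introduced only to illustrate the sequence of steps S3002 to S3010.

```python
def process_application_data(frames, render_tile, synthesize, display,
                             sequence="z-order", threshold=4):
    """Illustrative end-to-end flow of FIG. 30 (S3002-S3010): for each image
    frame of the received application data, render its tiles in the chosen
    sequence, synthesize an interpolated frame once enough tiles are ready,
    and display the result. All callables are assumed interfaces."""
    previous = None
    for frame in frames:                              # S3002: application data received
        tile_buffer = {}
        for idx in frame.tile_order(sequence):        # S3004: set tile rendering sequence
            tile_buffer[idx] = render_tile(frame, idx)            # S3006: render each tile
            if previous is not None and len(tile_buffer) == threshold:
                display(synthesize(previous, tile_buffer))        # S3008: interpolated frame
        display(frame)                                            # S3010: display the frame
        previous = frame
```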
  • the interpolated image frame can be displayed ahead of the first image frame.
  • the interpolated image frame can be displayed between the previously rendering-completed image frame and the first image frame. If the number of the rendered tiles among all tiles of the first image frame exceeds a preset threshold, the interpolated image frame can be generated in a manner of being synthesized with tiles of the previously rendering-completed image frame based on a tile index.
  • the set tile rendering sequence can refer to at least one of FRUC management data, information on a currently running application and profile data of the application, and may include at least one of a scanline order, a Z-order, a spiral order and an eye-tracking sequence.
  • the set tile rendering sequence may differ per application or per image frame.
  • the set tile rendering sequence can refer to tile rendering priority data based on at least one of object motion prediction data and object type data derived from edge data of the previously rendering-completed image frame.
  • the previously rendering-completed image frame, the interpolated image frame and the first image frame can be displayed sequentially. And, a plurality of the interpolated image frames can be generated between the previously rendering-completed image frame and the first image frame.
  • QoS of the application data and the like can thus be maintained or improved.
  • Performance such as QoS can be maintained or improved by processing the application data in software, thereby improving the overall environment (e.g., hardware capacity, power, temperature, etc.) of a device.
  • a cost increase due to hardware can be minimized by maintaining or improving application data processing performance in software, without improving hardware capacity or adding components. In addition, a user's desire to purchase the device can be increased by improving the user's satisfaction with the device.
  • an improved visual quality can be provided to a user with low power, low capacity and the like.
  • QoS can be improved in a high-specification application that requires capacity beyond the hardware capacity limit of the image processing unit.
  • Environmental limitations on hardware (H/W), such as thermal and power limitations, can be relaxed, as can limitations on CPU performance, memory bandwidth and associated devices such as a display. In this case, QoS can be secured even though the load on the image processing unit, the CPU, the memory or the like is lowered.
  • a digital device and data processing method therein disclosed in the present disclosure may be non-limited by the configurations and methods of the embodiments mentioned in the foregoing description. And, the embodiments mentioned in the foregoing description can be configured in a manner of being selectively combined with one another entirely or in part to enable various modifications.
  • a digital device operating method disclosed in the present disclosure can be implemented in a program recorded medium as processor-readable codes.
  • the processor-readable media may include all kinds of recording devices in which data readable by a processor are stored.
  • the processor-readable media may include ROM, RAM, CD-ROM, magnetic tapes, floppy discs, optical data storage devices, and the like, and also include carrier-wave type implementations such as transmission via the Internet.
  • the recording medium readable by a processor may be distributed over computer systems connected to a network, so that the processor-readable codes can be saved and executed in a distributed manner.
  • the present invention relates to a digital device and a data processing method therein, and is applicable to various digital apparatuses.

Abstract

The invention relates to a digital device and a data processing method therefor. The present invention comprises a receiving unit configured to receive application data, an image processing unit configured to render, in tile form, a first image frame and a second image frame of the received application data, a control unit configured to generate motion prediction data of an object included in the first image frame from edge data of the first image frame rendered in tile form, to determine a rendering priority for tiles of the second image frame on the basis of the generated motion prediction data of the object, to render the tiles of the second image frame on the basis of the determined rendering priority, and to generate and render a third image frame by synthesizing some rendered tiles among the tiles of the second image frame with tiles of the rendering-completed first image frame, and a display unit configured to display the rendered image frames.
PCT/KR2016/009075 2015-08-20 2016-08-18 Dispositif numérique et procédé de traitement de données associé WO2017030380A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR1020150117162A KR20170022333A (ko) 2015-08-20 2015-08-20 디지털 디바이스 및 상기 디지털 디바이스에서 데이터 처리 방법
KR1020150117163A KR20170022334A (ko) 2015-08-20 2015-08-20 디지털 디바이스 및 상기 디지털 디바이스에서 데이터 처리 방법
KR10-2015-0117163 2015-08-20
KR10-2015-0117162 2015-08-20

Publications (1)

Publication Number Publication Date
WO2017030380A1 true WO2017030380A1 (fr) 2017-02-23

Family

ID=58050808

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/009075 WO2017030380A1 (fr) 2015-08-20 2016-08-18 Dispositif numérique et procédé de traitement de données associé

Country Status (1)

Country Link
WO (1) WO2017030380A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110570502A (zh) * 2019-08-05 2019-12-13 北京字节跳动网络技术有限公司 显示图像帧的方法、装置、电子设备和计算机可读存储介质
CN115225940A (zh) * 2021-04-15 2022-10-21 青岛海信宽带多媒体技术有限公司 一种机顶盒及机顶盒页面显示方法
US11523135B2 (en) 2018-04-09 2022-12-06 Nokia Technologies Oy Apparatus, a method and a computer program for volumetric video

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6438275B1 (en) * 1999-04-21 2002-08-20 Intel Corporation Method for motion compensated frame rate upsampling based on piecewise affine warping
US6594313B1 (en) * 1998-12-23 2003-07-15 Intel Corporation Increased video playback framerate in low bit-rate video applications
US20090147853A1 (en) * 2007-12-10 2009-06-11 Qualcomm Incorporated Resource-adaptive video interpolation or extrapolation
US20140168240A1 (en) * 2012-12-18 2014-06-19 Motorola Mobility Llc Methods and systems for overriding graphics commands
US20140294320A1 (en) * 2013-03-29 2014-10-02 Anil Kokaram Pull frame interpolation

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6594313B1 (en) * 1998-12-23 2003-07-15 Intel Corporation Increased video playback framerate in low bit-rate video applications
US6438275B1 (en) * 1999-04-21 2002-08-20 Intel Corporation Method for motion compensated frame rate upsampling based on piecewise affine warping
US20090147853A1 (en) * 2007-12-10 2009-06-11 Qualcomm Incorporated Resource-adaptive video interpolation or extrapolation
US20140168240A1 (en) * 2012-12-18 2014-06-19 Motorola Mobility Llc Methods and systems for overriding graphics commands
US20140294320A1 (en) * 2013-03-29 2014-10-02 Anil Kokaram Pull frame interpolation

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11523135B2 (en) 2018-04-09 2022-12-06 Nokia Technologies Oy Apparatus, a method and a computer program for volumetric video
CN110570502A (zh) * 2019-08-05 2019-12-13 北京字节跳动网络技术有限公司 显示图像帧的方法、装置、电子设备和计算机可读存储介质
CN115225940A (zh) * 2021-04-15 2022-10-21 青岛海信宽带多媒体技术有限公司 一种机顶盒及机顶盒页面显示方法
CN115225940B (zh) * 2021-04-15 2023-07-28 青岛海信宽带多媒体技术有限公司 一种机顶盒及机顶盒页面显示方法

Similar Documents

Publication Publication Date Title
WO2016143965A1 (fr) Dispositif d'affichage, et procédé de commande correspondant
WO2016085094A1 (fr) Dispositif multimédia et procédé de commande associé
WO2012046928A1 (fr) Procédé de production de contenu publicitaire utilisant un dispositif d'affichage, et dispositif d'affichage à cet effet
WO2017003022A1 (fr) Dispositif d'affichage et son procédé de commande
WO2015099343A1 (fr) Dispositif numérique et son procédé de commande
WO2012026651A1 (fr) Procédé de synchronisation de contenus et dispositif d'affichage permettant le procédé
WO2016027933A1 (fr) Dispositif numérique et son procédé de commande
WO2014209053A1 (fr) Dispositif numérique et procédé de traitement de ses données de service
WO2016186254A1 (fr) Panneau d'affichage et son procédé de commande
WO2012015116A1 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
WO2014137200A1 (fr) Terminal mobile, et procédé de commande associé
WO2011126202A1 (fr) Appareil d'affichage d'image et son procédé d'utilisation
WO2016085070A1 (fr) Système de commande de dispositif, dispositif numérique, et procédé de commande pour commander ledit dispositif
WO2012015117A1 (fr) Procédé pour faire fonctionner un appareil d'affichage d'image
WO2012081803A1 (fr) Procédé de fourniture d'un menu d'applications dans un dispositif d'affichage d'images et dispositif d'affichage d'images selon celui-ci
WO2017135585A2 (fr) Haut-parleur principal, haut-parleur secondaire et système comprenant ceux-ci
WO2017061793A1 (fr) Dispositif numérique et procédé de traitement des données par celui-ci
WO2017034065A1 (fr) Dispositif d'affichage et son procédé de commande
WO2016104907A1 (fr) Dispositif numérique, et procédé de traitement de données par le même dispositif numérique
WO2017018737A1 (fr) Dispositif numérique, et procédé de traitement de données dans le dispositif numérique
WO2012030055A1 (fr) Appareil d'affichage d'image et procédé d'affichage d'image associé
WO2012074189A1 (fr) Procédé de commande d'affichage sur écran et dispositif d'affichage d'image l'utilisant
WO2012030025A1 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
EP2652959A1 (fr) Télévision réseau traitant plusieurs applications et procédé destiné à commander cette télévision
WO2017200215A1 (fr) Terminal mobile et procédé de commande de celui-ci

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16837322

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16837322

Country of ref document: EP

Kind code of ref document: A1