WO2015130024A1 - Digital device and control method thereof - Google Patents

Digital device and control method thereof

Info

Publication number
WO2015130024A1
Authority
WO
WIPO (PCT)
Prior art keywords
application
image
service
signal
MVPD
Prior art date
Application number
PCT/KR2015/001108
Other languages
English (en)
Korean (ko)
Inventor
강상우
임강희
임창욱
장동헌
박찬진
김범준
한상철
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020140131936A external-priority patent/KR102224486B1/ko
Priority claimed from KR1020140131935A external-priority patent/KR102268751B1/ko
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to US 15/121,931 (US10063923B2)
Publication of WO2015130024A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 - End-user applications
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/443 - OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N 21/4432 - Powering on the client, e.g. bootstrap loading using setup parameters being stored locally or received from the server

Definitions

  • The present invention relates to a digital device, and more particularly to a digital device, and a control method therefor, that processes a service such as an MVPD (Multichannel Video Programming Distributor) service on a webOS platform: after a video output application is executed, a first image control application that controls the video output application with minimal functionality is executed, and then a second image control application that controls the video output application with full functionality is executed, without the user noticing the switch.
  • the conventional digital TV also supports an MVPD (Multichannel Video Programming Distributor) service.
  • In the conventional MVPD service, the service provider embeds everything in a set-top box (STB). In digital devices, by contrast, software embedded in the device replaces the function of the existing hardware set-top box. For this reason, in conventional digital devices it is difficult for the user to access the MVPD service, and the load imposed by supporting the MVPD service increases service loading and initialization time such as booting, causing inconvenience to the user.
  • The present invention has been made to solve the above situation or problems, and one object of the present invention is to preferentially boot a first image control application that provides a minimal image control function when the user turns on the device.
  • Another object of the present invention is to boot a second image control application that provides various image control functions in the background while providing a minimum image control function in the first image control application.
  • Another object of the present invention is to perform resource management so that the first image control application can boot faster while providing a minimum image control function.
  • Another object of the present invention is to support and process an RF- and/or IP-based Multichannel Video Programming Distributor (MVPD) service in a digital device equipped with a webOS platform.
  • Another object of the present invention is to provide convenience of access and use by offering the MVPD service in a form similar to external inputs such as High Definition Multimedia Interface (HDMI), Component, and the like.
  • Another object of the present invention is to provide the MVPD service with performance equal to or better than a conventional MVPD service or hardware set-top box(es).
  • Still another object of the present invention is to provide a service seamlessly in the case of booting or the like in supporting an MVPD service on the webOS platform.
  • According to one embodiment, a method of controlling a digital device may include: receiving a power-on signal; determining whether the application that was executing at the power-off point, before the power-on signal was received, is an image output application; if it is, loading and executing the image output application; loading and executing a first image control application for controlling the image output through the image output application; loading a second image control application; and, when the loading is completed, terminating the first image control application and executing the second image control application.
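As a rough sketch, the claimed boot sequence can be modeled as an ordered list of actions; all names below are illustrative stand-ins, not identifiers from the actual implementation.

```python
def boot_steps(last_app_at_power_off):
    """Return the ordered boot actions taken after a power-on signal,
    following the claimed control method (illustrative sketch only)."""
    steps = ["receive power-on signal"]
    if last_app_at_power_off == "image_output_application":
        steps += [
            "load and execute image output application",
            "load and execute first image control application (minimal functions)",
            "load second image control application in background",
            "terminate first image control application",
            "execute second image control application (full functions)",
        ]
    else:
        steps.append("boot normally")  # the claim only details the image-output case
    return steps
```

The point of the ordering is that the lightweight first control application becomes usable before the heavyweight second one has finished loading.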
  • A method of processing an MVPD service in a digital device may include: storing data about the last input application in the EIM; identifying, in a TV service processor, whether the device is in MVPD mode and transferring the identification result to an MVPD service processor; identifying, by the MVPD service processor, the MVPD service type; launching an MVPD application according to the identified MVPD service type; registering the MVPD application together with other external inputs in the input-hub list so that it is output in the input list at boot or reboot; and providing the MVPD service on the screen.
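The steps above can be sketched as a single function; the EIM store and the input hub are modeled as plain Python containers, and every name is hypothetical.

```python
def process_mvpd_service(last_input_app, mvpd_mode, service_type, input_hub):
    """Illustrative model of the claimed MVPD processing steps."""
    eim_store = {"last_input": last_input_app}   # 1. store last input app in the EIM
    if not mvpd_mode:                            # 2. TV service identifies MVPD mode
        return eim_store, input_hub, None
    launched = f"mvpd_app:{service_type}"        # 3-4. identify type, launch MVPD app
    if "MVPD" not in input_hub:                  # 5. register alongside external inputs
        input_hub.append("MVPD")                 #    so it appears in the input list
    return eim_store, input_hub, launched        # 6. service is provided on screen
```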
  • According to one embodiment, a digital device includes a user interface unit for receiving an input signal from a user, a broadcast service module for receiving a broadcast signal, a receiver for receiving an external input signal, a display module for displaying at least one application, and a controller for controlling operation of the digital device. The controller may be designed to receive a power-on signal, determine whether the application that was executing at the power-off point before the power-on signal was received is an image output application, and, if it is, load and execute the image output application, load and execute a first image control application for controlling the image output through the image output application, load a second image control application, and, when the loading is completed, terminate the first image control application and execute the second image control application.
  • A digital device for processing an MVPD service may include: an EIM processor configured to store data about the last input application; a TV service processor for identifying the MVPD mode and transferring the identification result to an MVPD service processor; and an MVPD service processor for identifying the MVPD service type, launching an MVPD application according to the identified type, registering it together with other external inputs in the input-hub list, outputting the MVPD application in the input list at boot or reboot, and providing the MVPD service on the screen.
  • According to the present invention, a first image control application that provides a minimal image control function may be preferentially booted when the user turns on the device.
  • A technical effect is that a second image control application providing various image control functions can be booted in the background while the first image control application provides the minimal image control function.
  • An RF- and/or IP-based MVPD service may be supported and processed in a digital device equipped with a webOS platform.
  • The MVPD service may be provided in a form similar to external inputs such as HDMI, Component, and the like, providing convenience of access and use.
  • FIG. 1 is a view for schematically illustrating a service system including a digital device according to an embodiment of the present invention
  • FIG. 2 is a block diagram illustrating a digital device according to one embodiment of the present invention.
  • FIG. 3 is a block diagram illustrating a digital device according to another embodiment of the present invention.
  • FIG. 4 is a block diagram illustrating a digital device according to another embodiment of the present invention.
  • FIG. 5 is a block diagram illustrating a detailed configuration of the controller of FIGS. 2 to 4 according to one embodiment of the present invention.
  • FIG. 6 illustrates input means connected with the digital device of FIGS. 2 to 4 according to one embodiment of the present invention.
  • FIG. 7 illustrates a webOS architecture according to an embodiment of the present invention.
  • FIG. 8 is a diagram for explaining the architecture of a Web OS device according to one embodiment of the present invention.
  • FIG. 9 is a diagram illustrating a graphic composition flow in a Web OS device according to one embodiment of the present invention.
  • FIG. 10 is a diagram illustrating a media server according to one embodiment of the present invention.
  • FIG. 11 is a block diagram illustrating a configuration of a media server according to one embodiment of the present invention.
  • FIG. 12 is a diagram illustrating a relationship between a media server and a TV service according to an embodiment of the present invention
  • FIG. 13 is a block diagram illustrating in detail a configuration module of a digital device according to another embodiment of the present invention.
  • FIG. 14 is a diagram for describing a booting mode of a digital device according to one embodiment of the present invention.
  • FIG. 15 illustrates an example in which a digital device executes a first image control application and a second image control application according to an embodiment of the present disclosure.
  • FIG. 16 is a diagram for describing an image control function provided by a digital device according to one embodiment of the present invention.
  • FIG. 17 illustrates a control function performed by a digital device through a first image control application according to an embodiment of the present disclosure.
  • FIG. 18 is another diagram illustrating a control function performed by a digital device through a first image control application according to one embodiment of the present invention.
  • FIG. 19 illustrates a control function performed by a digital device through a second image control application according to an embodiment of the present disclosure.
  • FIG. 20 is another diagram for describing a control function performed by a digital device through a second image control application according to one embodiment of the present invention.
  • FIGS. 21 and 22 are views for explaining an example in which a digital device receives a power-on signal and determines the image output application to execute first, according to an embodiment of the present disclosure.
  • FIG. 23 is a diagram for explaining an example in which a digital device downloads a second image control application from an external server or the like according to an embodiment of the present disclosure.
  • FIG. 24 is a flowchart illustrating a control method of a digital device according to one embodiment of the present invention.
  • FIG. 25 is a block diagram illustrating a configuration for recognizing and processing an RF-based MVPD service as an input mode in a device according to an embodiment of the present invention.
  • FIG. 26 is a sequence diagram for processing an MVPD service according to an embodiment of the present invention.
  • FIG. 27 is a block diagram illustrating a MVPD service processing module according to an embodiment of the present invention.
  • FIG. 28 illustrates a seamless process when the input mode is MVPD according to an embodiment of the present invention.
  • FIGS. 29 and 30 illustrate sequence diagrams for last-input processing associated with a seamless change according to one embodiment of the present invention.
  • FIG. 31 illustrates a processing block diagram for a seamless change according to an embodiment of the present invention.
  • FIG. 32 is a process sequence diagram made in the processing block of FIG. 31.
  • FIG. 35 is a diagram for explaining a method of recognizing an MVPD as an input mode according to an embodiment of the present invention.
  • FIG. 36 illustrates a UX including MVPD in the input mode according to FIG. 35 described above.
  • FIG. 37 is a view illustrating bootd (a boot daemon) according to an embodiment of the present invention.
  • FIG. 38 illustrates a block diagram for recognizing and processing an IP-based MVPD service as an input mode in a device according to an embodiment of the present invention
  • FIG. 39 illustrates a media framework for an IP-based MVPD service according to an embodiment of the present invention
  • FIG. 40 is a diagram for explaining a media framework according to another embodiment of the present invention.
  • FIG. 41 is a block diagram illustrating a main configuration for providing an IP-based MVPD service according to an embodiment of the present invention.
  • FIGS. 42 and 43 are block diagrams illustrating a main configuration for providing an IP-based MVPD service according to another embodiment of the present invention.
  • The term “digital device” described herein includes every device that performs at least one of transmitting, receiving, processing, and outputting data such as content, services, applications, and the like.
  • The digital device may be paired or connected (hereinafter, 'paired') with another digital device, an external server, or the like through a wired/wireless network, and may transmit/receive predetermined data through it. At this time, if necessary, the data may be appropriately converted before transmission/reception.
  • The digital device includes, for example, standing devices such as a network TV, a hybrid broadcast broadband TV (HbbTV), a smart TV, an internet protocol television (IPTV), and a personal computer (PC), and mobile devices such as a personal digital assistant (PDA), a smart phone, a tablet PC, and a notebook.
  • To aid understanding of the present invention and for convenience of description, a digital TV is illustrated in FIG. 2 and a mobile device in FIG. 3, which are described later.
  • The digital device described herein may be a configuration having only a panel, or may be a set configuration such as a set-top box (STB), a device, or a system.
  • The wired/wireless network refers to a communication network supporting various communication standards or protocols for pairing and/or transmitting and receiving data between digital devices or between a digital device and an external server.
  • Such wired/wireless networks include all communication networks that are supported now or will be supported in the future by the standard, and can support one or more communication protocols for them.
  • These wired/wireless networks may be formed, for example, by networks for wired connection, such as Universal Serial Bus (USB), Composite Video Banking Sync (CVBS), Component, S-Video (analog), Digital Visual Interface (DVI), High Definition Multimedia Interface (HDMI), RGB, and D-SUB, and the communication standards or protocols therefor, and by networks for wireless connection, such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Digital Living Network Alliance (DLNA), Wireless LAN (WLAN, Wi-Fi), Wireless broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), Long Term Evolution / LTE-Advanced (LTE/LTE-A), and Wi-Fi Direct, and the communication standards or protocols therefor.
  • When this specification refers simply to a digital device, the term may mean a fixed device or a mobile device depending on the context, and is used to include both unless specifically stated otherwise.
  • The digital device is an intelligent device that supports, for example, a broadcast receiving function, a computer function, and at least one external input, and can support e-mail, web browsing, banking, games, applications, and the like through the above-described wired/wireless network.
  • The digital device may include an interface for supporting at least one input or control means (hereinafter, 'input means') such as a handwriting input device, a touch screen, or a spatial remote controller.
  • The digital device may use a standardized general-purpose operating system (OS); in particular, the digital device described in the present specification uses webOS. Accordingly, the digital device can add, delete, amend, and update various services or applications on a general-purpose OS kernel or a Linux kernel, through which a more user-friendly environment can be constructed and provided.
  • the above-described digital device may receive and process an external input.
  • The external input includes any external input device, that is, any device that is connected to the above-described digital device through a wired/wireless network, transmits/receives data through it, and has that data processed by the digital device; it functions as an input means to the digital device.
  • Examples of such external inputs include a High-Definition Multimedia Interface (HDMI) source, a game device such as a PlayStation or an Xbox, a smartphone, a tablet PC, a pocket photo printer, and digital devices such as printing devices, smart TVs, and Blu-ray devices.
  • The term 'server' refers to a digital device or system that supplies data to, or receives data from, the above-mentioned digital device, that is, a client; it may also be referred to as a processor.
  • Examples of such servers include a portal server providing a web page, web content, or web service; an advertising server providing advertising data; a content server providing content; an SNS server providing a social network service (SNS); a service server provided by a manufacturer; a multichannel video programming distributor (MVPD) providing a video on demand (VOD) or streaming service; and a service server providing a pay service.
  • When this specification refers to an application, the term may include not only an application but also a service, depending on the context.
  • FIG. 1 is a diagram schematically illustrating a service system including a digital device according to an embodiment of the present invention.
  • Referring to FIG. 1, the service system includes a content provider 10, a service provider 20, a network provider 30, and a home network end user (HNED) 40.
  • The HNED 40 comprises, for example, a client 100, that is, a digital device according to the present invention.
  • The content provider 10 produces and provides various content. As shown in FIG. 1, such content providers may include a terrestrial broadcast sender, a cable system operator (SO) or multiple system operator (MSO), a satellite broadcast sender, various Internet broadcast senders, individual content providers, and the like. The content provider 10 may also produce and provide various services, applications, and the like in addition to broadcast content.
  • the service provider 20 service packetizes the content produced by the content provider 10 and provides it to the HNED 40.
  • The service provider 20 may package for service at least one of the contents produced by a first terrestrial broadcaster, a second terrestrial broadcaster, a cable MSO, a satellite broadcaster, various Internet broadcasters, an application, and the like, and provide it to the HNED 40.
  • the service provider 20 provides a service to the client 100 in a uni-cast or multi-cast manner.
  • The service provider 20 may transmit data to a plurality of pre-registered clients 100; for this purpose, it may use the Internet Group Management Protocol (IGMP).
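For multicast delivery of this kind, a client typically signals group membership via IGMP by joining an IPv4 multicast group; the operating system then emits the IGMP membership report on the client's behalf. A minimal sketch (the group address and port are made-up examples):

```python
import socket
import struct

def igmp_mreq(group: str) -> bytes:
    """Build the 8-byte ip_mreq structure: the multicast group address
    followed by the local interface address (INADDR_ANY)."""
    return struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))

def join_multicast(group: str, port: int) -> socket.socket:
    """Open a UDP socket bound to `port` and join `group`; the kernel
    sends the IGMP membership report for us."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    igmp_mreq(group))
    return sock
```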
  • the content provider 10 and the service provider 20 described above may be the same entity.
  • That is, the content produced by the content provider 10 may be packaged as a service and provided to the HNED 40, so that the content provider 10 also performs the functions of the service provider 20, or vice versa.
  • the network provider 30 provides a network for data exchange between the content provider 10 or / and the service provider 20 and the client 100.
  • The client 100 may establish a home network and receive data through the network provider 30, and may also transmit/receive data for various services or applications such as VoD and streaming.
  • the content provider 10 and / or the service provider 20 in the service system may use conditional access or content protection means to protect the transmitted content.
  • The client 100 may use processing means such as a CableCARD (or Point of Deployment, POD) or a downloadable conditional access system (DCAS) in response to such conditional access or content protection.
  • the client 100 may also use a bidirectional service through a network. Accordingly, the client 100 may perform a role or function of a content provider, and the service provider 20 may receive it and transmit it to another client.
  • the content provider 10 and / or the service provider 20 may be a server that provides a service described later herein.
  • the server may mean owning or including the network provider 30 as necessary.
  • The service or service data includes not only services or applications received from the outside as described above, but also internal services or applications; these may mean service or application data for the webOS-based client 100.
  • FIG. 2 is a block diagram illustrating a digital device according to an embodiment of the present invention.
  • the digital device described herein corresponds to the client 100 of FIG. 1.
  • The digital device 200 includes a network interface 201, a TCP/IP manager 202, a service delivery manager 203, an SI decoder 204, a demultiplexer (demux) 205, an audio decoder 206, a video decoder 207, a display A/V and OSD module 208, a service control manager 209, a service discovery manager 210, an SI & metadata DB 211, a metadata manager 212, a service manager 213, and a UI manager 214.
  • The network interface unit 201 may transmit/receive IP packets (Internet Protocol packets) or IP datagrams (hereinafter, IP packet(s)) through an access network. For example, the network interface unit 201 may receive services, applications, content, and the like from the service provider 20 of FIG. 1 through the network.
  • The TCP/IP manager 202 is involved in packet delivery between a source and a destination for the IP packets received by the digital device 200 and the IP packets transmitted by the digital device 200.
  • The TCP/IP manager 202 classifies the received packet(s) according to the appropriate protocol, and outputs the classified packet(s) to the service delivery manager 203, the service discovery manager 210, the service control manager 209, or the metadata manager 212.
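The classification can be pictured as a simple protocol-to-manager dispatch table; the mapping below is an illustrative reading of the description, not the actual implementation.

```python
# Illustrative routing of received packets by protocol, modeled on the
# TCP/IP manager's classification described above.
ROUTE = {
    "RTP":  "service delivery manager",   # streaming media payloads
    "RTCP": "service delivery manager",   # reception-quality feedback
    "IGMP": "service control manager",    # live (multicast) service selection
    "RTSP": "service control manager",    # VOD selection / trick play
}

def route_packet(protocol: str) -> str:
    """Return which manager handles a packet of the given protocol,
    or 'drop' if no manager is registered for it."""
    return ROUTE.get(protocol.upper(), "drop")
```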
  • the service delivery manager 203 is in charge of controlling the received service data.
  • the service delivery manager 203 may use RTP / RTCP when controlling real-time streaming data.
  • The service delivery manager 203 parses the received data packets according to RTP and transmits them to the demultiplexer 205, or stores them under the control of the service manager 213.
  • the service delivery manager 203 feeds back the network reception information to a server that provides a service using RTCP.
  • The demultiplexer 205 demultiplexes the received packets into audio data, video data, SI (System Information) data, and the like, and transmits them to the audio/video decoders 206/207 and the SI decoder 204, respectively.
  • The SI decoder 204 decodes demultiplexed SI data, that is, service information such as Program Specific Information (PSI), Program and System Information Protocol (PSIP), Digital Video Broadcasting - Service Information (DVB-SI), and Digital Television Terrestrial Multimedia Broadcasting / China Mobile Multimedia Broadcasting (DTMB/CMMB).
  • the SI decoder 204 may store the decoded service information in the SI & metadata database 211. The stored service information may be read and used by the corresponding configuration, for example, at the request of a user.
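The store-then-read-on-demand behavior of the SI & metadata database 211 can be modeled with a trivial key-value class (purely illustrative; the real database also holds metadata and service-discovery information):

```python
class SIMetadataDB:
    """Tiny stand-in for the SI & metadata database 211: the SI decoder
    writes decoded service information, and other components read it
    back on demand (e.g. at a user's request)."""

    def __init__(self):
        self._store = {}

    def put(self, table: str, info) -> None:
        """Store decoded service information under a table name."""
        self._store[table] = info

    def get(self, table: str):
        """Read stored information back, or None if absent."""
        return self._store.get(table)
```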
  • the audio / video decoder 206/207 decodes each demultiplexed audio data and video data.
  • the decoded audio data and video data are provided to the user through the display unit 208.
  • the application manager may include, for example, the UI manager 214 and the service manager 213 and perform a control function of the digital device 200.
  • the application manager may manage the overall state of the digital device 200, provide a user interface (UI), and manage other managers.
  • The UI manager 214 provides a graphical user interface (GUI)/UI for the user by using an on-screen display (OSD) and the like, and receives key input from the user to perform a device operation according to the input. For example, the UI manager 214 transmits the key input signal to the service manager 213 when it receives a key input related to channel selection from the user.
  • the service manager 213 controls a manager associated with a service such as a service delivery manager 203, a service discovery manager 210, a service control manager 209, and a metadata manager 212.
  • the service manager 213 generates a channel map and controls the channel selection using the generated channel map according to the key input received from the UI manager 214.
  • the service manager 213 receives service information from the SI decoder 204 and sets the audio / video packet identifier (PID) of the selected channel to the demultiplexer 205.
  • The PID set in this way is used in the demultiplexing process described above: the demultiplexer 205 filters the audio data, video data, and SI data using the PID (PID or section filtering).
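Concretely, each 188-byte MPEG transport-stream packet carries a 13-bit PID in its header, and PID filtering keeps only the packets whose PID the service manager selected. A minimal sketch (simplified packets, for illustration only):

```python
def ts_pid(packet: bytes) -> int:
    """Extract the 13-bit PID from an MPEG-TS packet header:
    byte 0 is the 0x47 sync byte, and the PID spans bytes 1-2."""
    assert packet[0] == 0x47, "lost sync"
    return ((packet[1] & 0x1F) << 8) | packet[2]

def pid_filter(packets, selected_pids):
    """PID filtering as performed by the demultiplexer 205: pass only
    packets carrying the audio/video/SI PIDs set by the service manager."""
    return [p for p in packets if ts_pid(p) in selected_pids]
```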
  • the service discovery manager 210 provides information necessary to select a service provider that provides a service. Upon receiving a signal regarding channel selection from the service manager 213, the service discovery manager 210 searches for a service using the information.
  • The service control manager 209 is responsible for selecting and controlling services. For example, the service control manager 209 uses IGMP or RTSP when the user selects a live broadcasting service in the conventional broadcasting manner, and uses RTSP when the user selects a service such as VOD.
  • the RTSP protocol may provide a trick mode for real time streaming.
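RTSP's request/response form makes trick modes straightforward to request, e.g. a PLAY with a Scale header for fast playback. A minimal request builder (the URL and header values are made up for illustration):

```python
def rtsp_request(method: str, url: str, cseq: int, headers=None) -> str:
    """Build a minimal RTSP/1.0 request: request line, CSeq, optional
    extra headers (e.g. Scale for trick-mode playback), blank line."""
    lines = [f"{method} {url} RTSP/1.0", f"CSeq: {cseq}"]
    for key, value in (headers or {}).items():
        lines.append(f"{key}: {value}")
    return "\r\n".join(lines) + "\r\n\r\n"
```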
  • the service control manager 209 may initialize and manage a session through the IMS gateway 250 using an IP Multimedia Subsystem (IMS) or a Session Initiation Protocol (SIP).
  • the protocols are one embodiment, and other protocols may be used depending on implementation.
  • the metadata manager 212 manages metadata associated with the service and stores the metadata in the SI & metadata database 211.
  • the SI & metadata database 211 stores service information decoded by the SI decoder 204, metadata managed by the metadata manager 212, and the information necessary to select a service provider, which is provided by the service discovery manager 210.
  • the SI & metadata database 211 can store set-up data and the like for the system.
  • the SI & metadata database 211 may be implemented using non-volatile memory (NVRAM), flash memory, or the like.
  • NVRAM non-volatile memory
  • the IMS gateway 250 is a gateway that collects functions necessary for accessing an IMS-based IPTV service.
  • FIG. 3 is a block diagram illustrating a digital device according to another embodiment of the present invention.
  • FIG. 3 illustrates a mobile device as another embodiment of the digital device.
  • the mobile device 300 may include a wireless communication unit 310, an A / V input unit 320, a user input unit 330, a sensing unit 340, an output unit 350, a memory 360, an interface unit 370, a controller 380, a power supply unit 390, and the like.
  • the wireless communication unit 310 may include one or more modules that enable wireless communication between the mobile device 300 and the wireless communication system or between the mobile device and the network in which the mobile device is located.
  • the wireless communication unit 310 may include a broadcast receiving module 311, a mobile communication module 312, a wireless internet module 313, a short range communication module 314, a location information module 315, and the like.
  • the broadcast receiving module 311 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • the broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal.
  • the broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
  • the broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
  • the broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 312.
  • the broadcast related information may exist in various forms, for example, in the form of an electronic program guide (EPG) or an electronic service guide (ESG).
  • EPG electronic program guide
  • ESG electronic service guide
  • the broadcast receiving module 311 may receive digital broadcast signals using a digital broadcasting system such as ATSC, Digital Video Broadcasting-Terrestrial (DVB-T), DVB-Satellite (DVB-S), Media Forward Link Only (MediaFLO), DVB-Handheld (DVB-H), or Integrated Services Digital Broadcast-Terrestrial (ISDB-T).
  • the broadcast receiving module 311 may be configured to be suitable for not only the above-described digital broadcasting system but also other broadcasting systems.
  • the broadcast signal and / or broadcast related information received through the broadcast receiving module 311 may be stored in the memory 360.
  • the mobile communication module 312 transmits and receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include various types of data according to transmission and reception of a voice signal, a video call signal, or a text / multimedia message.
  • the wireless internet module 313 may include a module for wireless internet access and may be embedded or external to the mobile device 300.
  • Wireless Internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (Wibro), World Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA), and the like.
  • the short range communication module 314 refers to a module for short range communication.
  • Short range communication technologies may include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, RS-232, and RS-485.
  • the location information module 315 is a module for acquiring location information of the mobile device 300; a representative example is a Global Positioning System (GPS) module.
  • GPS Global Positioning System
  • the A / V input unit 320 is for inputting an audio or video signal, and may include a camera 321 and a microphone 322.
  • the camera 321 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the imaging mode.
  • the processed image frame may be displayed on the display unit 351.
  • the image frame processed by the camera 321 may be stored in the memory 360 or transmitted to the outside through the wireless communication unit 310. Two or more cameras 321 may be provided depending on the use environment.
  • the microphone 322 receives an external sound signal by a microphone in a call mode, a recording mode, a voice recognition mode, etc., and processes the external sound signal into electrical voice data.
  • the processed voice data may be converted into a form transmittable to the mobile communication base station through the mobile communication module 312 and output in the call mode.
  • the microphone 322 may be implemented with various noise removing algorithms for removing noise generated in the process of receiving an external sound signal.
  • the user input unit 330 generates input data for the user to control the operation of the terminal.
  • the user input unit 330 may include a key pad, a dome switch, a touch pad (constant voltage / capacitance), a jog wheel, a jog switch, and the like.
  • the sensing unit 340 may determine the current state of the mobile device 300 such as an open / closed state of the mobile device 300, a location of the mobile device 300, presence or absence of user contact, orientation of the mobile device, acceleration / deceleration of the mobile device, and the like.
  • the sensing unit generates a sensing signal for controlling the operation of the mobile device 300. For example, when the mobile device 300 is moved or tilted, the position or tilt of the mobile device may be sensed. Also, whether the power supply unit 390 is supplied with power or whether the interface unit 370 is coupled to an external device may be sensed.
  • the sensing unit 340 may include a proximity sensor 341 including near field communication (NFC).
  • the output unit 350 generates output related to visual, auditory, or tactile senses, and may include a display unit 351, a sound output module 352, an alarm unit 353, a haptic module 354, and the like.
  • the display unit 351 displays (outputs) information processed by the mobile device 300. For example, when the mobile device is in the call mode, the UI or GUI related to the call is displayed. When the mobile device 300 is in a video call mode or a photographing mode, the mobile device 300 displays a photographed and / or received image, a UI, or a GUI.
  • the display unit 351 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display.
  • LCD liquid crystal display
  • TFT LCD thin film transistor-liquid crystal display
  • OLED organic light-emitting diode
  • Some of these displays may be configured as a transparent or light-transmissive type so that the outside can be seen through them. This may be referred to as a transparent display.
  • a representative example of the transparent display is the transparent OLED (TOLED).
  • the rear structure of the display unit 351 may also be configured as a light transmissive structure. With this structure, the user can see the object located behind the terminal body through the area occupied by the display unit 351 of the terminal body.
  • two or more display units 351 may exist.
  • a plurality of display units may be spaced apart or integrally disposed on one surface of the mobile device 300, or may be disposed on different surfaces.
  • when the display unit 351 and a sensor for detecting a touch operation (hereinafter referred to as a "touch sensor") form a mutual layer structure (hereinafter referred to as a "touch screen"), the display unit 351 may be used as an input device in addition to an output device.
  • the touch sensor may have, for example, a form of a touch film, a touch sheet, a touch pad, or the like.
  • the touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 351 or capacitance generated at a specific portion of the display unit 351 into an electrical input signal.
  • the touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.
  • when there is a touch input to the touch sensor, the corresponding signal(s) are sent to the touch controller.
  • the touch controller processes the signal (s) and then transmits the corresponding data to the controller 380.
  • the controller 380 may determine which area of the display unit 351 is touched.
  • the proximity sensor 341 may be disposed in an inner region of the mobile device surrounded by the touch screen or near the touch screen.
  • the proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays.
  • Proximity sensors have a longer life and higher utilization than touch sensors.
  • the proximity sensor examples include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • when the touch screen is capacitive, it is configured to detect the proximity of the pointer by a change in the electric field according to the proximity of the pointer.
  • the touch screen may be classified as a proximity sensor.
  • the act of bringing the pointer close to the touch screen without contact, so that the pointer is recognized as being located on the touch screen, is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch".
  • the position at which a proximity touch is made by the pointer on the touch screen refers to the position at which the pointer vertically opposes the touch screen when the pointer makes the proximity touch.
  • the proximity sensor detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state).
  • Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.
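The proximity/contact distinction above can be illustrated with a minimal sketch (hypothetical Python; the detection threshold is an assumed value, not taken from the disclosure). The sensor reports the pointer's height above the screen, and the event is classified accordingly:

```python
# Illustrative classification of pointer events into "proximity touch" and
# "contact touch". The maximum detection height is an assumed value.

PROXIMITY_THRESHOLD_MM = 10.0  # assumed detection range of the proximity sensor

def classify_touch(height_mm: float):
    """Classify a pointer hovering height_mm above the touch screen."""
    if height_mm <= 0.0:
        return "contact touch"    # pointer actually contacts the screen
    elif height_mm <= PROXIMITY_THRESHOLD_MM:
        # the proximity-touch position is the point on the screen
        # vertically below the pointer
        return "proximity touch"
    else:
        return None               # pointer not detected
```

A fuller implementation would also track the proximity touch pattern (distance, direction, speed, time, position, movement state) over successive samples.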
  • the sound output module 352 may output audio data received from the wireless communication unit 310 or stored in the memory 360 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
  • the sound output module 352 may output a sound signal related to a function (eg, a call signal reception sound, a message reception sound, etc.) performed by the mobile device 300.
  • the sound output module 352 may include a receiver, a speaker, a buzzer, and the like.
  • the alarm unit 353 outputs a signal for notifying occurrence of an event of the mobile device 300. Examples of events occurring in the mobile device include call signal reception, message reception, key signal input, and touch input.
  • the alarm unit 353 may output a signal for notifying the occurrence of an event by vibration, in addition to a video signal or an audio signal.
  • the video signal or the audio signal may also be output through the display unit 351 or the sound output module 352, so that the display unit 351 and the sound output module 352 may be classified as part of the alarm unit 353.
  • the haptic module 354 generates various tactile effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 354.
  • the intensity and pattern of vibration generated by the haptic module 354 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.
  • in addition to vibration, the haptic module 354 may generate various tactile effects, such as a pin array moving vertically against the contacted skin surface, a jetting or suction force of air through a jetting or suction port, grazing of the skin surface, contact of an electrode, an electrostatic force, and an effect of reproducing a cold or warm sensation using an element capable of absorbing or generating heat.
  • the haptic module 354 may not only deliver a haptic effect through direct contact, but may also be implemented so that the user can feel a haptic effect through muscle sensation of, for example, a finger or an arm.
  • the haptic module 354 may be provided with two or more according to the configuration aspect of the mobile device 300.
  • the memory 360 may store a program for the operation of the controller 380 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
  • the memory 360 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.
  • the memory 360 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • the mobile device 300 may operate in association with web storage that performs a storage function of the memory 360 on the Internet.
  • the interface unit 370 serves as a path to all external devices connected to the mobile device 300.
  • the interface unit 370 receives data from an external device, receives power, transfers the power to each component inside the mobile device 300, or transmits data within the mobile device 300 to the external device.
  • the interface unit 370 may include a wired / wireless headset port, an external charger port, a wired / wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input / output (I / O) port, a video I / O port, an earphone port, and the like.
  • the identification module is a chip that stores various types of information for authenticating the usage rights of the mobile device 300, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like.
  • a device equipped with an identification module (hereinafter referred to as an "identification device") may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal through a port.
  • the interface unit 370 may serve as a path through which power from a cradle is supplied to the mobile device 300, or as a path through which various command signals input by the user from the cradle are transmitted to the mobile device. Various command signals or power input from the cradle may operate as signals for recognizing that the mobile device is correctly mounted in the cradle.
  • the controller 380 typically controls the overall operation of the mobile device 300.
  • the controller 380 performs, for example, related control and processing for voice call, data communication, video call, and the like.
  • the controller 380 may include a multimedia module 381 for multimedia playback.
  • the multimedia module 381 may be implemented in the controller 380 or may be implemented separately from the controller 380.
  • the controller 380 may perform a pattern recognition process for recognizing a writing input or a drawing input performed on a touch-screen as a character and an image, respectively.
  • the power supply unit 390 receives an external power source and an internal power source under the control of the controller 380 to supply power for operation of each component.
  • Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.
  • the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions. In some cases, such embodiments may be implemented by the controller 380 itself.
  • ASICs application specific integrated circuits
  • DSPs digital signal processors
  • DSPDs digital signal processing devices
  • PLDs programmable logic devices
  • FPGAs field programmable gate arrays
  • embodiments such as the procedures and functions described herein may be implemented as separate software modules.
  • Each of the software modules may perform one or more functions and operations described herein.
  • Software code may be implemented in software applications written in a suitable programming language.
  • the software code may be stored in the memory 360 and executed by the controller 380.
  • FIG. 4 is a block diagram illustrating a digital device according to another embodiment of the present invention.
  • the digital device 400 may include a broadcast receiver 405, an external device interface unit 435, a storage unit 440, a user input interface unit 450, a controller 470, a display unit 480, an audio output unit 485, a power supply unit 490, and a photographing unit (not shown).
  • the broadcast receiver 405 may include at least one tuner 410, a demodulator 420, and a network interface unit 430. However, in some cases, the broadcast receiver 405 may include a tuner 410 and a demodulator 420, but may not include the network interface 430, or vice versa.
  • the broadcast receiver 405 may include a multiplexer and multiplex a signal demodulated by the demodulator 420 via the tuner 410 with a signal received through the network interface unit 430.
  • the broadcast receiver 405 may also include a demultiplexer to demultiplex the multiplexed signal, the demodulated signal, or the signal passed through the network interface unit 430.
  • the tuner 410 receives an RF broadcast signal by tuning a channel selected by a user or all previously stored channels among radio frequency (RF) broadcast signals received through an antenna.
  • the tuner 410 also converts the received RF broadcast signal into an intermediate frequency (IF) signal or a baseband signal.
  • IF intermediate frequency
  • if the received RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF); if it is an analog broadcast signal, it is converted into an analog baseband video or audio signal (CVBS / SIF). That is, the tuner 410 may process both digital and analog broadcast signals.
  • the analog baseband video or audio signal CVBS / SIF output from the tuner 410 may be directly input to the controller 470.
  • the tuner 410 may receive an RF broadcast signal of a single carrier or multiple carriers. Meanwhile, the tuner 410 may sequentially tune and receive the RF broadcast signals of all broadcast channels stored through a channel memory function among the RF broadcast signals received through the antenna, and convert them into intermediate frequency signals or baseband signals (DIFs).
  • DIF digital IF signal
  • the demodulator 420 may receive and demodulate the digital IF signal DIF converted by the tuner 410 and perform channel decoding.
  • the demodulator 420 may include, for example, a trellis decoder, a de-interleaver, and a Reed-Solomon decoder, or a convolutional decoder, a de-interleaver, and a Reed-Solomon decoder.
  • the demodulator 420 may output a stream signal TS after performing demodulation and channel decoding.
  • the stream signal may be a signal multiplexed with a video signal, an audio signal, or a data signal.
  • the stream signal may be an MPEG-2 Transport Stream (TS) multiplexed with an MPEG-2 standard video signal, a Dolby AC-3 standard audio signal, and the like.
  • TS MPEG-2 Transport Stream
  • the stream signal output from the demodulator 420 may be input to the controller 470.
  • the controller 470 may control demultiplexing, image / audio signal processing, and the like, control the output of the image through the display 480, and the audio output through the audio output unit 485.
  • the external device interface unit 435 provides an interfacing environment between the digital device 400 and various external devices.
  • the external device interface unit 435 may include an A / V input / output unit (not shown) or a wireless communication unit (not shown).
  • the external device interface unit 435 may be connected via wire or wirelessly to an external device such as a digital versatile disk (DVD) player, a Blu-ray player, a game device, a camera, a camcorder, a computer (laptop), a tablet PC, a smartphone, a Bluetooth device, or a cloud.
  • the external device interface unit 435 transmits a signal including data such as an image, video, and audio input through the connected external device to the controller 470 of the digital device.
  • Also, the external device interface unit 435 may output video, audio, or data signals processed by the controller 470 to the connected external device.
  • the external device interface unit 435 may further include an A / V input / output unit (not shown) or a wireless communication unit (not shown).
  • the A / V input / output unit may include a USB terminal, a Composite Video Banking Sync (CVBS) terminal, a component terminal, an S-video terminal (analog), a Digital Visual Interface (DVI) terminal, a High Definition Multimedia Interface (HDMI) terminal, an RGB terminal, a D-SUB terminal, and the like, so that video and audio signals of an external device can be input to the digital device 400.
  • the wireless communication unit may perform short range wireless communication with another digital device.
  • the digital device 400 may be networked with other digital devices according to a communication protocol such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, or Digital Living Network Alliance (DLNA).
  • RFID Radio Frequency Identification
  • IrDA Infrared Data Association
  • UWB Ultra Wideband
  • DLNA Digital Living Network Alliance
  • the external device interface unit 435 may be connected to a set-top box (STB) through at least one of the various terminals described above to perform input / output operations with the set-top box.
  • the external device interface unit 435 may receive an application or an application list in an adjacent external device and transmit the received application or application list to the controller 470 or the storage unit 440.
  • the network interface unit 430 provides an interface for connecting the digital device 400 to a wired / wireless network including an internet network.
  • the network interface unit 430 may include, for example, an Ethernet terminal for connection with a wired network, and may use communication standards such as Wireless LAN (WLAN, Wi-Fi), Wireless broadband (Wibro), World Interoperability for Microwave Access (Wimax), and High Speed Downlink Packet Access (HSDPA) for connection with a wireless network.
  • WLAN wireless LAN
  • the network interface unit 430 may transmit or receive data with another user or another digital device through the connected network or another network linked to the connected network.
  • For example, some content data stored in the digital device 400 may be transmitted to a user registered in advance in the digital device 400, or to a selected user or selected digital device among other users or other digital devices.
  • the network interface unit 430 may access a predetermined web page through a connected network or another network linked to the connected network. That is, by accessing a predetermined web page through the network, it is possible to send or receive data with the server.
  • content or data provided by a content provider or a network operator may be received. That is, content such as a movie, an advertisement, a game, a VOD, a broadcast signal, and related information provided from a content provider or a network provider may be received through a network.
  • the network interface unit 430 may select and receive a desired application from among applications that are open through the network.
  • the storage unit 440 may store a program for processing and controlling each signal in the controller 470, or may store a signal-processed video, audio, or data signal.
  • the storage unit 440 may perform a function for temporarily storing an image, audio, or data signal input from the external device interface unit 435 or the network interface unit 430.
  • the storage unit 440 may store information about a predetermined broadcast channel through a channel storage function.
  • the storage unit 440 may store an application or an application list input from the external device interface unit 435 or the network interface unit 430.
  • the storage unit 440 may store various platforms described below.
  • the storage unit 440 may include at least one type of storage medium among, for example, a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM (EEPROM, etc.).
  • the digital device 400 may reproduce and provide a content file (video file, still image file, music file, document file, application file, etc.) stored in the storage unit 440 to the user.
  • FIG. 4 illustrates an embodiment in which the storage unit 440 is provided separately from the control unit 470, but the present invention is not limited thereto. In other words, the storage unit 440 may be included in the control unit 470.
  • the user input interface unit 450 transmits a signal input by the user to the controller 470 or transmits a signal of the controller 470 to the user.
  • the user input interface unit 450 may receive and process a control signal for power on / off, channel selection, screen setting, etc. from the remote control device 500 according to various communication methods such as an RF communication method and an infrared (IR) communication method, or may transmit a control signal of the controller 470 to the remote control device 500.
  • the user input interface unit 450 may transmit a control signal input from a local key (not shown), such as a power key, a channel key, a volume key, and a set value, to the controller 470.
  • the user input interface unit 450 may transmit a control signal input from a sensing unit (not shown) that senses a user's gesture to the controller 470, or may transmit a signal of the controller 470 to the sensing unit (not shown).
  • the sensing unit may include a touch sensor, a voice sensor, a position sensor, an operation sensor, and the like.
  • the controller 470 may demultiplex a stream input through the tuner 410, the demodulator 420, or the external device interface unit 435, process the demultiplexed signals, and generate and output signals for video or audio output.
  • the image signal processed by the controller 470 may be input to the display unit 480 and displayed as an image corresponding to the image signal.
  • the image signal processed by the controller 470 may be input to the external output device through the external device interface 435.
  • the audio signal processed by the controller 470 may be output as sound through the audio output unit 485.
  • the voice signal processed by the controller 470 may be input to the external output device through the external device interface 435.
  • the controller 470 may include a demultiplexer, an image processor, and the like.
  • the controller 470 may control overall operations of the digital device 400.
  • the controller 470 may control the tuner 410 to control tuning of an RF broadcast corresponding to a channel selected by a user or a pre-stored channel.
  • the controller 470 may control the digital device 400 by a user command or an internal program input through the user input interface 450. In particular, it is possible to connect to the network so that the user can download the desired application or application list into the digital device 400.
  • the controller 470 controls the tuner 410 so that a signal of a channel selected according to a predetermined channel selection command received through the user input interface unit 450 is input, and processes the video, audio, or data signal of the selected channel.
  • the controller 470 allows the channel information selected by the user to be output through the display unit 480 or the audio output unit 485 together with the processed video or audio signal.
  • in addition, according to an external device image playback command received through the user input interface unit 450, the controller 470 may control a video or audio signal input from an external device, for example, a camera or a camcorder, through the external device interface unit 435 to be output through the display unit 480 or the audio output unit 485.
  • the controller 470 may control the display unit 480 to display an image, for example, a broadcast image input through the tuner 410, an external input image input through the external device interface unit 435, an image input through the network interface unit, or an image stored in the storage unit 440.
  • the image displayed on the display unit 480 may be a still image or a video, and may be a 2D image or a 3D image.
  • the controller 470 may control to reproduce the content.
  • the content may be content stored in the digital device 400, received broadcast content, or external input content input from the outside.
  • the content may be at least one of a broadcast image, an external input image, an audio file, a still image, a connected web screen, and a document file.
  • the controller 470 may control to display an application or a list of applications downloadable from the digital device 400 or from an external network.
• The controller 470 may control to install and run an application downloaded from an external network, along with various user interfaces. In addition, it may control an image related to an application being executed to be displayed on the display unit 480 according to a user selection.
  • a channel browsing processor may be further provided to generate a thumbnail image corresponding to the channel signal or the external input signal.
• The channel browsing processor receives a stream signal (TS) output from the demodulator 420 or a stream signal output from the external device interface 435, extracts an image from the input stream signal, and generates a thumbnail image.
• The generated thumbnail image may be input to the controller 470 as it is or after being encoded. The generated thumbnail image may also be encoded in a stream form and input to the controller 470.
  • the controller 470 may display a thumbnail list including a plurality of thumbnail images on the display unit 480 using the input thumbnail image. Meanwhile, the thumbnail images in the thumbnail list may be updated sequentially or simultaneously. Accordingly, the user can easily grasp the contents of the plurality of broadcast channels.
• The display unit 480 converts an image signal, a data signal, or an OSD signal processed by the controller 470, or an image signal, a data signal, etc. received from the external device interface unit 435, into R, G, and B signals, respectively, to generate a drive signal.
  • the display unit 480 may be a PDP, an LCD, an OLED, a flexible display, a 3D display, or the like.
  • the display unit 480 may be configured as a touch screen and used as an input device in addition to the output device.
• The audio output unit 485 receives a signal processed by the controller 470, for example, a stereo signal, a 3.1-channel signal, or a 5.1-channel signal, and outputs it as audio. The audio output unit 485 may be implemented as various types of speakers.
• A sensing unit (not shown) including at least one of a touch sensor, a voice sensor, a position sensor, and a motion sensor may be further provided in the digital device 400.
• The signal detected by the sensing unit (not shown) may be transmitted to the controller 470 through the user input interface unit 450.
  • a photographing unit (not shown) for photographing the user may be further provided. Image information photographed by a photographing unit (not shown) may be input to the controller 470.
• The controller 470 may detect a user's gesture using an image photographed by the photographing unit (not shown) or a signal detected by the sensing unit (not shown), individually or in combination.
• The power supply unit 490 supplies power throughout the digital device 400.
• In particular, the power supply unit 490 may supply power to the controller 470, which may be implemented in the form of a System on Chip (SoC), the display unit 480 for displaying an image, and the audio output unit 485 for audio output.
  • the power supply unit 490 may include a converter (not shown) for converting AC power into DC power.
• For example, when the display unit 480 is implemented as a liquid crystal panel including a plurality of backlight lamps, the power supply unit 490 may further include an inverter (not shown) capable of pulse width modulation (PWM) operation for variable-luminance or dimming driving.
  • the remote control device 500 transmits the user input to the user input interface unit 450.
• The remote control device 500 may use Bluetooth, Radio Frequency (RF) communication, Infrared (IR) communication, Ultra Wideband (UWB), ZigBee, or the like.
• In addition, the remote control device 500 may receive an image, audio, or data signal output from the user input interface unit 450, display it on the remote control device 500, or output sound or vibration.
  • the digital device 400 described above may be a digital broadcast receiver capable of processing a fixed or mobile ATSC or DVB digital broadcast signal.
• Meanwhile, the digital device according to the present invention may omit some of the illustrated components or, conversely, may further include components not illustrated.
• Unlike the above, the digital device may not include a tuner and a demodulator and may instead receive and play content through a network interface unit or an external device interface unit.
  • FIG. 5 is a block diagram illustrating a detailed configuration of the controller of FIGS. 2 to 4 according to an embodiment of the present invention.
• An example of the controller may include a demultiplexer 510, an image processor 520, an OSD generator 540, a mixer 550, a frame rate converter (FRC) 555, and a formatter 560.
• The controller may further include a voice processor (not shown) and a data processor (not shown).
  • the demultiplexer 510 demultiplexes an input stream.
• For example, the demultiplexer 510 may demultiplex an input MPEG-2 TS into video, audio, and data signals.
  • the stream signal input to the demultiplexer 510 may be a stream signal output from a tuner, a demodulator, or an external device interface unit.
• The image processor 520 performs image processing of the demultiplexed video signal. To this end, the image processor 520 may include a video decoder 525 and a scaler 535. The video decoder 525 decodes the demultiplexed video signal, and the scaler 535 scales the resolution of the decoded video signal so that the display unit can output it.
  • the image decoder 525 may support various standards.
• For example, the video decoder 525 performs the function of an MPEG-2 decoder when the video signal is encoded in the MPEG-2 standard, and can perform the function of an H.264 decoder when the video signal is encoded in the Digital Multimedia Broadcasting (DMB) method or the H.264 standard.
• Meanwhile, the video signal decoded by the image processor 520 is input to the mixer 550.
• The OSD generator 540 generates OSD data according to a user input or by itself. For example, the OSD generator 540 generates data for displaying various data in the form of graphics or text on the screen of the display unit 480 based on a control signal of the user input interface.
  • the generated OSD data includes various data such as a user interface screen of the digital device, various menu screens, widgets, icons, viewing rate information, and the like.
  • the OSD generator 540 may generate data for displaying broadcast information based on subtitles or EPGs of a broadcast image.
• The mixer 550 mixes the OSD data generated by the OSD generator 540 with the image signal processed by the image processor and provides the result to the formatter 560. Since the decoded video signal and the OSD data are mixed, the OSD is displayed overlaid on the broadcast video or the external input video.
  • the frame rate converter (FRC) 555 converts a frame rate of an input video.
  • the frame rate converter 555 may convert the frame rate of the input 60Hz image to have a frame rate of, for example, 120Hz or 240Hz according to the output frequency of the display unit.
• Various methods may exist for converting the frame rate. For example, when converting the frame rate from 60 Hz to 120 Hz, the frame rate converter 555 may insert the same first frame between the first frame and the second frame, or may insert a third frame predicted from the first frame and the second frame.
• As another example, when converting the frame rate from 60 Hz to 240 Hz, the frame rate converter 555 may insert three identical or predicted frames between existing frames. Meanwhile, when no separate frame conversion is performed, the frame rate converter 555 may be bypassed.
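As a non-limiting illustration (not the actual FRC hardware logic), the frame-insertion idea above can be sketched as follows; frames are represented as plain numbers, and the "predicted" frame is approximated by a simple average standing in for real motion-compensated prediction:

```javascript
// Sketch: double a frame sequence (e.g. 60 Hz -> 120 Hz) by inserting,
// between each pair of frames, either a copy of the first frame or a
// frame "predicted" from both (here, their average).
function convertFrameRate(frames, { predict = false } = {}) {
  const out = [];
  for (let i = 0; i < frames.length; i++) {
    out.push(frames[i]);
    if (i < frames.length - 1) {
      // Duplicate the current frame, or interpolate between neighbors.
      out.push(predict ? (frames[i] + frames[i + 1]) / 2 : frames[i]);
    }
  }
  return out;
}
```

With duplication, `[1, 3, 5]` becomes `[1, 1, 3, 3, 5]`; with prediction it becomes `[1, 2, 3, 4, 5]`. A 60 Hz to 240 Hz conversion would insert three such frames per gap instead of one.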
  • the formatter 560 changes the output of the input frame rate converter 555 to match the output format of the display unit.
• The formatter 560 may output R, G, and B data signals, and these R, G, and B data signals may be output as low voltage differential signals (LVDS) or mini-LVDS.
  • the formatter 560 may support a 3D service through the display by configuring the output in a 3D form according to the output format of the display.
  • the voice processing unit (not shown) in the controller may perform voice processing of the demultiplexed voice signal.
• The voice processor (not shown) may support processing of various audio formats. For example, even when a voice signal is encoded in a format such as MPEG-2, MPEG-4, AAC, HE-AAC, AC-3, or BSAC, it can be processed by providing a corresponding decoder.
• In addition, the voice processor (not shown) in the controller may process bass, treble, volume control, and the like.
  • the data processor in the control unit may perform data processing of the demultiplexed data signal.
  • the data processor may decode the demultiplexed data signal even when it is encoded.
• For example, the encoded data signal may be EPG information including broadcast information such as the start time and end time of a broadcast program broadcast on each channel.
  • each component may be integrated, added, or omitted according to the specifications of the digital device actually implemented. That is, as needed, two or more components may be combined into one component or one component may be subdivided into two or more components.
• The function performed in each block is for explaining an embodiment of the present invention, and the specific operation or device does not limit the scope of the present invention.
  • the digital device may be an image signal processing device that performs signal processing of an image stored in the device or an input image.
• Examples of such an image signal processing device may further include a set-top box (STB) excluding the display unit 480 and the audio output unit 485 shown in FIG. 4, the above-described DVD player, a Blu-ray player, a game device, a computer, and the like.
  • FIG. 6 is a diagram illustrating input means connected to the digital device of FIGS. 2 to 4 according to one embodiment of the present invention.
• To control the digital device 600, a front panel (not shown) or control means (input means) provided on the digital device 600 is used.
• The control means includes a remote controller 610, a keyboard 630, a pointing device 620, and a touch pad, which are user interface devices (UIDs) capable of wired and wireless communication implemented mainly for the purpose of controlling the digital device 600; a control means dedicated to an external input connected to the digital device 600 may also be included. In addition, the control means includes mobile devices, such as a smart phone and a tablet PC, that can control the digital device 600 through mode switching and the like, although their main purpose is not control of the digital device 600.
• In this specification, a pointing device is described as an embodiment, but the present invention is not limited thereto.
• The input means may employ, as necessary, at least one of communication protocols such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Digital Living Network Alliance (DLNA), and RS to communicate with the digital device.
  • the remote controller 610 refers to conventional input means equipped with various key buttons necessary for controlling the digital device 600.
• The pointing device 620 is equipped with a gyro sensor or the like to implement a pointer corresponding to the user's movement, pressure, rotation, etc. on the screen of the digital device 600, and transmits a predetermined control command to the digital device 600.
• Such a pointing device 620 may be called by various names, such as a magic remote controller or a magic controller.
• Since the digital device 600 is an intelligent integrated digital device that provides not only conventional broadcasting but also a variety of services such as a web browser, applications, and a social network service (SNS), it is not easy to implement such services with a conventional remote controller alone; thus, the keyboard 630 is implemented similarly to a PC keyboard to facilitate input of text.
• The control means such as the remote controller 610, the pointing device 620, and the keyboard 630 may be provided with a touch pad as necessary and used for more convenient and various control purposes such as text input, pointer movement, and enlargement or reduction of pictures or videos.
  • the digital device described in the present specification uses a Web OS as an OS and / or a platform.
  • a process such as a configuration or an algorithm based on Web OS may be performed by the controller of the above-described digital device.
• The controller here includes the controllers of FIGS. 2 to 5 and is used in a broad sense. Therefore, hereinafter, the hardware and components, including related software, firmware, and the like, for processing Web OS-based or related services, applications, content, etc. in the digital device are referred to as the controller.
• Such a Web OS-based platform is intended to enhance development independence and functional scalability by integrating services and applications based on, for example, a Luna-service bus, and can also increase application development productivity based on a web application framework. In addition, multi-tasking can be supported by efficiently utilizing system resources through Web OS processes and resource management.
• The Web OS platform described in this specification can be used not only for fixed devices such as PCs, TVs, and set-top boxes (STBs) but also for mobile devices such as mobile phones, smart phones, tablet PCs, notebooks, and wearable devices.
• The architecture of software for digital devices started as a monolithic structure based on conventional problem solving and market dependence: a single-process, closed product based on multi-threading technology that had difficulty accommodating external applications. Afterwards, aiming for new platform-based development, and pursuing cost innovation through chip-set replacement and efficient UI and external application development, it was layered and componentized into a three-layered structure with add-ons, a single-source product, and an add-on structure for open applications. More recently, the software architecture has been further developed to provide a modular architecture of functional units, a Web Open API (Application Programming Interface) for the eco-system, and a modular design for a native open API including a game engine; accordingly, it is generated as a multi-process structure based on a service structure.
  • FIG. 7 is a diagram illustrating a Web OS architecture according to an embodiment of the present invention.
• Referring to FIG. 7, the platform can be largely classified into a kernel, a Web OS core platform based on system libraries, applications, services, and the like.
  • the architecture of the Web OS platform is a layered structure, with the OS at the bottom layer, system library (s) at the next layer, and applications at the top.
• The lowest layer, the OS layer, may include a Linux kernel, with Linux included as the OS of the digital device. Above the OS layer, a BSP (Board Support Package)/HAL (Hardware Abstraction Layer) layer, a Web OS core modules layer, a service layer, a Luna-service bus layer, an Enyo framework/NDK/QT layer, and, at the top, the application layer are sequentially present.
• Meanwhile, some layers in the above-described Web OS layer structure may be omitted, and a plurality of layers may become one layer or, conversely, one layer may become a plurality of layers.
• The Web OS core modules layer may include a Luna Surface Manager (LSM) that manages surface windows and the like, a System & Application Manager (SAM) that manages the execution and performance states of applications, and a Web Application Manager (WAM) that manages web applications based on WebKit.
  • the LSM manages an application window displayed on the screen.
• The LSM manages the display hardware, provides a buffer for rendering content required by applications, and composites the rendering results of a plurality of applications and outputs them on the screen.
  • the SAM manages performance policies for various conditions of systems and applications.
• The WAM is based on the Enyo framework, through which the Web OS can treat applications as web applications.
• An application uses services through the Luna-service bus; a service can be newly registered on the bus, and an application can find and use the service that it needs.
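As a non-limiting illustration of this register-and-find pattern (a hypothetical in-memory sketch, not the actual luna-service bus API; the service name and method are invented for the example):

```javascript
// Minimal in-memory "bus": a service registers itself under a URI-like
// name, and an application looks the service up and calls a method on it.
class ServiceBus {
  constructor() { this.services = new Map(); }
  register(name, methods) { this.services.set(name, methods); }
  call(uri, payload) {
    const sep = uri.lastIndexOf('/');
    const name = uri.slice(0, sep);
    const method = uri.slice(sep + 1);
    const service = this.services.get(name);
    if (!service || !service[method]) throw new Error('service not found: ' + uri);
    return service[method](payload);
  }
}

const bus = new ServiceBus();
// A newly registered service (illustrative name, not a real luna URI).
bus.register('com.example.audio', {
  getVolume: () => ({ returnValue: true, volume: 11 }),
});
// An application finds and uses the service it needs.
const reply = bus.call('com.example.audio/getVolume', {});
```

Decoupling applications from services through a name on a shared bus is what lets services be registered and discovered independently of the applications that use them.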
  • the service layer may include services of various service levels, such as a TV service and a Web OS service.
  • the Web OS service may include a media server, Node.JS, and the like.
  • the Node.JS service supports, for example, JavaScript.
• The Web OS services can communicate over the bus with a Linux process that implements function logic. They can be divided into four parts: a TV process, a TV service migrated from an existing TV to the Web OS or differentiated by manufacturer, a Web OS common service, and a Node.js service developed in JavaScript and used through Node.js.
  • the application layer may include all applications that can be supported by a digital device, such as a TV application, a showcase application, a native application, and a web application.
  • Applications on the Web OS may be classified into a web application, a Palm Development Kit (PDK) application, a Qt Meta Language or Qt Modeling Language (QML) application, and the like according to an implementation method.
  • the web application is based on the WebKit engine and runs on the WAM Runtime. Such web applications may be based on the Enyo framework, or may be developed and executed based on general HTML5, Cascading Style Sheets (CSS), or JavaScript.
  • the PDK application includes a third-party or native application developed in C / C ++ based on a PDK provided for an external developer.
• The PDK refers to a set of development libraries and tools provided to enable a third party, such as a game developer, to develop a native application in C/C++.
  • a PDK application can be used for the development of applications whose performance is important.
• The QML application is a Qt-based native application and includes basic applications provided with the Web OS platform, such as the card view, home dashboard, and virtual keyboard.
  • QML is a mark-up language in script form instead of C ++.
  • the native application refers to an application that is developed in C / C ++, compiled, and executed in a binary form.
  • Such a native application has an advantage in that its execution speed is fast.
  • FIG. 8 is a diagram illustrating the architecture of a Web OS device according to one embodiment of the present invention.
  • FIG. 8 is a block diagram based on runtime of a Web OS device, which can be understood with reference to the layered structure of FIG. 7.
• Referring to FIG. 8, services, applications, and Web OS core modules are included on a system OS (Linux) and system libraries, and communication between them may take place via the Luna-service bus.
• These include Web OS services such as Node.js services based on HTML5, CSS, and JavaScript (e-mail, contacts, calendar, etc.), logging, backup, file notify, database (DB), activity manager, system policy, audio daemon (AudioD), update, and media server; TV services such as the Electronic Program Guide (EPG), Personal Video Recorder (PVR), and data broadcasting; CP services such as voice recognition, Now on, Notification, search, Auto Content Recognition (ACR), Contents List Browser (CBOX), wfdd, DMR, Remote Application, Download, and Sony Philips Digital Interface Format (SDPIF); and native applications such as PDK applications, the browser, and QML applications.
• Enyo framework-based UI-related TV applications and web applications are processed via the Luna-service bus through the Web OS core modules, such as the SAM, WAM, and LSM described above.
• Meanwhile, TV applications and web applications are not necessarily Enyo framework-based or UI-related.
• The CBOX can manage the list and metadata of the content of external devices such as USB, DLNA, and cloud storage connected to the TV. Meanwhile, the CBOX may output content listings of various content containers such as USB, DMS, DVR, and cloud storage in an integrated view. The CBOX can also display various types of content listings, such as pictures, music, and videos, and manage their metadata. In addition, the CBOX may output the contents of attached storage in real time. For example, the CBOX should be able to immediately output a content list of a storage device such as a USB device as soon as it is plugged in. In this case, a standardized method for processing the content listing may be defined. In addition, the CBOX can accommodate a variety of connection protocols.
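The "integrated view" idea above can be sketched as follows (a non-limiting illustration; the container names and item fields are invented for the example, not the CBOX data model):

```javascript
// Merge content listings from several containers (USB, cloud, ...) into
// one flat list, tagging each item with its source container while
// preserving its per-item metadata.
function integratedListing(containers) {
  return Object.entries(containers).flatMap(([source, items]) =>
    items.map((item) => ({ source, ...item }))
  );
}

const view = integratedListing({
  usb: [{ title: 'holiday.mp4', type: 'video' }],
  cloud: [{ title: 'album.jpg', type: 'picture' }],
});
```

A standardized per-container listing shape is what makes such a merge trivial regardless of the underlying connection protocol.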
• The SAM is intended to reduce module complexity and enhance scalability.
• The existing System Manager handled multiple functions, such as the system UI, window management, the web application runtime, and handling of constraints on the UX, in one process, so its implementation complexity was large. By separating the main functions and clarifying the implementation interfaces between functions, the SAM reduces implementation complexity.
  • LSM supports the development and integration of system UX implementations, such as card views and launcher, independently, and makes it easy to respond to changes in product requirements.
• When compositing a plurality of application screens, as in app-on-app, the LSM can provide a window management mechanism for multi-tasking that makes the most of hardware (HW) resources, as well as for multi-window layouts, 21:9 aspect ratios, and the like.
  • LSM supports the implementation of system UI based on QML and improves its development productivity.
• With the QML-based UX, views of screen layouts and UI components can be composed easily, and code to handle user input can also be developed easily.
• The interface between QML and the Web OS components is made through a QML extension plug-in, and the graphic operations of an application may be based on the Wayland protocol, luna-service calls, and the like.
  • LSM stands for Luna Surface Manager and functions as an application window compositor.
• The LSM composites independently developed applications, UI components, and the like on the screen.
• For this purpose, the LSM, as a compositor, defines an output area, an interworking method, and the like.
  • the compositor LSM handles graphics compositing, focus management, input events, and the like.
  • the LSM receives an event, focus, and the like from an input manager.
  • the input manager may include a HID such as a remote controller, a mouse & a keyboard, a joystick, a game pad, an application remote, a pen touch, and the like.
• The LSM supports multiple window models, and they can be executed simultaneously in all applications owing to the system UI.
• In addition, NLP (Natural Language Processing), MRCU (Mobile Radio Control Unit), the Live menu, ACR (Auto Content Recognition), and the like may be supported.
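The compositor role described for the LSM above (compositing surfaces, managing focus, routing input events) can be sketched as a non-limiting illustration; the class, application names, and z-index scheme are invented for the example:

```javascript
// Sketch of a compositor: each application renders into its own surface;
// the compositor orders surfaces by z-index for drawing and routes input
// events to whichever surface currently has focus.
class Compositor {
  constructor() { this.surfaces = []; this.focused = null; }
  addSurface(app, z) { this.surfaces.push({ app, z }); }
  setFocus(app) { this.focused = app; }
  // Returns app names back-to-front, as they would be drawn on screen.
  composeOrder() {
    return [...this.surfaces].sort((a, b) => a.z - b.z).map((s) => s.app);
  }
  // Input events (e.g. from a remote controller) go to the focused surface.
  dispatchInput(event) {
    return { to: this.focused, event };
  }
}
```

For example, a TV application at z-index 0 with a system UI overlay at z-index 10 would be drawn TV-first, with input still delivered only to the focused surface.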
  • FIG. 9 is a diagram illustrating a graphic composition flow in a Web OS device according to one embodiment of the present invention.
• Referring to FIG. 9, graphic composition processing is performed through the web application manager 910 in charge of the UI process, the WebKit 920 in charge of the web process, the LSM 930, and the graphic manager (GM) 940.
• Web-application-based graphic data (or an application) is generated in the WebKit 920, and when it is not a full-screen application, the generated graphic data is transferred to the LSM 930. The web application manager 910 receives the application generated by the WebKit 920, for sharing GPU (Graphic Processing Unit) memory for graphic management between the UI process and the web process, and transfers it to the LSM 930 if it is not a full-screen application, as described above. In the case of a full-screen application, the LSM 930 may be bypassed, and in this case, the graphic data may be transferred directly to the graphic manager 940.
• The LSM 930 transfers the received graphic data of the UI application to a Wayland compositor via a Wayland surface, and the Wayland compositor processes it appropriately and delivers it to the graphic manager.
  • the graphic data delivered from the LSM 930 is delivered to the graphic manager compositor via, for example, the LSM GM surface of the graphic manager 940.
  • the full-screen application is delivered directly to the graphic manager 940 without passing through the LSM 930, which is processed by the graphic manager compositor via the WAM GM surface.
• The graphic manager processes all graphic data in the Web OS device. It receives not only the data that has passed through the LSM GM surface and the WAM GM surface described above but also graphic data that has passed through GM surfaces, such as data broadcasting applications and caption applications, and processes it all so that it is properly displayed on the screen.
  • the function of the GM compositor is the same as or similar to that of the compositor described above.
  • FIG. 10 is a diagram illustrating a media server according to an embodiment of the present invention.
• FIG. 11 is a block diagram illustrating a configuration of a media server according to an embodiment of the present invention, and FIG. 12 is a diagram illustrating a relationship between a media server and a TV service according to an embodiment of the present invention.
  • the media server supports the execution of various multimedia in the digital device and manages necessary resources.
  • the media server can efficiently use hardware resources required for media play.
• The media server requires audio/video hardware resources in order to execute multimedia and can manage the resource usage status efficiently.
  • fixed devices with larger screens than mobile devices require more hardware resources to run multimedia and require faster encoding / decoding and graphics data delivery due to the large amount of data.
• The media server should be able to perform broadcasting, recording, and tuning tasks, record a program while it is being viewed, and simultaneously display the sender and receiver screens during a video call.
• Since hardware resources such as encoders, decoders, tuners, and display engines exist in limited numbers on a chip-set basis, it is difficult to execute several tasks at the same time; the media server therefore processes inputs by, for example, limiting the usage scenarios per chip-set.
• The media server can enhance system stability, for example, by removing, per pipeline, a playback pipeline in which an error occurred during media playback and restarting it, so that even such an error does not affect other media playback.
• A pipeline is a chain connecting the respective unit functions, such as decoding, analysis, and output, when media playback is requested, and the required unit functions may vary according to the media type.
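The pipeline-as-chain idea can be sketched as a non-limiting illustration (the unit-function names and the flags they set are invented for the example):

```javascript
// A pipeline is a chain of unit functions applied in order; which units
// are linked together can vary with the media type.
function makePipeline(...stages) {
  return (input) => stages.reduce((data, stage) => stage(data), input);
}

// Illustrative unit functions for a playback pipeline.
const parse = (req) => ({ ...req, parsed: true });
const decode = (req) => ({ ...req, decoded: true });
const output = (req) => ({ ...req, rendered: true });

const playbackPipeline = makePipeline(parse, decode, output);
```

A different media type would simply chain a different set of unit functions, which is also what makes adding a new pipeline type possible without touching existing ones.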
• The media server can have extensibility; for example, new types of pipelines can be added without affecting existing implementations.
  • the media server may accommodate a camera pipeline, a video conference pipeline, a third-party pipeline, and the like.
  • the media server can handle normal media playback and TV task execution as separate services because the interface of the TV service is different from the media playback case.
• The media server supports operations such as 'setchannel', 'channelup', 'channeldown', 'channeltuning', and 'recordstart' in relation to the TV service, and operations such as 'play', 'pause', and 'stop' in relation to general media playback; different operations can be supported for the two, and they can be treated as separate services.
• The media server can perform integrated control or management of resource management functions.
• The allocation and reclamation of hardware resources in the device are performed in an integrated manner in the media server.
  • the TV service process transmits the running task and resource allocation status to the media server.
• The media server allocates resources and generates pipelines as each piece of media is executed, and upon a request for media execution it can, based on the resource status occupied by each pipeline, allow execution by priority (e.g., policy) and reclaim the resources of other pipelines.
• Predefined execution priorities and required resource information for specific requests are managed by a policy manager, and a resource manager can communicate with the policy manager to process resource allocation and reclamation.
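The interplay between resource allocation and a priority-based policy can be sketched as a non-limiting illustration (the priority values and pipeline names are invented; real policy comparison is more involved):

```javascript
// Sketch: a resource manager holding a fixed number of resource slots.
// When the resource is exhausted, a policy-style priority comparison
// decides whether to reclaim a slot from the lowest-priority holder.
class ResourceManager {
  constructor(total) { this.total = total; this.allocations = []; }
  allocate(pipeline, priority) {
    if (this.allocations.length < this.total) {
      this.allocations.push({ pipeline, priority });
      return true;
    }
    // Policy step: compare against the lowest-priority current holder.
    const lowest = this.allocations.reduce((a, b) => (a.priority < b.priority ? a : b));
    if (priority > lowest.priority) {
      this.allocations = this.allocations.filter((a) => a !== lowest); // reclaim
      this.allocations.push({ pipeline, priority });
      return true;
    }
    return false; // request denied by policy
  }
  holders() { return this.allocations.map((a) => a.pipeline); }
}
```

With a single decoder slot, a high-priority recording request would preempt a low-priority thumbnail pipeline, while a later lower-priority request would be denied.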
• The media server may hold an identifier (ID) for every operation related to playback. For example, the media server may instruct a particular pipeline with a command based on the identifier. The media server may issue separate commands to the pipelines for two or more media playbacks.
  • the media server can be responsible for playback of HTML5 standard media.
  • the media server may follow the TV restructuring scope of the separate service processing of the TV pipeline.
• The media server may also be designed and implemented regardless of the TV restructuring scope; if the TV service is not processed as a separate service, the media server as a whole may need to be re-executed when there is a problem with a specific task.
  • the media server is also referred to as uMS, or micro media server.
• The media player is a media client and may mean, for example, WebKit for the HTML5 video tag, a camera, the TV, Skype, the 2nd Screen, and the like.
• In the media server, management of micro resources, such as by the resource manager and the policy manager, is a core function.
• The media server also performs the playback control role for web standard media content.
  • the media server may also manage pipeline controller resources.
  • Such media servers support, for example, extensibility, reliability, efficient resource usage, and the like.
• The uMS, that is, the media server, manages and controls the overall use of resources for proper processing within the Web OS device, for uses such as cloud games, MVPD (pay services, etc.), camera preview, the second screen, and Skype. Meanwhile, each resource uses, for example, a pipeline, and the media server can manage and control the creation, deletion, and use of pipelines for resource management.
• A pipeline may be created when a media related to a task starts a sequence of tasks, such as parsing a request, decoding a stream, and video output.
• For example, in relation to a TV service, watching, recording, channel tuning, and the like are each processed under control of resource usage through a pipeline generated according to the corresponding request.
• Referring to FIG. 10, an application or service is connected to a media server 1020 via a luna-service bus 1010, and the media server 1020 is connected to, and manages, pipelines generated through the luna-service bus 1010.
• The application or service may have various clients according to its characteristics and may exchange data with the media server 1020 or the pipelines through those clients.
• The clients include, for example, a uMedia client (WebKit) and a resource manager (RM) client (C/C++) for connecting to the media server 1020.
  • the application including the uMedia client is connected to the media server 1020 as described above. More specifically, the uMedia client corresponds to, for example, a video object to be described later, and the client uses the media server 1020 for operation of the video by request.
  • the video operation relates to a video state, and state data related to the video operation may include loading, unloading, play (playback, or reproduce), pause, stop, and the like.
  • Each operation or state of such video can be handled through the creation of a separate pipeline.
  • the uMedia client sends state data related to the video operation to the pipeline manager 1022 in the media server.
  • the pipeline manager 1022 obtains information on a resource of the current device through data communication with the resource manager 1024 and requests allocation of a resource corresponding to the state data of the uMedia client.
  • the pipeline manager 1022 or the resource manager 1024 controls resource allocation through data communication with the policy manager 1026 when necessary in relation to the resource allocation. For example, when the resource manager 1024 has no or insufficient resources to allocate according to the request of the pipeline manager 1022, appropriate resource allocation may be performed according to the priority comparison of the policy manager 1026.
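The priority-based allocation step above can be sketched as follows. This is a hypothetical illustration under assumed semantics (smaller number means higher priority; the policy step evicts at most one lower-priority owner); none of the names or numbers come from the specification.

```python
# Hypothetical sketch: when resources are insufficient, a policy step
# compares priorities and frees a lower-priority allocation so the new
# request can proceed (the role of the policy manager 1026).
class ResourceManager:
    def __init__(self, total):
        self.total = total
        self.allocs = {}   # owner -> (amount, priority); lower number = higher priority

    def free(self):
        return self.total - sum(a for a, _ in self.allocs.values())

    def allocate(self, owner, amount, priority):
        if self.free() < amount:
            # Policy-manager step: find the lowest-priority current owner
            # and evict it only if the new request outranks it.
            victim = max(self.allocs, key=lambda o: self.allocs[o][1], default=None)
            if victim is not None and self.allocs[victim][1] > priority:
                del self.allocs[victim]
        if self.free() >= amount:
            self.allocs[owner] = (amount, priority)
            return True
        return False

rm = ResourceManager(total=2)
rm.allocate("camera-preview", 2, priority=5)
ok = rm.allocate("tv-watch", 2, priority=1)   # evicts camera-preview
```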
  • the pipeline manager 1022 requests the media pipeline controller 1028 to generate a pipeline for an operation according to the request of the uMedia client for the allocated resource according to the resource allocation of the resource manager 1024.
  • the media pipeline controller 1028 generates the required pipeline under the control of the pipeline manager 1022.
  • as shown, the generated pipelines include not only a media pipeline and a camera pipeline but also pipelines related to playback control such as play and pause.
  • the pipeline may include a pipeline for HTML5, Web CP, smartshare playback, thumbnail extraction, NDK, cinema, Multimedia and Hypermedia Information coding Experts Group (MHEG), and the like.
  • the pipeline may include, for example, a service-based pipeline (own pipeline) and a URI-based pipeline (media pipeline).
  • an application or service including an RM client may not be directly connected to the media server 1020. This is because an application or service may handle media directly. In other words, when an application or service directly processes media, it may not go through a media server. However, at this time, resource management is required for pipeline creation and its use. For this, the uMS connector functions. Meanwhile, when the uMS connector receives a resource management request for direct media processing of the application or service, the uMS connector communicates with the media server 1020 including the resource manager 1024. To this end, the media server 1020 should also be equipped with a uMS connector.
  • the application or service may respond to the request of the RM client by receiving resource management of the resource manager 1024 through the uMS connector.
  • RM clients can handle services such as native CP, TV services, second screens, Flash players, YouTube Media Source Extensions (MSE), cloud games, and Skype.
  • the resource manager 1024 may manage the resource through the data communication with the policy manager 1026 as necessary for resource management.
  • the URI-based pipeline processes media through the media server 1020, rather than processing the media directly as described above.
  • a URI-based pipeline may include a player factory, a Gstreamer, a streaming plug-in, a digital rights management plug-in pipeline, and the like.
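The player-factory step of a URI-based pipeline can be sketched as follows. This is a hypothetical illustration: the plug-in names and the scheme-to-plug-in table are invented here, and a real implementation would build GStreamer elements rather than return a dictionary.

```python
# Hypothetical sketch: a player factory in a URI-based pipeline selects a
# streaming plug-in from the URI scheme before handing off to GStreamer,
# optionally attaching a DRM plug-in.
from urllib.parse import urlparse

PLUGINS = {
    "http": "http-streaming-plugin",
    "https": "http-streaming-plugin",
    "file": "file-source-plugin",
    "rtsp": "rtsp-streaming-plugin",
}

def build_pipeline(uri):
    scheme = urlparse(uri).scheme
    plugin = PLUGINS.get(scheme)
    if plugin is None:
        raise ValueError(f"no plug-in for scheme: {scheme}")
    # A real factory would instantiate GStreamer elements here.
    return {"uri": uri, "plugin": plugin, "drm": scheme != "file"}

p = build_pipeline("https://example.com/stream.m3u8")
```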
  • an interface method between an application and media services may be as follows.
  • one method is a PDK interface using a service.
  • a method of using a service in the existing CP can be used to extend existing platform plug-ins based on Luna for backward compatibility.
  • seamless change is handled by a separate module (e.g., TVWIN), which first displays the TV on the screen without the Web OS, before or during Web OS boot, and then switches seamlessly. It is used to provide the basic functions of the TV service for a fast response to the user's power-on request, because the Web OS boot time is slow.
  • the module also supports seamless change, factory mode, and the like, which provide fast boot and basic TV functions as part of the TV service process.
  • the module may be responsible for switching from the non-web OS mode to the Web OS mode.
  • a processing structure of the media server is shown.
  • the solid line box may indicate a process processing configuration
  • the dotted line box may indicate an internal processing module during the process.
  • the solid arrow may indicate an inter-process call, that is, a Luna service call
  • the dashed arrow may indicate a notification or data flow such as register / notify.
  • a service or web application or PDK application (hereinafter referred to as an "application") is connected to various service processing configurations via a luna-service bus, through which the application operates or is controlled.
  • the data processing path depends on the type of application. For example, when the application involves image data from a camera sensor, the data is transmitted to the camera processor 1130 and processed. In this case, the camera processor 1130 processes the received image data using modules including gesture and face detection. When the data needs to be used by the user's selection or automatically through a pipeline, the camera processor 1130 may generate a pipeline through the media server processor 1110 and process the corresponding data.
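The type-dependent routing over the luna-service bus can be sketched as a simple dispatch table. This is a hypothetical illustration; the string keys and destination names are invented labels for the processors named in the description (reference numerals in comments follow the figure).

```python
# Hypothetical sketch: application data is routed to a processor by its
# type, mirroring the paths described for camera, audio, DRM, and media.
def route(data_type):
    table = {
        "camera": "camera_processor",       # gesture / face detection (1130)
        "audio": "audio_processor",         # 1140, then the audio module (1150)
        "drm": "drm_service_processor",     # creates a DRM instance
        "media": "media_server_processor",  # pipeline creation (1110)
    }
    try:
        return table[data_type]
    except KeyError:
        raise ValueError(f"unsupported data type: {data_type}")

dest = route("camera")
```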
  • the audio may be processed through the audio processor 1140 and the audio module 1150.
  • the audio processor 1140 processes the audio data received from the application and transmits the audio data to the audio module 1150.
  • the audio processor 1140 may include an audio policy manager to determine the processing of the audio data.
  • the audio data thus processed is handled by the audio module 1150.
  • the application may notify the audio module 1150 of data related to audio data processing, and a related pipeline may likewise notify the audio module 1150.
  • the audio module 1150 includes an advanced Linux sound architecture (ALSA).
  • when DRM-applied content is involved, the corresponding content data is transmitted to the DRM service processor 1160, and the DRM service processor 1160 generates a DRM instance.
  • the DRM service processor 1160 may be connected to and process the DRM pipeline in the media pipeline through the Luna-service bus to process the content data on which the DRM is applied.
  • the following describes processing when the application is media data or TV service data (e.g., broadcast data).
  • FIG. 12 illustrates only the media server processing unit and the TV service processing unit in FIG. 11 described above in more detail.
  • the TV service processor 1120 may include, for example, at least one or more of a DVR / channel manager, a broadcasting module, a TV pipeline manager, a TV resource manager, a data broadcasting module, an audio setting module, a path manager, and the like.
  • the TV service processor 1220 may include, for example, a TV broadcast handler, a TV broadcast interface, a service processor, TV middleware, a path manager, and a BSP (e.g., NetCast).
  • the service processor may mean, for example, a module including a TV pipeline manager, a TV resource manager, a TV policy manager, a USM connector, and the like.
  • the TV service processor may have a configuration as shown in FIG. 11 or 12 or a combination thereof, and some components may be omitted or some components not shown may be added.
  • the TV service processor 1120/1220 transmits DVR- or channel-related data to the DVR/channel manager based on the attribute or type of the TV service data received from the application, and then requests the TV pipeline manager to generate and process a TV pipeline. Meanwhile, when the attribute or type of the TV service data is broadcast content data, the TV service processor 1120 generates and processes a TV pipeline through the TV pipeline manager for processing the corresponding data through a broadcast module.
  • a JSON (JavaScript Object Notation) file or a file written in C is processed by the TV broadcast handler and transmitted to the TV pipeline manager through the TV broadcast interface to generate and process a TV pipeline.
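A JSON request of the kind the TV broadcast handler might pass along can be sketched as follows. The field names (`service`, `operation`, `channel`) and the returned pipeline label are hypothetical; the specification does not define the actual payload format.

```python
# Hypothetical sketch: a JSON request handled by the TV broadcast handler
# and forwarded through the TV broadcast interface so the TV pipeline
# manager can generate a TV pipeline for the operation.
import json

request = json.dumps({
    "service": "tv",
    "operation": "channelTuning",     # e.g. watch, record, channelTuning
    "channel": {"major": 7, "minor": 1},
})

def handle_broadcast_request(raw):
    req = json.loads(raw)
    # The interface would apply the TV service policy here before the
    # pipeline manager generates a TV pipeline for the operation.
    return {"pipeline": f"tv-{req['operation']}", "channel": req["channel"]}

result = handle_broadcast_request(request)
```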
  • the TV broadcast interface unit may transmit data or files that have passed through the TV broadcast handler to the TV pipeline manager based on the TV service policy and refer to them when generating the pipeline.
  • the TV pipeline manager may be controlled by the TV resource manager in generating one or more pipelines in response to a TV pipeline generation request from a processing module or manager in a TV service.
  • the TV resource manager may be controlled by the TV policy manager to request the status and allocation of resources allocated for the TV service according to the TV pipeline creation request of the TV pipeline manager, and may communicate data with the media server processor 1110/1210 through the uMS connector.
  • the resource manager in the media server processor 1110/1210 reports the status and allocation of resources for the current TV service at the request of the TV resource manager. For example, when a check by the resource manager in the media server processor 1110/1210 shows that all resources for the TV service are already allocated, the TV resource manager may be notified that all resources are currently allocated.
  • the resource manager in the media server processing unit may, together with the notification, remove a TV pipeline according to priority or a predetermined criterion among the TV pipelines previously allocated for the TV service, and request or allocate generation of a TV pipeline for the requested TV service. Alternatively, the TV resource manager may appropriately remove, add, or re-establish a TV pipeline in accordance with the status report of the resource manager in the media server processor 1110/1210.
  • the BSP supports backward compatibility with existing digital devices, for example.
  • the TV pipelines thus generated may be properly operated under the control of the path manager during the processing.
  • the path manager may determine or control the processing path or process of the pipelines in consideration of not only the TV pipeline but also the operation of the pipeline generated by the media server processor 1110/1210.
  • the media server processor 1110/1210 includes a resource manager, a policy manager, a media pipeline manager, a media pipeline controller, and the like.
  • pipelines generated under the control of the media pipeline manager and the media pipeline controller can be created in various forms, such as a camera preview pipeline, a cloud game pipeline, and a media pipeline.
  • the media pipeline may include a streaming protocol, an auto / static gstreamer, a DRM, and the like, which may be determined according to a path manager's control.
  • the detailed processing procedure in the media server processor 1110/1210 is similar to the description of FIG. 10 above and will not be repeated herein.
  • the resource manager in the media server processor 1110/1210 may manage resources on a counter base, for example.
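Counter-based resource management can be sketched minimally as follows. This is a hypothetical illustration: the resource names and counts are invented, and a real resource manager would track owners rather than bare counters.

```python
# Hypothetical sketch of counter-based resource management: each resource
# type has a fixed count, and allocation simply decrements the counter.
class CounterResourceManager:
    def __init__(self, counts):
        self.counts = dict(counts)   # e.g. {"tuner": 2, "video_decoder": 1}

    def acquire(self, resource):
        if self.counts.get(resource, 0) <= 0:
            return False             # exhausted: policy step would run here
        self.counts[resource] -= 1
        return True

    def release(self, resource):
        self.counts[resource] = self.counts.get(resource, 0) + 1

rm = CounterResourceManager({"tuner": 2, "video_decoder": 1})
rm.acquire("tuner")
rm.acquire("tuner")
third = rm.acquire("tuner")   # no tuners left -> False
```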
  • FIG. 13 is a block diagram illustrating in detail the configuration modules of a digital device 1300 according to another embodiment of the present invention. Some of the modules of the digital device of FIG. 13 can be added or changed with reference to FIGS. 1 to 12, and the scope of the present invention is not determined by the elements described in these figures but should be interpreted according to the description in the claims.
  • the digital device 1300 may include a user interface unit 1310, a broadcast service module 1320, a receiver 1330, a display module 1340, and a controller ( 1350).
  • the user interface 1310 may receive a specific signal from a user.
  • the user interface unit 1310 can generate a window for providing the user with information on the image currently displayed, such as channel information and volume information, and a graphic user interface for transmitting/receiving a control signal.
  • the user interface unit 1310 may receive a specific signal through a touch panel connected to the display module 1340 of the digital device 1300, or may receive a specific signal using an infrared (IR) signal received from a sensor module (not shown).
  • the broadcast service module 1320 may receive a broadcast signal including broadcast program data from a broadcast station or a content provider (CP), and may process the received broadcast signal.
  • the broadcast service module 1320 may include a tuner, a demultiplexer, an image decoder, a scaler, a mixer, a frame rate converter, a formatter, and the like.
  • the broadcast service module 1320 may receive a broadcast signal, decode the received broadcast signal, and transmit broadcast time information included in the broadcast signal to the controller 1350.
  • the receiver 1330 may receive an external input signal input through an external input means.
  • the receiver 1330 may receive and process a control signal transmitted by a user using a remote controller, a remote control application of a smartphone, or the like.
  • the display module 1340 may process and display at least one image content data received by the digital device according to an embodiment of the present invention on a screen.
  • the at least one image content data may include a real-time broadcast program received through the broadcast service module 1320 or an external input image content input through a high definition multimedia interface (HDMI).
  • the controller 1350 generally manages the functions of at least one of the modules illustrated in FIG. 13, such as the user interface 1310, the broadcast service module 1320, the receiver 1330, and the display module 1340. This will be described later in more detail with reference to FIGS. 14 to 24.
  • FIG. 14 is a diagram illustrating a booting mode of a digital device according to one embodiment of the present invention.
  • a digital device may implement a plurality of boot modes.
  • the plurality of boot modes may include a first use mode 1410, a factory mode 1420, a normal boot mode 1430, and a warm boot mode 1440.
  • the first use mode 1410 may be a booting process or booting mode performed when a power-on signal is first received after a factory initialization process.
  • thereafter, the booting process can be performed in the normal boot mode 1430.
  • the factory mode 1420 may be a booting process or booting mode performed when the digital device receives a power-only key signal.
  • the digital device may execute at least one of the application most recently executed before power-off or a TV service application.
  • the booting process may be performed in the first use mode 1410 after receiving the power on signal.
  • the normal booting mode 1430 may be a booting process or a booting mode performed by the digital device after the booting process is performed in the first use mode 1410 and the user sets a setting value.
  • thereafter, when a power-on signal is received, booting may be performed in the normal boot mode 1430.
  • when the digital device performs the booting process in the normal booting mode 1430, at least one of the most recently executed application, a timer setting application, and a TV service application may be executed.
  • the warm boot mode 1440 may be a booting process or booting mode for entering a warm standby state. Accordingly, when the digital device performs the warm boot mode 1440, it enters the warm standby state, and when it then receives a power-on signal from the user, it can quickly enter the normal boot mode 1430 and perform a fast boot.
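The choice among the boot modes of FIG. 14 can be sketched as a decision function. This is a hypothetical illustration: the parameter names and the return labels are invented, and the exact conditions a real device checks are not specified in this document.

```python
# Hypothetical sketch: selecting a booting mode from the device state
# when a power-on signal arrives (reference numerals in comments).
def select_boot_mode(factory_initialized, user_configured, warm_standby,
                     power_only_key=False):
    if power_only_key:
        return "factory_mode"            # 1420: power-only key signal
    if warm_standby:
        return "normal_boot_fast"        # 1440: fast entry into 1430
    if factory_initialized and not user_configured:
        return "first_use_mode"          # 1410: first power-on after reset
    return "normal_boot_mode"            # 1430: regular boot

mode = select_boot_mode(factory_initialized=True, user_configured=False,
                        warm_standby=False)
```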
  • FIG. 15 is a diagram illustrating an example in which a digital device executes a first image control application and a second image control application according to an exemplary embodiment.
  • the controller 1510 of the digital device receives a power-on signal and determines whether the application executed at the power-off time before receiving the power-on signal is an image output application. As a result of the determination, when the application executed at the power-off time is an image output application, the image output application is loaded and executed, the first image control application 1520 for controlling the image output through the image output application is loaded and executed, the second image control application 1530 is loaded, and, when the loading is completed, the first image control application 1520 may be terminated and the second image control application 1530 controlled to be executed.
  • the power on signal may be an infrared ray (IR) signal input from an external input means, and the external input means may include a remote controller.
  • the image output application may include a first image output application for outputting a real-time broadcast program received through a tuner and a second image output application for outputting an external input image input through an external device interface.
  • the first image control application 1520 may provide a channel change function and a volume control function of a real time broadcast program.
  • when the image output application is the second image output application, the first image control application 1520 provides a function of adjusting the volume of the external input image and changing the type of the external input, and the type of the external input may include HDMI, Composite, Digital Video Disc (DVD), and Component.
  • the second image control application 1530 may be generated using web languages including JavaScript, Hypertext Markup Language (HTML), and Cascading Style Sheets (CSS).
  • the second image control application 1530 may provide the control functions provided by the first image control application 1520 as well as reservation recording, EPG output, and content information output functions.
  • when the application executed at the power-off time is not an image output application, the controller 1510 loads and executes an image output application, loads and executes the first image control application 1520 for controlling the image output through the image output application, loads the application executed at the power-off time, and, when the load is completed, controls to display on the screen a message for receiving an output request of the application executed at the power-off time.
  • it can be determined whether an image output application is included in the list of applications executed before power-off. As a result of the determination, when an image output application is included in the application list executed before power-off, the controller 1510 may execute the image output application executed last among the image output applications included in the list. When no image output application is included in the application list executed before power-off, the controller 1510 may control to execute an image output application outputting a preset real-time broadcast program.
  • the controller 1510 preferentially executes the first image control application 1520 when an image output application is first executed to display at least one image content.
  • the first image control application 1520 may provide the minimum image control functions required by a user, such as a channel change or a volume change. Accordingly, the first image control application 1520 has the advantages of a short loading and execution time and low resource requirements.
  • meanwhile, the controller 1510 may load, in the background, a second image control application 1530 that provides more functions than the first image control application 1520.
  • since the second image control application 1530 provides various image control functions such as reservation recording, EPG output, and content information output, in addition to the control functions provided by the first image control application 1520, it takes longer to load and execute than the first image control application 1520 and requires a larger amount of resources.
  • the controller 1510 may terminate the first image control application 1520 and execute the second image control application 1530 when loading of the second image control application 1530 is completed.
  • for example, the first image control application 1520 may take about 4 seconds to load and execute, and the second image control application 1530 may take about 20 seconds to load and execute.
  • the image control application switching process, in which the first image control application 1520 is terminated and the second image control application 1530 is executed, may be designed so that the user does not notice the switch.
  • the user may only perceive that the image control functions provided by the second image control application 1530 are unavailable before the second image control application 1530 is executed.
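The two-stage handoff described above can be sketched as follows. This is a hypothetical illustration: the class name, application labels, and feature sets are invented to mirror the described behavior (fast lightweight controller first, full-featured controller swapped in when loaded).

```python
# Hypothetical sketch of the two-stage image control handoff: the
# lightweight first application runs immediately after power-on, the
# full-featured second application loads in the background, and the
# swap happens when its loading completes.
class ImageControlSwitcher:
    def __init__(self):
        self.active = None

    def power_on(self):
        self.active = "first_app"        # fast: channel / volume only

    def on_second_app_loaded(self):
        # Terminate the first application and switch seamlessly.
        self.active = "second_app"       # adds EPG, reservation recording, ...

    def supports(self, feature):
        basic = {"channel_change", "volume"}
        full = basic | {"epg", "reservation_recording", "content_info"}
        return feature in (full if self.active == "second_app" else basic)

sw = ImageControlSwitcher()
sw.power_on()
early_epg = sw.supports("epg")        # not yet available
sw.on_second_app_loaded()
late_epg = sw.supports("epg")
```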
  • FIG. 16 is a diagram for describing an image control function provided by a digital device according to one embodiment of the present invention.
  • the digital device when a digital device outputs image content through an image output application, the digital device may provide various information and control functions to the user using the image control application.
  • in FIG. 16, it is assumed that a real-time broadcast program is output through an image output application.
  • a controller of a digital device may control to output channel information 1610 of a real-time broadcast program currently output using an image control application.
  • the controller of the digital device may control to perform a channel change 1630 function for the currently broadcast real-time broadcast program using an image control application.
  • the controller of the digital device according to an embodiment of the present invention can control to output a broadcast menu 1620 providing functions such as outputting a recommended content page, a channel list, and search, using an image control application.
  • the controller of the digital device controls to output the EPG 1640 including channel information of a real-time broadcast program currently output by using an image control application.
  • the first image control application may provide only the channel changing function illustrated in FIG. 16C
  • the second image control application may provide all the functions illustrated in FIGS. 16A through 16D. This will be described in detail with reference to FIGS. 17 to 20 below.
  • FIG. 17 is a diagram for explaining a control function performed by a digital device through a first image control application according to one embodiment of the present invention.
  • when a controller of the digital device 1700 receives a power-on signal, the controller executes an image output application based on a preset booting process and may control to display the real-time broadcast program 1710 of the first channel on the screen.
  • the controller may load and execute the first image control application 1730 to provide a function of controlling the real-time broadcast program 1710 of the first channel.
  • the first image control application 1730 may be designed to provide a minimum control function such as channel change and volume control.
  • the user may transmit a channel change request signal to the digital device 1700 through an external input means such as a remote controller, thereby controlling the digital device 1700 to display the real-time broadcast program 1720 of the second channel.
  • FIG. 18 is another diagram for describing a control function performed by a digital device through a first image control application according to an embodiment of the present disclosure.
  • when a controller of the digital device 1800 receives a power-on signal, the controller executes an image output application based on a preset booting process and may control to display the real-time broadcast program 1810 of the first channel on the screen.
  • the controller may control to load and execute the first image control application 1830 to provide a function of controlling the real-time broadcast program 1810 of the first channel.
  • the first image control application 1830 may be designed to provide a minimum control function such as channel change and volume control.
  • while the first image control application 1830 is executed, even if a user transmits an EPG output request signal to the digital device 1800 through an external input means such as a remote controller, the requested EPG is not displayed, because the first image control application 1830 does not provide an EPG output function.
  • FIG. 19 illustrates a control function performed by a digital device through a second image control application according to an embodiment of the present invention.
  • when the controller of the digital device 1900 according to an embodiment of the present invention receives a power-on signal, the controller executes an image output application based on a preset booting process and may control to display the real-time broadcast program 1910 of the first channel on the screen.
  • the controller may control to load and execute the first image control application to provide a function of controlling the real time broadcast program 1910 of the first channel.
  • the first image control application may be designed to provide a minimum control function such as channel change and volume control.
  • the controller may control to load, in the background, a second image control application 1930 that provides more functions for controlling the real-time broadcast program 1910 of the first channel, while providing the user with minimum control functions through the first image control application.
  • the controller may control to terminate the first image control application and to execute the second image control application 1930. Since the second image control application 1930 also supports all the image control functions provided by the first image control application, the user may transmit a channel change request signal through an external input means such as a remote controller to control the digital device 1900 to display the real-time broadcast program 1920 of the second channel.
  • FIG. 20 is another diagram for describing a control function performed by a digital device through a second image control application according to one embodiment of the present invention.
  • when the controller of the digital device 2000 receives a power-on signal, the controller executes an image output application based on a preset booting process and may control to display the real-time broadcast program 2010 of the first channel on the screen.
  • the controller may control to load and execute the first image control application to provide a function of controlling the real time broadcast program 2010 of the first channel.
  • the first image control application may be designed to provide a minimum control function such as channel change and volume control.
  • the controller may control to load, in the background, a second image control application 2030 that provides more functions for controlling the real-time broadcast program 2010 of the first channel, while providing the user with minimum control functions through the first image control application.
  • the controller may control to terminate the first image control application and to execute the second image control application 2030.
  • the second image control application 2030 may support not only an image control function provided by the first image control application but also various image control functions such as channel information output, channel list output, reservation recording function, and EPG output.
  • as illustrated in FIG. 20, when a user transmits a channel list output request signal to the digital device 2000 through an external input means such as a remote controller, the controller may control to display on the screen the channel list 2020 including the first channel currently being displayed.
  • FIGS. 21 and 22 are diagrams for describing an example in which a digital device, according to an embodiment of the present invention, determines which image output application to execute first upon receiving a power-on signal.
  • the controller of the digital device receives a power-on signal and determines whether the application executed at the power-off time before receiving the power-on signal is an image output application. When the application executed at the power-off time is an image output application, the image output application is loaded and executed, a first image control application for controlling the image output through the image output application is executed, and a second image control application is loaded; when the loading is completed, the first image control application may be terminated and the second image control application controlled to be executed.
  • when the controller determines that the application executed at the power-off time is not an image output application, the controller loads and executes an image output application, loads and executes a first image control application for controlling the image output through the image output application, loads the application executed at the power-off time, and, when the load is completed, controls to display on the screen a message for receiving an output request of the application executed at the power-off time.
  • as shown in FIG. 21, it is assumed that the first image output application 2110, the second image output application 2120, and the web browser application 2130 are executed in order before the digital device is powered off.
  • the controller of the digital device determines whether the last application executed immediately before changing to the power-off state is an image output application. As a result of the determination, since the last application executed immediately before changing to the power-off state is a web browser application rather than an image output application, the controller may control to search whether there is an image output application executed previously.
  • the controller determines that the second image output application 2120 is the last image output application executed by the search result, and executes the second image output application 2120 first after receiving the power-on signal.
  • the first image control application for controlling the image output through the second image output application 2120 may be loaded and executed. Furthermore, the controller may load the web browser application 2130 in the background and, when the loading is completed, control to display on the screen a message 2165 for receiving an output request of the web browser application 2130 (2160).
• As another example, a messenger application 2210, a game application 2220, and a web browser application 2230 are executed in order, and the digital device is then powered off.
• Upon power-on, the controller of the digital device determines whether the last application executed immediately before the change to the power-off state is an image output application. Since the last such application is a web browser application rather than an image output application, the controller may control a search for an image output application that was executed earlier. In addition, if no data for executing an image output application is found as a result of the search, the controller may control a preset first image output application 2250 to be loaded and executed first.
• A first image control application for controlling the image output through the first image output application 2250 may then be loaded and executed. Furthermore, the controller may load the web browser application 2230 in the background and, when the loading is complete, control the display to show a message 2265 for receiving an output request for the web browser application 2230 (2260).
• Accordingly, there is an advantage in that the user can preferentially receive the most necessary functions after powering on the digital device, and an application used before power-off can be executed automatically.
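The application-selection logic described in the two examples above can be sketched as follows. This is an illustrative model, not the claimed implementation: the history list, the is_image_output flags, and the preset application name are hypothetical stand-ins for the device's stored execution records.

```python
# Hypothetical default used when no image output application is found.
PRESET_IMAGE_OUTPUT_APP = "first_image_output_app_2250"

def select_first_app(history):
    """history: list of (app_name, is_image_output), oldest first.
    Returns (app_to_launch_first, app_to_load_in_background)."""
    if not history:
        return PRESET_IMAGE_OUTPUT_APP, None
    last_app, last_is_image = history[-1]
    if last_is_image:
        # The app running at power-off already outputs images: launch it directly.
        return last_app, None
    # Search backwards for the most recently executed image output application.
    for app, is_image in reversed(history[:-1]):
        if is_image:
            # Launch the image output app; load the last app in the background.
            return app, last_app
    # No image output application in the history: fall back to the preset one.
    return PRESET_IMAGE_OUTPUT_APP, last_app
```

With the scenario of the first example, `select_first_app` picks the second image output application 2120 and queues the web browser for the background; with the second example, it falls back to the preset application.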
  • FIG. 23 is a diagram for explaining an example in which a digital device downloads a second image control application from an external server or the like according to an embodiment of the present disclosure.
• The controller of the digital device receives a power-on signal and determines whether the application executed at the power-off time, before the power-on signal was received, is an image output application.
• If the application executed at the power-off time is an image output application,
• the image output application is loaded and executed, and a first image control application 2310 for controlling the image output through the image output application is executed.
• A second image control application 2320 may then be loaded; when the loading is complete, the first image control application 2310 may be terminated and the second image control application 2320 may be controlled to be executed.
• The first image control application 2310 and the second image control application 2320 may be written in different programming languages.
• For example, the second image control application 2320 may be written using web languages including JavaScript, HTML, and CSS.
• Only a pre-installed application may be used as the first image control application 2310, whereas the second image control application 2320 may be downloaded from an external server 2330 or an external device 2340.
• After loading and executing the first image control application 2310, the controller of the digital device 2300 may determine which of a plurality of second image control applications 2320 to execute. In this case, the controller may execute the most recently executed second image control application 2320.
• When the user inputs a signal requesting execution of a specific function, the controller of the digital device 2300 according to an embodiment of the present invention may search for a second image control application 2320 providing that function and control it to be downloaded from the external server 2330 or the external device 2340.
• FIG. 24 is a flowchart illustrating a control method of a digital device according to an embodiment of the present invention.
• The control method includes receiving a power-on signal (S2410); determining whether the application executed at the power-off time, before the power-on signal was received, is an image output application (S2420); if so, loading and executing the image output application (S2430); loading and executing a first image control application for controlling the image output through the image output application (S2440); loading a second image control application (S2450); and, when the loading is complete, terminating the first image control application and executing the second image control application (S2460). A detailed description of each step is the same as given above, so a repeated description is omitted.
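The flowchart of FIG. 24 can be sketched as a small routine. This is a hedged illustration of the step ordering only; the `launch` and `load` callbacks are hypothetical hooks, not platform APIs.

```python
def power_on_sequence(last_app_was_image_output, launch, load):
    """launch(name) starts an application; load(name) loads one and returns
    when loading is complete. Returns the ordered list of steps taken."""
    steps = []
    # S2410 is implicit: this function is the power-on signal handler.
    # S2420: branch on the type of the application running at power-off.
    if not last_app_was_image_output:
        return steps
    launch("image_output_app")            # S2430
    steps.append("S2430")
    launch("first_image_control_app")     # S2440
    steps.append("S2440")
    load("second_image_control_app")      # S2450
    steps.append("S2450")
    steps.append("S2460")                 # S2460: swap the control applications
    return steps
```

The swap in S2460 (terminate the first control application, execute the second) only happens once the second application's load has completed, mirroring the flowchart's ordering.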
• MVPD is an abbreviation of Multichannel Video Programming Distributor and refers to one or more service providers that provide video on demand (VOD) or streaming services.
• The MVPD service may, for example, be RF (Radio Frequency) based or IP (Internet Protocol) based.
• The MVPD service may also be formed in a hybrid form mixing RF and IP as needed. Hereinafter, the RF-based and IP-based MVPD service processing methods are described separately.
• The conventional MVPD service is a form in which the service provider embeds all of its solution(s) in a set-top box (STB). In the digital device, therefore, software is embedded in the device to replace the function of the existing hardware set-top box.
• An MVPD is operated as one input and is thus provided and operated in the form of an external input. Therefore, according to the present invention, the MVPD may be serviced in the form of an external input such as HDMI, Component, and the like.
• The MVPD according to the present invention may be serviced and processed with performance equal to or higher than that of the conventional MVPD service or hardware set-top box(es), even when operating as an input of the device.
  • FIG. 25 is a diagram illustrating a configuration block for recognizing and processing an RF-based MVPD service as an input mode in a device according to an embodiment of the present invention.
  • the digital device includes a TV processing unit 2510, an MVPD processing unit (MVPDWin) 2520, and other components for processing an MVPD service.
  • an upstart module in the TV processor 2510 receives input data, that is, a file, from the kernel.
  • the upstart module delivers the input file to the QtTVWin module 2512 and the TV service processing module 2514.
• The QtTVWin module 2512 checks the MVPD mode from the received input file.
• When the MVPD mode is recognized, the QtTVWin module 2512 transfers its role to the MVPD processing unit 2520 and terminates.
  • the TV service processing module 2514 processes related data including the configuration of a channel manager, a VSM, an SDEC, an external device, and the like.
• The MVPD processing unit 2520 processes the MVPD service when the role-transfer instruction according to the RF mode recognition is received from the QtTVWin module 2512 as described above. At this time, the MVPD processing unit 2520 checks the /var/lib/eim/lastinput file stored by the EIM 2570. Through this check, the MVPD processing unit 2520 may identify whether the type of the input MVPD service file is the RF-based MVPD_RF type or the IP-based MVPD_IP type.
• The EIM 2570 includes MVPD information in the last input application information in the file.
• The EIM 2570 stores in the last input application information whether the MVPD mode is required at boot time. Accordingly, the file stored by the EIM 2570 may be read by the QtTVWin module 2512 at boot time and transferred to the MVPD processing unit 2520 for basic processing as described above.
  • the EIM 2570 may add the last input application information as setting data to a settings service.
• The value of the last input application is required for an exit-key operation in the LSM, and the LSM may subscribe to the last input application value from the setting service.
• The setting or updating of the last input application in the setting service may be performed by the live TV application, the EIM 2570, or the like.
• The SAM 2258 and Bootd 2584 may flexibly perform external-input-related processing according to the external input management policy of the EIM 2570.
• The SAM 2258 and Bootd 2584 may obtain the last input from the TV service processing module 2514 and match it against a mapping table of inputs and input applications to launch a given application.
• Alternatively, the SAM 2258 and Bootd 2584 may read and process the /var/lib/eim/lastinput file stored by the EIM 2570 instead of going through the TV service processing module 2514, as described above.
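The lookup described above can be sketched as follows. The file format (a type token optionally followed by an application ID) and the table contents are assumptions made for illustration; the description does not fix either.

```python
# Hypothetical input -> input application mapping table, as a module such as
# the SAM or Bootd might hold it. Entries are illustrative only.
INPUT_APP_TABLE = {
    "ATV": "com.webos.app.livetv",
    "HDMI_1": "com.webos.app.hdmi1",
    "MVPD_RF": "com.mvpd.app",
    "MVPD_IP": "com.mvpd.app",
}

def resolve_last_input(file_text):
    """file_text: contents of the lastinput file written by the EIM,
    assumed to be 'TYPE' or 'TYPE APPID'.
    Returns (input_type, app_id, is_mvpd_mode)."""
    parts = file_text.split()
    input_type = parts[0]
    # Prefer an app id recorded in the file; otherwise fall back to the table.
    app_id = parts[1] if len(parts) > 1 else INPUT_APP_TABLE.get(input_type)
    return input_type, app_id, input_type in ("MVPD_RF", "MVPD_IP")
```

This captures both paths in the text: matching against a mapping table, or reading the application directly out of the EIM's file.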
  • the MVPD application 2540 may launch its own MVPD related service, and at the same time, a service related to the XCAS 2550 may be launched from a PrivateControl plugged into the MVPDWin module 2520 at boot time.
  • the MVPD application 2540 may implement APIs such as downloadChTable and saveLastChInfo in some cases.
• After the MVPD application 2540 is executed and the MVPDWin 2520 no longer exists, the channel manager 2560 provides a service for receiving a Luna service call or the like requested by the MVPD application 2540.
• When the input mode is MVPD, it may be displayed along with the other inputs in the list of an input selector application such as the input hub (picker) 2530.
• The MVPD application may appear as a default when an application other than an input application is executed while the MVPD service is being viewed and that application is then terminated or moved to the background.
• The MVPD service may thus be seen and booted like the other inputs. This may be related to the seamless change in digital devices employing the webOS platform.
• The input mode may be activated after authentication by each MVPD server once the application is installed.
• FIG. 26 is a sequence diagram for processing an MVPD service according to one embodiment of the present invention.
  • FIG. 26 assumes, for example, that an MVPD application is already installed in a webOS device.
• The definition of the EIM or its predefined content may be extended.
• A design for dynamically adding the MVPD input to the default input applications in the EIM is required.
• An associated API (addDevice) may be used for this purpose.
• The EIM may extend the API (getAllInputStatus) used to configure the input information in the input hub.
• The webOS device provides the same MVPD service when the power is turned on again, i.e., after a reboot.
• The MVPD application is typically a web application, and at boot-up the first web application may take more than 30 seconds to appear, for example, based on the initial release, which may not meet the MVPD requirements. Accordingly, to satisfy the above-described MVPD requirements and for the user's convenience, a native framework, that is, MVPDWin, is used. As a result, a screen may be presented to the user before the application is executed.
• MVPDWin is launched when TVWin determines the type of the last input application and recognizes the MVPD mode at boot time; at this point, as described above, TVWin terminates. Meanwhile, MVPDWin may first process some functions provided by the conventional MVPD, such as streaming and channel switching. In this case, pay channels may be excluded from the channel switching. In addition, MVPDWin may launch services dependent on the MVPD.
  • the service dependent on the MVPD may include an XCAS manager service.
• The MVPD server 2610 receives an authentication request from the MVPD application 2620 (S2604) and returns an authentication result (S2606).
• Upon receiving the authentication result of S2606, the MVPD application 2620 transmits the addDevice API to the EIM 2630 (S2608).
• The EIM 2630 returns a result for the API transmission (S2610).
• The input hub (picker) 2640 is launched by the MVPD server 2610 (S2612).
• The input hub 2640 launched in step S2612 transmits the getAllInputStatus API to the EIM 2630 (S2614), and the EIM 2630 returns an input list according to the API (S2616).
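The sequence S2604-S2616 can be modeled with three plain objects. The method names mirror the APIs named in the text (addDevice, getAllInputStatus); the classes, return shapes, and authentication logic are illustrative stand-ins, not the platform's actual Luna interfaces.

```python
class EIM:
    def __init__(self):
        self.inputs = ["ATV", "HDMI_1"]   # hypothetical default input list
    def add_device(self, name):           # addDevice API (S2608/S2610)
        self.inputs.append(name)
        return {"returnValue": True}
    def get_all_input_status(self):       # getAllInputStatus API (S2614/S2616)
        return list(self.inputs)

class MVPDServer:
    def authenticate(self, app):          # stand-in for real authentication
        return True

class MVPDApp:
    def register(self, server, eim):
        # S2604/S2606: authenticate first; only then register as an input.
        if server.authenticate(self):
            return eim.add_device("MVPD")
        return {"returnValue": False}
```

After registration, an input hub querying `get_all_input_status()` sees the MVPD entry alongside the other inputs, which is the point of the sequence.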
  • FIG. 27 is a block diagram illustrating an MVPD service processing module according to an embodiment of the present invention.
• For the MVPD service processing, a TV service module 2710, an MVPD service module 2720, a channel manager 2730, an XCAS manager module 2740, and the like are included.
• The MVPD service module 2720 includes a QML, which may exist for each MVPD service provider. Accordingly, a different QML may be loaded for the MVPD service according to the last input application. In addition, within the MVPD service module 2720, a banner may be obtained by the loaded QML, and a channel map may be obtained from the channel manager 2730. If there is a service dependent on the given MVPD, it can be launched here; for example, the XCAS manager may be included.
  • FIG. 28 illustrates a seamless process when the input mode is MVPD according to an embodiment of the present invention.
• When the input mode is MVPD in the webOS device, as described above, a mute may occur due to re-initialization of the media when switching from MVPDWin to the MVPD application. This may affect usability, for example, and requires a design for a seamless change.
• To this end, the TV service may block the routines that initialize the pipeline upon booting for TV or external input.
• MVPDWin receives and checks the last input application information file (/var/lib/eim/lastinput) stored by the EIM. In other words, MVPDWin determines from the EIM's file whether the type of the last input application indicates the MVPD mode.
• MVPDWin controls the pipeline-related processing otherwise handled by the TV service. For example, MVPDWin requests the TV service to initialize the pipeline and to set the initial pipeline; an associated API can be defined for this purpose. Afterwards, when the MVPD application appears on the screen, the initial pipeline can be passed to it as described above.
• The TV service 2810 reads the last input application information from the EIM's file (/var/lib/eim/lastinput) and checks its type. If the type is MVPD_RF or MVPD_IP, the TV service 2810 skips the pipeline initialization process.
• QtTVWin 2830 reads the last input application information and checks its type, as the TV service 2810 described above did. If the checked last input application type is MVPD_RF or MVPD_IP, the corresponding content is transmitted to MVPDWin 2840 and QtTVWin stops performing its function.
• MVPDWin 2840 opens the pipeline and requests a pipeline ID return.
• The pipeline ID may be called broadcastId (luna://com.webos.service.tv.broadcast/open).
  • the MVPDWin 2840 then requests to set up an initial pipeline.
  • the pipeline may be delivered to the MVPD application 2860.
  • the MVPDWin 2840 then requests a connection with the VSM and sets a frequency.
  • bootd 2850 reads the last input application information and launches the last input application based on the read last input application information.
• The MVPD application 2860 requests pipeline opening from the TV service 2810 (luna://com.webos.service.tv.broadcast/open), and the TV service 2810 returns the pipeline ID of the initial pipeline already created.
  • the MVPD application 2860 also requests a connection with the VSM to the TV service 2810.
• When the MVPD application 2860 is ready, a minimal-boot-done (MBD) signal is transmitted to the Bootd 2850, and the MVPDWin 2840 receives the MBD signal. MVPDWin 2840 then removes all of its work, relaunches the MVPD application, and terminates its function. Accordingly, the MVPD application 2860 is launched seamlessly into the foreground.
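The hand-off above can be modeled as follows: MVPDWin opens the broadcast pipeline early, and after the MBD signal the MVPD application obtains the same pipeline ID instead of creating a new pipeline, which is what makes the foreground switch seamless. All identifiers and method names here are hypothetical stand-ins for the Luna calls named in the text.

```python
class TVService:
    """Toy stand-in for the TV service's pipeline bookkeeping."""
    def __init__(self):
        self._next_id = 100
        self._open = {}
    def open_pipeline(self, owner):
        # Seamless path: return the already-created pipeline if one exists.
        for pid in self._open:
            return pid
        pid = self._next_id
        self._next_id += 1
        self._open[pid] = owner
        return pid

class MVPDApp:
    def take_over(self, tv):
        # Reuses the initial pipeline: no re-initialization, hence no mute.
        self.pipeline_id = tv.open_pipeline("MVPDApp")

class MVPDWin:
    def boot(self, tv):
        self.alive = True
        self.pipeline_id = tv.open_pipeline("MVPDWin")  # early open at boot
    def on_mbd_signal(self, app, tv):
        app.take_over(tv)   # relaunch the MVPD application...
        self.alive = False  # ...and terminate MVPDWin's own function
```

The key property is that both parties end up holding the same pipeline ID, so the pipeline is opened exactly once.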
• The last input list is maintained by the TV service for the seamless change processing, and other modules obtain and process the application mapped to the last input from the last input list as needed.
• The modules for processing the seamless change may include the SAM, Bootd, LSM, and the like. Since it is difficult for each module to obtain the input list dynamically, it may be necessary to map a distributed, static input list to the input applications.
• FIGS. 29 and 30 are sequence diagrams illustrating last-input processing associated with a seamless change according to an embodiment of the present invention.
• Processing such as the seamless change is performed through the Bootd 2910, LSM 2920, TV service 2930, MVPDWin 2940, media server 2950, web kit 2960, MVPD application 2970, and the like.
  • the TV service 2930 skips initial pipeline generation (S2901).
  • the MVPDWin 2940 loads a keep_session to the media server 2950 (S2902).
  • the media server 2950 generates a connectionID (S2904), requests the media pipeline 2955 to generate a pipeline, and loads (S2906 and S2908).
  • the media pipeline 2955 registers pipelineID with the VSM in the TV service 2930 (S2910).
• The media server 2950 delivers a load complete_event to the MVPDWin 2940 (S2912).
  • the MVPDWin 2940 makes a request for connection and display window setting using the pipelineID to the VSM in the TV service 2930 (S2914 and S2916).
  • the MVPDWin 2940 makes a play request to the media server 2950 (S2918), and the media server 2950 requests the play request to the media pipeline 2955 (S2920).
• The Bootd 2910 may request launch_Hidden to launch the MVPD application 2970 (S2924). Then, after the MVPD application 2970 prepares itself without the video tag (S2926), the video tag is loaded into the web kit 2960 (S2928).
• The MVPD application 2970 requests generation of the MBD signal from the Bootd 2910 (S2930), and when the application is ready, the Bootd 2910 transmits the MBD signal to the MVPDWin 2940 (S2932). After waiting for an idle state, the MVPDWin 2940 issues a show to the MVPD application 2970 (S2934 and S2936).
• The LSM transmits a minimize to the MVPDWin 2940 (S2938), and the MVPDWin 2940 closes after receiving the minimize (S2940).
  • the media server 2950 loads the previously assigned pipelineID and complete_event into the web kit 2960 (S2944 and S2946).
  • the web kit 2960 requests connection with the VSM in the TV service 2930 and setting the display window based on the pipelineID (S2948 and S2950). Then, the web kit 2960 loads the play request to the media server 2950 (S2952).
  • FIG. 31 illustrates a process block diagram for a seamless change according to an embodiment of the present invention
  • FIG. 32 illustrates a process sequence diagram performed in the process block of FIG. 31.
• The basic concept here is not to manage the last input list and the mapping table for the last input application in the SAM 3130, Bootd 3140, and the like, but rather to hold and manage the data, files, etc. for the last input application in the EIM 3120.
• The premise is that the MVPD application must be installed and authenticated.
• The SAM 3130 and the Bootd 3140 may acquire the last input application directly from the EIM 3120, instead of requesting it from the TV service 3110 and obtaining it from their own mapping tables.
• The EIM 3120 subscribes to both the getForegroundAppInfo API of the SAM 3130 and the getLastInput API of the TV service 3110 to set the last input application.
• The last input application can be set using these. If the input is not MVPD, data may be received from both subscription APIs; however, in the case of external input, since the information from the TV service 3110 may be more accurate as a matter of platform policy, the last input application may be set based on the information from the TV service 3110.
• The EIM 3220 subscribes to getForegroundAppInfo of the SAM 3240 (S3202) and to getLastInput of the TV service 3210 (S3204).
• Steps S3202 and S3204 may be performed at the same time, and their order may be reversed.
• The SAM 3240 receives setForegroundAppInfo (S3206) and sends getForegroundAppInfo to the EIM 3220 (S3208).
• The EIM 3220 obtains the type information from the database using the MVPDAppId (S3210).
• When the getLastInput request is received from the TV service 3210 (S3212), the EIM 3220 obtains the type and appId from the database using the INPUT_NAME (S3214).
• The EIM 3220 transmits setSystemSettings to the setting service 3250 (S3216) and, in the case of MVPD, records the type and last input application to the file system 3260 (S3218).
• After rebooting, the Bootd 3230 transmits getLastInputApp to the file system 3260 (S3220) and acquires the last input application and type information (S3222). The Bootd 3230 then requests the SAM 3240 to launch the last input application (S3224).
• The SAM 3240 receives notifySplashTimeout from the outside (S3226), transmits getLastInputApp to the file system 3260 (S3228), and launches the last input application (S3230).
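The EIM behaviour of FIGS. 31 and 32 can be sketched as a small store. The managed-app table, file representation, and method names are assumptions; only the two subscription sources and the "TV service wins for external input" policy come from the text.

```python
# Hypothetical table of MVPD applications managed by the EIM.
MVPD_APPS = {"com.mvpd.app": "MVPD_RF"}

class EIMStore:
    def __init__(self):
        self.file = None  # stands in for the /var/lib/eim/lastinput file
    def on_foreground_app(self, app_id):
        # Subscription reply from the SAM's getForegroundAppInfo:
        # only applications matching a managed MVPD input are recorded.
        if app_id in MVPD_APPS:
            self.file = (MVPD_APPS[app_id], app_id)
    def on_last_input(self, input_name, app_id):
        # Subscription reply from the TV service's getLastInput: for external
        # input, the TV service's information wins as a matter of policy.
        self.file = (input_name, app_id)
    def read_after_reboot(self):
        # What Bootd/SAM would read back via getLastInputApp after reboot.
        return self.file
```

The EIM is the only writer; Bootd and the SAM only read the stored value back, matching the read/write ownership described for the file.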
• FIG. 33 is a diagram illustrating an input focus change according to one embodiment of the present invention.
• The change of the input focus means that the input focus naturally changes to the focused window when moving from MVPDWin to the MVPD application.
• When a key event is received at the IM 3340, the IM 3340 sends a sendEvent to the LSM 3330.
• The LSM 3330 transmits sendKeyToFocusWindow() through the qt-wayland-server 3332 to the MVPDWin 3310 and the MVPD application 3320 through the qt-wayland-clients 3312 and 3324, respectively.
• The m_focusObject of the LSM 3330 holds the currently focused client object and delivers the received event, that is, sendKeyToFocusWindow, to it.
• The MVPD application 3320 processes the sendKeyToFocusWindow() received through the qt-wayland-client 3324 via the web app manager 3322.
• However, the MVPDWin may not be a wayland client; in this case, an additional design may be required.
• Hereinafter, the EIM is described in more detail, including storing the last input application and delivering it to the setting service.
  • FIG. 34 is a view illustrating in more detail the processing method or function of the EIM according to an embodiment of the present invention.
• FIG. 34A shows a method that is not the main processing method of the EIM, whereas FIG. 34B shows the main processing method of the EIM according to the present invention.
• The information of the input application currently in the foreground is known in the EIM and set in the setting service.
• For a normal input, an event may be received as a subscription reply from the last input of the TV service.
• The event includes the type information of external inputs such as ATV, HDMI_1, RGB, etc., and the EIM sets the last input application by finding it in the input type to input application mapping table managed by the EIM.
• Here, the foreground includes not only the case where the input application becomes the foreground but also the case where the TV appears as a PIG in the TV guide application.
• In relation to the PIG case, MVPD inputs can be handled with a subscription reply of the SAM's getForegroundAppInfo.
• If the application matches an MVPD input application it manages, the EIM records the application to a file and to the setting service.
• The EIM is the owner for writing the file; the SAM, Bootd, and TV services are not writers and only have read permission.
• FIG. 35 is a diagram for explaining a method of recognizing an MVPD as an input mode according to an embodiment of the present invention.
• The MVPD application is recognized and processed as an input mode on the webOS device.
  • the MVPD application 3510 requests authentication to the application server 3520.
• The MVPD application 3510 registers itself as an input mode by sending addDevice() to the EIM 3530.
• The EIM 3530 then sends the input mode and the like through getAllInputStatus() to the input hub (picker) 3540, which is a subscriber.
  • FIG. 36 illustrates UX in which MVPD is included in an input mode according to FIG. 35 described above.
• A first region 3610 displays the input list, which also includes an MVPD item 3612.
• In a second region 3650, specific details of the MVPD service (Pay TV) are output according to the selection (pointing, hovering, etc.) of the MVPD item 3612 in the first region 3610.
• For example, an image (a PIG substitute image) 3652, an icon, and a description thereof may be output.
• An item for deleting a given MVPD application is also provided in the second region 3650.
• All the information about the MVPD input can be obtained from the EIM, and this can serve as the information in the second region 3650.
• Image paths obtained from the EIM can be read by the input hub application. For example, images related to normal inputs are currently in /usr/palm/plugins/inputapps/assets and are accessible from the input hub (picker).
  • FIG. 37 is a diagram illustrating bootd according to one embodiment of the present invention.
• The Bootd 3710 is used to modify the boot sequence and to launch the last input application.
• The Bootd 3710 reads the last input application information from the last input 3720 and, if the type of the last input application is MVPD, runs the boot sequence by commanding the modules dependent on the Bootd 3710, for example, the media server, webOS-connman-adaptor, settings service, connman-adaptor, wpa-supplicant, and so on, to start.
• The Bootd 3710 commands the launch of the last input application 3630 when the web application launch is prepared.
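The Bootd flow of FIG. 37 can be sketched as follows. The module names are taken from the description above; the `start` and `launch` callables and the type strings are hypothetical hooks.

```python
# Modules named above as dependents of Bootd in the MVPD case.
MVPD_DEPENDENT_MODULES = [
    "media-server",
    "webos-connman-adaptor",
    "settings-service",
    "wpa-supplicant",
]

def boot_sequence(last_input_type, last_input_app, start, launch):
    """Read the last input application type; in the MVPD case, start the
    dependent modules first, then launch the last input application once
    the web application launch is prepared."""
    if last_input_type in ("MVPD_RF", "MVPD_IP"):
        for module in MVPD_DEPENDENT_MODULES:
            start(module)
    launch(last_input_app)
```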
• The TV service is responsible for the core functions of basic broadcast reception and the channel-related functions of the TV; in particular, if the provider is RF-based (C&M, T-board, etc.), there is an implementation dependency on the TV service.
• The CM, PSIP, SDEC, tuner, etc. of the TV can be extended to be implemented in conjunction with the operator's CAS service.
  • the TV service also performs TV pipeline initialization and VSM connection for TVWin's channel switching.
  • the TV service may seamlessly pass the pipeline ID (pipelineID) when the live TV application is launched.
  • the operator's XCAS manager may be linked with, for example, a TV service.
• The TV service may handle exceptions for the MVPD mode. For example, to determine whether the device booted in the MVPD mode, it refers to the /var/lib/eim/lastinput file managed by the EIM.
• When booting in the MVPD mode, the structure may be defined so as not to pass through the TV pipeline initialization and VSM connection portions that are processed for TVWin.
• In this case, the initialization is performed by MVPDWin.
  • the TV service may add or extend a channel manager (CM) API.
  • the TV service may further implement a channel switching function using a frequency, a program number, channel information, and the like required by an operator.
• The TV service can improve channel change performance. For example, the request from the CM to the PSIP module to obtain the audio PID and video PID takes a long time to answer. Accordingly, by storing and using the channel information (channel info), the TV service may support the channel manager, the PSIP module, and the like so as to satisfy a similar channel switching speed for the MVPD.
• The TV service requires the service provider to deliver the program map table (PMT) so that the audio PID, video PID, etc. can be obtained using only the PMT, or to parse the virtual channel table (VCT).
• The area to which this optimization applies may vary depending on the region.
• Since the MVPD already knows the information of the corresponding program, only the PAT (Program Association Table) and the corresponding PMT are parsed in the PSIP module, and a channel information event is sent to the channel manager. The channel manager then needs to match the PIDs to the program only by program number.
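The fast-path channel change described above can be sketched with simplified PSI tables: only the PAT and the single PMT for the known program are consulted, and the match is by program number alone. The dictionary shapes are stand-ins for real MPEG-2 section parsing.

```python
def find_av_pids(pat, pmts, program_number):
    """pat: {program_number: pmt_pid};
    pmts: {pmt_pid: {"video": pid, "audio": pid}}.
    Returns (video_pid, audio_pid), or None if the program is unknown."""
    pmt_pid = pat.get(program_number)
    if pmt_pid is None:
        return None
    # Only this single PMT is parsed; no full channel scan is needed.
    pmt = pmts[pmt_pid]
    return pmt["video"], pmt["audio"]
```

Because the MVPD supplies the program number directly, the lookup avoids parsing the rest of the PSIP tables, which is the source of the speed-up claimed above.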
• The digital device can also process the DVB method, the DTMB/CMMB method, their tables, and the like, and can include a module therefor. Therefore, even in such a case, the TV service may process the channel information in the same or a similar manner as described above through the corresponding module.
• As for the IP-based MVPD service, functions similar to those of the aforementioned RF-based MVPD service are supported.
• That is, the MVPD operator embeds the set-top box into the device, the MVPD is regarded as a virtual external input and its UX is output, and it is accessible through the external input in addition to the MVPD application access method.
• When accessing the IP-based MVPD service, channel up/down functions for the live stream are supported afterwards, and streaming is additionally supported at the native stage for some time before the MVPD web application is output after booting. This has been fully described for the aforementioned RF-based MVPD service.
  • FIG. 38 illustrates a block diagram for recognizing and processing an IP-based MVPD service as an input mode in a device according to an embodiment of the present invention.
• FIG. 38 reuses the configuration described in FIG. 25, and overlapping descriptions are omitted.
• The configurations different from FIG. 25 include a web kit 3808, a media server 3810, a pipeline processing unit 3812, a URL-based channel manager 3852, a processing configuration corresponding to the XCAS manager 3840, and the like.
• The IP-based MVPD service of FIG. 38 may have a boot sequence of Bootd different from that of the aforementioned RF-based MVPD service.
• The boot sequence is optimized for the TV, external inputs, and the like.
• An input picker may be executed within, for example, 15 seconds after booting to enable an input change.
• The basic concept of Bootd reading the last input application type and launching the required modules is similar.
• The last input application can be output by either TVWin or MVPDWin.
• FIG. 39 is a diagram illustrating a media framework for an IP-based MVPD service according to an embodiment of the present invention, and FIG. 40 is a diagram illustrating a media framework according to another embodiment of the present invention.
• The media framework can be used in conjunction with a media server (uMediaServer) 3940; the starfish-media-pipeline supports most media, and other pipelines, such as a camera pipeline, exist separately.
• DRM (Digital Rights Management) can be supported by porting it to a gStreamer plug-in.
• FIG. 39 illustrates a media framework that does not consider downloadable issues in relation to the MVPD service, whereas FIG. 40 illustrates a media framework that considers the downloadable issues.
• A basic API defined to cope with MVPDs, other external service providers, etc. in the webOS platform may be the uMediaServer API.
• The service provider's player makes requests and receives responses using the <video> tag in the web application, and a native player porting can use the standard gStreamer pipeline.
• The gStreamer pipeline may be configured as a static pipeline.
• Service providers can port their DRM libraries to standard gStreamer plug-ins so that they can be used with the standard gStreamer pipeline.
• The media server may configure a separate MVPD-dedicated pipeline rather than the starfish-media-pipeline.
• The uMediaServer 3940 loads the MVPDWin 3920. The uMediaServer 3940 also loads the MediaAPI into the processing unit 3950.
• MVPDWin 3920 creates an MVPD player and creates a pipeline.
• The gStreamer pipeline is also created with custom gStreamer plug-ins 3964.
• The starfish-media-pipeline in the processing unit 3950 registers with the VSM 3912 in the TV service 3910.
• A loadCompleted event is delivered to the MVPDWin 3920.
• The MVPDWin 3920 makes a connection request with the VSM, and the MVPD application 3930 is launched through Bootd and the SAM. After the MVPD application 3930 is launched, it notifies the MVPDWin 3920 that the application is ready, and the MVPDWin 3920 no longer performs its function. Accordingly, the MVPD application 3930 communicates directly with the TV service or the uMediaServer 3940 to process the MVPD service.
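The boot-time pipeline reuse described above can be modeled as a toy uMediaServer: MVPDWin loads the pipeline early as a kept session, and the web kit's later load reuses it rather than re-initializing playback. The class and its load semantics are illustrative assumptions, not the real uMediaServer API.

```python
class UMediaServer:
    def __init__(self):
        self._sessions = {}  # connection id -> "kept for later reuse" flag
        self._next = 1
    def load(self, client, keep_session=False):
        # If a kept session exists, hand it over instead of building a new
        # pipeline: the new client inherits the running playback.
        for cid, kept in self._sessions.items():
            if kept:
                self._sessions[cid] = keep_session
                return cid
        cid = self._next
        self._next += 1
        self._sessions[cid] = keep_session
        return cid
```

The early load at boot and the MVPD application's later load resolve to the same connection ID, which is the "seamless" property the text describes.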
  • the uMediaServer 4040 may not control the pipeline session.
  • the original pipeline session owner is responsible for ensuring seamlessness.
  • the screen is played using the MVPDWin 4020 until the channel change is performed in the MVPD application 4034.
  • MVPDWin 4020 continues to handle playback, so it doesn't do anything to the pipeline when it receives a minimize event.
  • the UID may be rendered by the MVPD application 4034.
  • the event that the player should receive propagates to the media player of the web kit 4032 through the service of the MVPDWin 4020.
  • the current state of the MVPD player may be returned in response to a subscribe call.
  • the application proceeds with loading and transmits a kill call so that the MVPDWin 4020 no longer functions.
  • channel changes can use setUri.
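A channel change via setUri, as opposed to a full unload/reload of the pipeline, can be sketched as follows; the Pipeline class and its fields are assumptions for illustration only:

```javascript
// Illustrative sketch: a channel change updates the URI on the
// existing pipeline session instead of tearing it down and reloading.
class Pipeline {
  constructor(uri) {
    this.uri = uri;
    this.loads = 1;        // counts pipeline (re)loads
  }
  setUri(uri) {
    // Reuse the running pipeline; only the source URI changes,
    // so `loads` stays the same.
    this.uri = uri;
  }
}

const pipeline = new Pipeline('mvpd://channel/7');
pipeline.setUri('mvpd://channel/11');  // channel change, no reload
```

Keeping the pipeline alive across channel changes is what allows the transition to remain seamless from the viewer's perspective.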
  • FIG. 41 is a block diagram illustrating a main configuration for providing an IP-based MVPD service according to an embodiment of the present invention.
  • FIG. 41A is a diagram for explaining that an MVPD application is first launched during a booting process
  • FIG. 41B is a diagram for explaining a case where there is a first channel change on an MVPD application according to FIG. 41A.
  • the MVPD application 4130 is launched through Bootd {"boot": true} 4140.
  • the MVPD application 4130 may be loaded without a video tag.
  • the MVPD application 4130 includes booting in the launch parameters, adds a video tag parameter {"firstUse": true}, and may be loaded first without the video tag.
  • the MVPD application 4130 may make exceptions to the web kit player as needed. For example, when the video tag parameter is true, no AV mute occurs after loadComplete, and setBlackBG is not invoked.
  • setUri is used as described above.
  • the MVPD application 4130 subscribes to media events with the MVPDWin player 4110.
  • the MVPDWin player 4110 returns the current player status to the MVPD application 4130 in response to the subscription. Meanwhile, when an event is received from the uMediaServer 4120, the MVPDWin player 4110 forwards the received event to the MVPD application 4130.
  • the first channel change of FIG. 41B may only be processed via MVPDWin 4110 and MVPD application 4130, for example, as shown.
  • when a channel change request is made after the MVPD application 4130 is launched, the MVPD application 4130 loads a video tag. In this case, the MVPD application 4130 loads a URI through the web kit player. If {"boot": true} is already included in the launch parameters, the URI is loaded without writing that parameter again during video tag loading.
  • the MVPD application 4130 then sends a kill command to the MVPDWin player to handle the channel change.
  • FIGS. 42 and 43 are block diagrams illustrating a main configuration for providing an IP-based MVPD service according to another embodiment of the present invention.
  • FIGS. 42 and 43 employ a different approach from FIG. 40 described above, for example, changing the owner of the pipeline session.
  • FIGS. 42 and 43 can be designed along lines that do not conflict with the policy of the existing media server.
  • the media server may not be the owner of events and commands such as play, stop, pause, unload, and the like.
  • the media server may not disclose a mediaId that it manages to anyone other than the owner of that mediaId, that is, the subject that requested the load.
  • the media server does not manage the client corresponding to the mediaId, and may not participate in or extend exceptions specific to a particular pipeline, such as the MVPD pipeline, beyond general pipeline characteristics.
  • the media server may allow only one client (owner) for one mediaId.
  • MVPDWin is the owner of the mediaId.
  • the media server may not manage client information.
  • MVPDWin may identify the requester and return the mediaId.
  • the functionality of the media server may be extended correspondingly; for example, the media server may be extended so that control of an existing pipeline can be taken over by loading with a mediaId. To do this, the existing load() API is extended to recognize a mediaId as an argument in addition to the URI.
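The extended load() behavior can be sketched as follows: when a mediaId accompanies the URI, the media server attaches the caller to the existing pipeline rather than spawning a new pipeline process. The class below is a mock for illustration; the real uMediaServer interface is not reproduced here:

```javascript
// Sketch of an extended load() API: load(uri) creates a new pipeline
// and returns its mediaId; load(uri, mediaId) with a known mediaId
// reuses the existing pipeline instead of spawning a new process.
// Illustrative only.
class MediaServerExt {
  constructor() {
    this.pipelines = new Map();   // mediaId -> pipeline record
    this.nextId = 1;
  }
  load(uri, mediaId) {
    if (mediaId && this.pipelines.has(mediaId)) {
      return mediaId;             // take over the existing pipeline
    }
    const id = `media${this.nextId++}`;
    this.pipelines.set(id, { uri }); // new pipeline process
    return id;
  }
}

const ms = new MediaServerExt();
const id1 = ms.load('mvpd://channel/3');        // first load: new pipeline
const id2 = ms.load('mvpd://channel/3', id1);   // load with mediaId: reuse
```

The design choice here is that the mediaId, not the URI, identifies the pipeline, so a second client supplying the same mediaId inherits control without a second pipeline being created.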
  • the media server is not a subject that generates events. Therefore, events should be raised up to the media server from the media pipeline, which sits below the media server.
  • the webkit player may process the default operation.
  • the default operation may mute the AV from after loadCompleted to before playing.
  • the default operation is achieved by blacking the player background of the graphics stage from just before loading until the VSM is connected.
  • the above may be the basic operation of the current player.
  • the web kit player can handle "launch on boot" as an exception and disable the corresponding code.
  • the web kit player may also add a new exception handling routine to an existing player or create a new player.
  • the MVPD application 4230 notifies that the application is ready to the MVPDWin player 4210, and the MVPDWin player 4210 commands relaunch to the MVPD application 4230.
  • the MVPD application 4230 may receive a launch command from the boot 4240 and may be ready, and the MVPDWin 4210 may receive an event from the media server 4220.
  • the MVPD application 4230 loads the video tag upon receiving the relaunch command, and loads the URI into the webkit player.
  • the web kit player transmits loadExist(URI) to the media server 4220 through the uMC, receives a mediaId, and subscribes with the returned mediaId.
  • if the mediaId is included in the load parameters, the media server 4220 queues and pends the request on the pipeline having that mediaId instead of loading a new pipeline process.
  • when a subscription request for this mediaId is received from the MVPD application 4230, the client watcher of the old uMC handle is removed and the subscription of the old uMC handle is cancelled; the new uMC handle is subscribed with the mediaId, a client watcher is added for the new uMC handle, and the pending load request is handled.
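The subscription handoff just described can be sketched as a small manager: the old handle's subscription is cancelled, the new handle takes over the mediaId, and any load event queued before the subscription arrived is delivered to the new handle. All names are illustrative assumptions:

```javascript
// Sketch of the subscription handoff: when a new uMC handle subscribes
// to a mediaId, the old handle's watcher/subscription is dropped, the
// new handle is registered, and pending load events are delivered to
// it. Illustrative; not the real uMediaServer internals.
class SubscriptionManager {
  constructor() {
    this.subs = new Map();      // mediaId -> active uMC handle
    this.pendingLoads = [];     // events queued before subscription
  }
  subscribe(mediaId, handle) {
    const old = this.subs.get(mediaId);
    if (old) old.cancelled = true;     // cancel old subscription/watcher
    this.subs.set(mediaId, handle);    // new handle owns the mediaId
    // deliver any load that was pending before the subscription
    const queued = this.pendingLoads.splice(0);
    queued.forEach(event => handle.events.push(event));
  }
}

const mgr = new SubscriptionManager();
const oldHandle = { events: [], cancelled: false };
mgr.subs.set('media1', oldHandle);        // MVPDWin's original handle
mgr.pendingLoads.push('loadCompleted');   // raised before new subscriber
const newHandle = { events: [], cancelled: false };
mgr.subscribe('media1', newHandle);       // MVPD application subscribes
```

Queuing before the handoff is what prevents events from being lost, or delivered to a handle that is about to be killed, during the transition.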
  • the media server 4220 delivers the received loadExist(URI) to the media pipeline, and the media pipeline delivers events back to the media server.
  • the delivered events may include sourceInfo, loadCompleted, videoInfo, audioInfo, and the like.
  • the media server 4220, which receives the event from the media pipeline 4222, forwards it to the MVPD application.
  • the MVPD application 4230 requests a connection with the VSM 4252 in the TV service 4250.
  • the media server 4220 commands a disconnect to the MVPDWin player 4210, and the MVPDWin player 4210 receives the disconnect and kills itself.
  • the relaunch time point may be, for example, when MVPDWin is in an idle state. For example, when an event for a request such as a channel change is received and processing is completed, the relaunch may be performed.
  • the life cycle of MVPDWin lasts until the media server 4220 completes the process for seamlessness.
  • the process completion time point means the seamless handling, that is, handing over the subscription.
  • the life cycle of MVPDWin also requires exception handling for the self-kill routine in relation to the MVPD application.
  • the reason for pending the load in the media pipeline is that, if an event is raised in the media pipeline before the subscription, the event is raised to the MVPDWin player rather than to the web kit; the load is therefore queued and delivered after the subscription.
  • when the media pipeline 4222 receives a URI and other load parameters, it may decide whether to maintain or reload the pipeline by matching the already loaded URI against the URI to be loaded and comparing the remaining parameters with the existing state.
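The keep-or-reload decision can be sketched as a pure function over the current state and the incoming request; the field names below are illustrative, not the pipeline's actual parameter schema:

```javascript
// Sketch of the keep-or-reload decision: the already-loaded URI is
// matched against the requested URI, and the remaining load parameters
// are compared with the existing state. Illustrative only.
function decideReload(current, request) {
  if (current.uri !== request.uri) return 'reload';  // different source
  // same URI: keep only if every requested parameter matches state
  const sameParams = Object.keys(request.params || {})
    .every(k => current.params[k] === request.params[k]);
  return sameParams ? 'keep' : 'reload';
}

const current = { uri: 'mvpd://ch/3', params: { drm: 'on' } };
```

A call with the same URI and matching parameters keeps the pipeline; a different URI or a changed parameter (e.g., a different DRM setting) triggers a reload.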
  • a method of generating a dedicated player for a service as described above will now be described with reference to FIG. 43.
  • control of the pipeline session of the media server 4330 does not move.
  • the above-mentioned pipeline session owner is responsible for playback so that seamlessness is guaranteed.
  • MVPDWin 4310 may not perform any processing on the pipeline even when receiving the minimize event.
  • the UID is drawn by the MVPD application 4340.
  • events that the player should receive are, for example, propagated to the web kit player through the service of the MVPDWin 4310, and the current state of the MVPD player 4320 is passed back in return of a subscription call according to the status of the MVPDWin 4310 player.
  • the application loads and transmits a kill call to the MVPDWin 4310.
  • channel changes are made using setUri.
  • MVPDWin 4310 kills itself when it is minimized after being launched by TVWin.
  • the MVPD application 4340 moves seamlessly to the foreground after the MVPD application 4340 has stabilized, sent a ready notification to the MVPDWin 4310, and received from the MVPDWin 4310 a relaunch command to show the MVPD application 4340.
  • when the MVPD application 4340 goes to the foreground, it subscribes to events of the MVPD player 4320, i.e., the dedicated player for the MVPD application; the MVPD player 4320 returns the current state of the player, and the MVPD application (web kit player) 4340 sets that state.
  • FIG. 43 is similar to FIG. 42, but differs in the content related to the media pipeline and the MVPD dedicated player.
  • an RF and/or IP based MVPD service may be supported and processed in a digital device equipped with the webOS platform. The MVPD service can be provided in a form similar to external inputs such as HDMI and component, providing convenient access and use; it can be serviced and processed with the same or better performance compared to a conventional MVPD service or hardware set-top box(es); and, in supporting the MVPD service on the webOS platform, the service can be provided seamlessly even in the case of booting.
  • the digital device disclosed herein is not limited to the configurations and methods of the above-described embodiments; all or some of the embodiments may be selectively combined so that various modifications can be made.
  • the method of operating a digital device disclosed in the present specification may be embodied as processor readable codes on a processor readable recording medium included in the digital device.
  • the processor-readable recording medium includes all kinds of recording devices that store data readable by a processor. Examples of processor-readable recording media include read only memory (ROM), random access memory (RAM), CD-ROM, magnetic tape, floppy disks, optical data storage devices, and the like, and also include implementation in the form of a carrier wave.
  • the processor-readable recording medium can also be distributed over network coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.
  • the present invention relates to digital devices and control methods thereof, and has industrial applicability.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention relates, according to various embodiments, to a digital device and a control method thereof. A control method of a digital device according to one embodiment of the present invention may comprise the steps of: receiving a power-on signal; determining whether an application that was running at a power-off time point before receiving the power-on signal is an image output application; when it is determined that the application running at the power-off time point is the image output application, loading and executing the image output application; loading and executing a first image control application for controlling image output through the image output application; loading a second image control application; and, when the loading is completed, terminating the first image control application and executing the second image control application.
PCT/KR2015/001108 2014-02-26 2015-02-03 Dispositif numérique et son procédé de commande WO2015130024A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/121,931 US10063923B2 (en) 2014-02-26 2015-02-03 Digital device and control method thereof

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US201461945101P 2014-02-26 2014-02-26
US201461945113P 2014-02-26 2014-02-26
US61/945,101 2014-02-26
US61/945,113 2014-02-26
KR1020140131936A KR102224486B1 (ko) 2014-02-26 2014-09-30 디지털 디바이스 및 상기 디지털 디바이스에서 서비스 처리 방법
KR10-2014-0131935 2014-09-30
KR1020140131935A KR102268751B1 (ko) 2014-02-26 2014-09-30 디지털 디바이스 및 그 제어 방법
KR10-2014-0131936 2014-09-30

Publications (1)

Publication Number Publication Date
WO2015130024A1 true WO2015130024A1 (fr) 2015-09-03

Family

ID=54009298

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/001108 WO2015130024A1 (fr) 2014-02-26 2015-02-03 Dispositif numérique et son procédé de commande

Country Status (1)

Country Link
WO (1) WO2015130024A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019009453A1 (fr) * 2017-07-06 2019-01-10 엘지전자 주식회사 Dispositif d'affichage
US10560654B2 (en) 2017-09-20 2020-02-11 Lg Electronics Inc. Display device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090034140A (ko) * 2007-10-02 2009-04-07 삼성전자주식회사 복수의 포트를 갖는 메모리를 구비한 휴대 단말기 및 부팅제어 방법
WO2010147267A1 (fr) * 2009-06-17 2010-12-23 엘지전자 주식회사 Appareil d'affichage capable de fournir une information a un utilisateur lors d'un amorçage et son procede de commande
WO2011142498A1 (fr) * 2010-05-14 2011-11-17 엘지전자 주식회사 Télévision numérique et son procédé de commande
WO2012005421A1 (fr) * 2010-07-06 2012-01-12 엘지전자 주식회사 Procédé pour une extension d'application et appareil d'affichage d'image associé
KR20130132079A (ko) * 2012-05-25 2013-12-04 전자부품연구원 녹화 완료 시 디스플레이장치의 상태에 기초한 자동 모드 전환 방법 및 이를 적용한 방송수신장치


Similar Documents

Publication Publication Date Title
WO2016085094A1 (fr) Dispositif multimédia et procédé de commande associé
WO2015099343A1 (fr) Dispositif numérique et son procédé de commande
WO2016085070A1 (fr) Système de commande de dispositif, dispositif numérique, et procédé de commande pour commander ledit dispositif
WO2016027933A1 (fr) Dispositif numérique et son procédé de commande
WO2017003022A1 (fr) Dispositif d'affichage et son procédé de commande
WO2016104907A1 (fr) Dispositif numérique, et procédé de traitement de données par le même dispositif numérique
WO2016143965A1 (fr) Dispositif d'affichage, et procédé de commande correspondant
WO2016175361A1 (fr) Dispositif d'affichage et son procédé de commande
WO2017034065A1 (fr) Dispositif d'affichage et son procédé de commande
WO2016186254A1 (fr) Panneau d'affichage et son procédé de commande
WO2012030024A1 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
WO2012026651A1 (fr) Procédé de synchronisation de contenus et dispositif d'affichage permettant le procédé
WO2012015117A1 (fr) Procédé pour faire fonctionner un appareil d'affichage d'image
WO2011126202A1 (fr) Appareil d'affichage d'image et son procédé d'utilisation
WO2011136458A1 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
WO2012046928A1 (fr) Procédé de production de contenu publicitaire utilisant un dispositif d'affichage, et dispositif d'affichage à cet effet
WO2011132840A1 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
WO2016175356A1 (fr) Dispositif numérique et procédé de commande de dispositif numérique
WO2017135585A2 (fr) Haut-parleur principal, haut-parleur secondaire et système comprenant ceux-ci
WO2017200215A1 (fr) Terminal mobile et procédé de commande de celui-ci
WO2012030025A1 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
WO2017047868A1 (fr) Terminal mobile et procédé de commande correspondant
WO2020149426A1 (fr) Dispositif d'affichage d'image et son procédé de commande
WO2017034298A1 (fr) Dispositif numérique et procédé de traitement de données dans ledit dispositif numérique
WO2019221365A1 (fr) Téléviseur flexible et procédé associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15755799

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 15121931

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 15755799

Country of ref document: EP

Kind code of ref document: A1