KR20160047179A - Digital device and method of processing content thereof - Google Patents
Digital device and method of processing content thereof
- Publication number
- KR20160047179A (application number KR1020140143245A)
- Authority
- KR
- South Korea
- Prior art keywords
- application
- service
- request
- screen
- manager
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/4722—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/858—Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
- H04N21/8586—Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot by using a URL
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Human Computer Interaction (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
Disclosed herein are a digital device and a method of processing content in the digital device. According to the present invention, the content processing method comprises: outputting a live broadcast; receiving a first request from a user; outputting a first application execution screen on the live broadcast screen in response to the first request; receiving a second request from the user; switching from the first application execution screen to a second application execution screen on the live broadcast screen in response to the second request; receiving a third request from the user; and switching to a streaming service for the content selected in response to the third request and outputting the streaming service, wherein the first application is terminated in response to the second request.
Description
BACKGROUND OF THE INVENTION
The transition from analog broadcasting to digital broadcasting is under way. Compared with conventional analog broadcasting, digital broadcasting is more robust against external noise, so data loss is small, error correction is easier, and resolution is higher, providing a clearer picture. Unlike analog broadcasting, digital broadcasting also supports bidirectional services. In addition to conventional media such as terrestrial, satellite, and cable broadcasting, IPTV broadcasting services providing real-time broadcasting and CoD (Content on Demand) services over IP (Internet Protocol) are also being provided.
Conventionally, while one content item is provided on the screen, a TV cannot overlay and provide other content on top of it. Therefore, when a user of a conventional TV requests new content, there is the inconvenience that the currently running content must be ended before the new content can be used, and the new content must be ended again to return to the previous content. This is also true of recent TVs. For example, even if an application is provided while broadcast content is being presented, the application is a completely separate application, such as an SNS (Social Network Service) application, and has no connection with the broadcast content. Consequently, there is the problem that the entire menu screen must be called up to search for related content, and the content is provided again according to the user's selection from the search results.
In order to solve the above-described problems, an object of the present invention is to provide multi-tasking in which one or more applications related to the content being viewed are executed in the foreground, without switching screens, while the content is being viewed.
Another object of the present invention is to process a service change or service switch quickly according to such multi-tasking, and to improve user convenience and product satisfaction by making the applications easy to use.
A further object of the present invention is to improve user convenience and product satisfaction, and to increase the user's desire to purchase the product.
According to an aspect of the present invention, there is provided a method of processing content in a digital device, comprising: outputting a live broadcast; receiving a first request from a user; outputting a first application execution screen on the live broadcast screen in response to the first request; receiving a second request from the user; switching from the first application execution screen to a second application execution screen on the live broadcast screen in response to the second request; receiving a third request from the user; and switching to a streaming service for the content selected in response to the third request and outputting the streaming service, wherein the first application is terminated in response to the second request.
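The claimed request-handling flow can be sketched as a small state machine. This is an illustrative sketch only, not the patent's implementation; the class, method, and layer names below are assumptions introduced for the example.

```javascript
// Sketch of the claimed flow: a live broadcast is playing; the first request
// overlays application A in the foreground, the second request replaces A
// with application B (terminating A, per the claim), and the third request
// switches the output to a streaming service for the selected content.
class ContentProcessor {
  constructor() {
    // What is currently on screen, ordered bottom-up.
    this.output = ['live-broadcast'];
  }
  handleFirstRequest() {
    // First application execution screen appears on the live broadcast screen.
    this.output.push('app-A');
  }
  handleSecondRequest() {
    // The first application is terminated in response to the second request.
    this.output = this.output.filter(layer => layer !== 'app-A');
    this.output.push('app-B');
  }
  handleThirdRequest(selectedContent) {
    // Switch from the live broadcast to a streaming service for the content.
    this.output = [`streaming:${selectedContent}`];
  }
}

const cp = new ContentProcessor();
cp.handleFirstRequest();
cp.handleSecondRequest();
console.log(cp.output); // app-A is gone; app-B overlays the live screen
cp.handleThirdRequest('episode-42');
console.log(cp.output);
```

Note that the live broadcast layer survives the first two requests and is only replaced at the third, which mirrors the "without switching screens" behavior the description emphasizes.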
According to an embodiment of the present invention, there is provided a digital device comprising: a user interface unit for receiving first to third requests from a user; a controller for outputting a first application execution screen on the live broadcast screen in response to the first request, outputting a second application execution screen on the live broadcast screen in place of the first application execution screen in response to the second request, and switching to a streaming service for the content selected in response to the third request and outputting the streaming service; and an output unit for outputting the live broadcast screen, the first application execution screen, the second application execution screen, and the streaming service screen according to the third request, wherein the controller terminates execution of the first application in response to the second request.
The technical problems to be solved by the present invention are not limited to those described above, and other technical problems not mentioned herein will be clearly understood by those skilled in the art from the following description.
According to the present invention, there are the following effects.
According to an embodiment of the present invention, one or more applications related to the content being viewed can be executed in the foreground without switching screens, thereby supporting multi-tasking.
According to another embodiment of the present invention, applications can be used easily, and service switching and the like can be processed promptly in accordance with such multi-tasking, improving user convenience and product satisfaction.
According to another of the various embodiments of the present invention, user convenience and product satisfaction are improved, and the user's desire to purchase the product is increased.
The effects obtainable by the present invention are not limited to the above-mentioned effects, and other effects not mentioned will be clearly understood by those skilled in the art from the following description.
FIG. 1 is a schematic diagram illustrating a service system including a digital device according to an exemplary embodiment of the present invention;
FIG. 2 is a block diagram illustrating a digital device according to an embodiment of the present invention;
FIG. 3 is a block diagram illustrating a digital device according to another embodiment of the present invention;
FIG. 4 is a block diagram illustrating a digital device according to another embodiment of the present invention;
FIG. 5 is a block diagram illustrating a detailed configuration of the control unit of FIGS. 2 to 4 according to an embodiment of the present invention;
FIG. 6 illustrates input means coupled to the digital device of FIGS. 2 to 4 according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a web OS architecture according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating an architecture of a web OS device according to an embodiment of the present invention;
FIG. 9 is a diagram illustrating a graphic composition flow in a web OS device according to an embodiment of the present invention;
FIG. 10 is a diagram illustrating a media server according to an embodiment of the present invention;
FIG. 11 is a diagram illustrating a configuration block of a media server according to an embodiment of the present invention;
FIG. 12 is a diagram illustrating a relationship between a media server and a TV service according to an embodiment of the present invention;
FIG. 13 is a view for explaining a content processing method according to an embodiment of the present invention;
FIG. 14 is a view for explaining a content processing method according to another embodiment of the present invention;
FIG. 15 is a diagram explaining the scenario of FIG. 13 in more detail;
FIG. 16 is a diagram explaining the scenario of FIG. 14 in more detail;
FIG. 17 is a flowchart of a content processing method according to an embodiment of the present invention;
FIG. 18 is a flowchart of a content processing method according to another embodiment of the present invention; and
FIG. 19 is a flowchart of a content processing method according to another embodiment of the present invention.
Hereinafter, various embodiments of a digital device and a content processing method in the digital device according to the present invention will be described in detail with reference to the accompanying drawings.
The suffixes "module" and "unit" for components used in this specification are given only for ease of description, and the two may be used interchangeably as needed. Also, even when a component is described with an ordinal number such as "first" or "second," the component is not limited by such terms or ordinal numbers.
In addition, although the terms used in this specification have been selected from general terms widely used at present in consideration of their functions according to the technical idea of the present invention, they may vary depending on the intentions or customs of those skilled in the art, or on the emergence of new technology. In certain cases, some terms have been arbitrarily selected by the applicant, and their meanings will be described in the relevant description sections. Accordingly, each term should be interpreted based not merely on its name but on its practical meaning and on the contents described throughout this specification.
It is to be noted that the contents of this specification and/or the drawings are not intended to limit the scope of the present invention.
The term "digital device" as used herein includes all devices that perform at least one of transmitting, receiving, processing, and outputting data, content, or services. The digital device can be paired or connected (hereinafter, "paired") with another digital device, an external server, or the like through a wired/wireless network, and data can be transmitted/received through it. At this time, if necessary, the data may be appropriately converted before transmission/reception. Examples of the digital device include standing devices such as a network TV, an HBBTV (Hybrid Broadcast Broadband TV), a Smart TV, an IPTV (Internet Protocol TV), and a PC (Personal Computer), and mobile or handheld devices such as a PDA (Personal Digital Assistant), a smart phone, a tablet PC, and a notebook computer. To facilitate understanding and description of the present invention, FIG. 2, described later, illustrates a digital TV, and FIG. 3 illustrates a mobile device, as embodiments of the digital device. In addition, the digital device described herein may be a digital signage, a monitor, a display device consisting only of a panel, or a set-top box (STB), or may be part of one service system in conjunction with a server or the like.
The term "wired/wireless network" as used herein collectively refers to communication networks that support various communication standards or protocols for pairing and/or data transmission/reception between digital devices, or between a digital device and an external server. Such wired/wireless networks include all communication networks supported by current or future standards, and can support one or more communication protocols for them. Examples include standards or protocols for wired connections, such as USB (Universal Serial Bus), CVBS (Composite Video Banking Sync), Component, S-Video (analog), DVI (Digital Visual Interface), RGB, and D-SUB, and standards or protocols for wireless connections, such as Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, DLNA (Digital Living Network Alliance), WLAN (Wi-Fi), Wibro (Wireless Broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), LTE/LTE-Advanced (Long Term Evolution/LTE-Advanced), and Wi-Fi Direct.
In addition, when this specification simply refers to a "digital device," the term may mean either a fixed device or a mobile device depending on the context, and may mean both unless otherwise specified.
Meanwhile, a digital device is an intelligent device that supports, for example, a broadcast receiving function, a computer function, and at least one external input, and can support e-mail, web browsing, banking, games, applications, and the like through a wired/wireless network. In addition, the digital device may include an interface supporting at least one input or control means (hereinafter, "input means") such as a handwriting input device, a touch screen, or a space remote controller.
In addition, the digital device can use a standardized general-purpose OS (Operating System); in particular, the digital device described in this specification uses a web OS as an embodiment. Therefore, the digital device can handle adding, deleting, amending, and updating various services or applications on a general-purpose OS kernel or a Linux kernel, and through this a more user-friendly environment can be constructed and provided.
Meanwhile, the above-described digital device can receive and process an external input. The external input includes all input means or digital devices that are connected to the above-mentioned digital device via a wired/wireless network and can transmit/receive and process data through it. Examples of the external input include an HDMI (High Definition Multimedia Interface) device, a game device such as a PlayStation or an X-Box, a smart phone, a tablet PC, pocket photo devices such as a digital camera, a printing device, a smart TV, and a Blu-ray device.
In addition, the term "server" as used herein refers to a digital device or system that supplies data to, or receives data from, a digital device, that is, a client, and may also be referred to as a processor. Examples of the server include a portal server providing web pages, web content, or web services, an advertising server providing advertising data, a content server providing content, an SNS server providing a social network service (SNS), a service server provided by a manufacturer, an MVPD (Multichannel Video Programming Distributor) providing a VoD (Video on Demand) or streaming service, and a service server providing a pay service.
In the following description, for convenience, the term "application" may refer to an application or to a web application, depending on the context.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.
According to an exemplary embodiment of the present invention, a digital device includes: a user interface unit for receiving first to third requests from a user; a controller for outputting a first application execution screen on a live broadcast screen in response to the first request, outputting a second application execution screen on the live broadcast screen in place of the first application execution screen in response to the second request, and switching to a streaming service for the content selected in response to the third request; and an output unit for outputting the live broadcast screen, the first application execution screen, the second application execution screen, and the streaming service screen according to the third request, wherein the controller controls the first application to be terminated in response to the second request.
FIG. 1 is a schematic diagram illustrating a service system including a digital device according to an exemplary embodiment of the present invention.
1, a service system includes a
The
The
The
The above-described
The
The
Meanwhile, the
In addition, the
In FIG. 1, the
FIG. 2 is a block diagram illustrating a digital device according to an exemplary embodiment of the present invention.
The digital device described herein corresponds to the
The
The
The TCP /
The
The
The
The audio /
The application manager may include, for example, the
The
The
The
The
The
The
The SI &
The SI &
Meanwhile, the
FIG. 3 is a block diagram illustrating a digital device according to another embodiment of the present invention.
While FIG. 2 described above illustrates a fixed device as an example of the digital device, FIG. 3 shows a mobile device as another embodiment of the digital device.
3, the
Hereinafter, each component will be described in detail.
The
The
The broadcast-related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the
The broadcast-related information may exist in various forms, for example, in the form of an EPG (Electronic Program Guide) or an ESG (Electronic Service Guide).
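The broadcast-related information above can be pictured as structured guide data. The following is a hypothetical sketch of one EPG entry and a lookup over a guide; the field names are assumptions introduced for illustration and are not defined by the patent or by any EPG/ESG standard.

```javascript
// One hypothetical EPG (Electronic Program Guide) entry describing a
// broadcast program on a broadcast channel from a service provider.
const epgEntry = {
  channel: 'KBS1',
  programTitle: 'Evening News',
  startTime: '2014-10-22T18:00:00+09:00',
  endTime: '2014-10-22T19:00:00+09:00',
  serviceProvider: 'KBS',
};

// A guide is simply a list of such entries; this helper finds the
// program airing at a given instant.
function programAt(guide, isoTime) {
  const t = Date.parse(isoTime);
  return guide.find(
    e => Date.parse(e.startTime) <= t && t < Date.parse(e.endTime)
  );
}

console.log(programAt([epgEntry], '2014-10-22T18:30:00+09:00').programTitle);
```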
The
The broadcast signal and / or broadcast related information received through the
The
The
The short-
The
The A /
The image frame processed by the
The
The user input unit 330 generates input data for the user's control of the operation of the terminal. The user input unit 330 may include a key pad, a dome switch, a touch pad (pressure-sensitive/capacitive), a jog wheel, a jog switch, and the like.
The
The
The
The
Some of these displays may be transparent or light-transmissive so that the outside can be seen through them. Such a display may be referred to as a transparent display, a typical example of which is the TOLED (Transparent OLED). The rear structure of the
There may be two or
The
The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the
If there is a touch input to the touch sensor, the corresponding signal(s) are sent to the touch controller. The touch controller processes the signal(s) and transmits corresponding data to the
A
Examples of the proximity sensor include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of the pointer by the change in the electric field caused by the approach of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
Hereinafter, for convenience of description, the act of recognizing that the pointer is positioned over the touch screen without contacting it is referred to as a "proximity touch," and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch." The position of a proximity touch of the pointer on the touch screen is the position at which the pointer vertically corresponds to the touch screen during the proximity touch.
The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
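The distinction above between a contact touch and a proximity touch, and the derivation of a simple proximity touch pattern, can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the 20 mm sensing range and the sample format are assumptions.

```javascript
// Classify a sensed pointer distance: 0 mm is a contact touch, anything
// within an assumed 20 mm sensing range is a proximity touch.
function classifyTouch(distanceMm) {
  if (distanceMm === 0) return 'contact-touch';
  if (distanceMm <= 20) return 'proximity-touch';
  return 'none';
}

// One element of a proximity touch pattern: approach speed, computed from
// two samples as change in distance over time (mm/s toward the screen).
function proximitySpeed(sampleA, sampleB) {
  const dt = (sampleB.timeMs - sampleA.timeMs) / 1000;
  return (sampleA.distanceMm - sampleB.distanceMm) / dt;
}

console.log(classifyTouch(0));
console.log(classifyTouch(10));
console.log(proximitySpeed({ distanceMm: 20, timeMs: 0 },
                           { distanceMm: 10, timeMs: 100 }));
```

Other pattern elements the text mentions (direction, time, position, movement state) could be derived from the same sample stream in a similar way.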
The
The
The
The
The
The
The identification module is a chip for storing various information for authenticating the usage right of the
The
The
The
The various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
According to a hardware implementation, the embodiments described herein may be implemented using at least one of ASICs (application specific integrated circuits), DSPs (digital signal processors), DSPDs (digital signal processing devices), PLDs (programmable logic devices), FPGAs (field programmable gate arrays), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the embodiments described herein may be implemented by the
According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code may be implemented in a software application written in a suitable programming language. Here, the software code is stored in the
FIG. 4 is a block diagram illustrating a digital device according to another embodiment of the present invention.
Another example of the
The
For example, if the received RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF); if it is an analog broadcast signal, it is converted into an analog baseband video or audio signal (CVBS/SIF). That is, the
In addition, the
The
The
The stream signal output from the
The external
The external
The A/V input/output unit may include a USB terminal, a CVBS (Composite Video Banking Sync) terminal, a component terminal, an S-Video (analog) terminal, a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like.
The wireless communication unit can perform short-range wireless communication with another digital device. The
Also, the external
Meanwhile, the external
The
The
Meanwhile, the
In addition, the
The
The
The
In addition, the
The
FIG. 4 illustrates an embodiment in which the
The user
For example, the user
In addition, the user
The user
The
The video signal processed by the
The audio signal processed by the
Although not shown in FIG. 4, the
The
The
For example, the
The
On the other hand, the
In addition, the
On the other hand, when entering the application view item, the
The
Although not shown in the drawing, a channel browsing processing unit for generating a channel signal or a thumbnail image corresponding to an external input signal may be further provided.
The channel browsing processing unit receives a stream signal TS output from the
The
The
Meanwhile, the
The audio output unit 485 receives a signal processed by the
In order to detect the gesture of the user, a sensing unit (not shown) having at least one of a touch sensor, a voice sensor, a position sensor, and an operation sensor may be further included in the
On the other hand, a photographing unit (not shown) for photographing a user may be further provided. The image information photographed by the photographing unit (not shown) may be input to the
The
The power supply unit 490 supplies the corresponding power to the
Particularly, it is possible to supply power to a
To this end, the power supply unit 490 may include a converter (not shown) for converting AC power to DC power. Meanwhile, for example, when the
The
Also, the
The
In addition, the digital device according to the present invention may further include a configuration that omits some of the configuration shown in FIG. On the other hand, unlike the above, the digital device does not have a tuner and a demodulator, and can receive and reproduce the content through the network interface unit or the external device interface unit.
FIG. 5 is a block diagram illustrating a detailed configuration of the control unit of FIGS. 2 to 4 according to an embodiment of the present invention.
An example of the control unit includes a
The
The
The video decoder 425 decodes the demultiplexed video signal, and the
The
On the other hand, the video signal decoded by the
The
The
A frame rate conversion unit (FRC) 555 converts a frame rate of an input image. For example, the frame
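The frame rate conversion above can be illustrated with the simplest possible scheme, frame repetition, as when up-converting 60 Hz to 120 Hz or 240 Hz. This is a sketch under that assumption; real FRC hardware typically also inserts motion-interpolated frames rather than plain repeats.

```javascript
// Up-convert a frame sequence by repeating each frame `factor` times,
// e.g. factor = 2 turns 60 Hz input into 120 Hz output.
function convertFrameRate(frames, factor) {
  const out = [];
  for (const frame of frames) {
    for (let i = 0; i < factor; i++) {
      out.push(frame); // repeat the same frame; interpolation would go here
    }
  }
  return out;
}

const input60Hz = ['f0', 'f1', 'f2'];
const output120Hz = convertFrameRate(input60Hz, 2); // 60 Hz -> 120 Hz
console.log(output120Hz);
```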
The
Meanwhile, the audio processing unit (not shown) in the control unit can process the demultiplexed audio signal. The audio processing unit (not shown) may support processing of various audio formats. For example, even when an audio signal is coded in a format such as MPEG-2, MPEG-4, AAC, HE-AAC, AC-3, or BSAC, a corresponding decoder can be provided.
In addition, the audio processing unit (not shown) in the control unit can process bass, treble, volume control, and the like.
A data processing unit (not shown) in the control unit can perform data processing of the demultiplexed data signal. For example, the data processing unit can decode the demultiplexed data signal even when it is coded. Here, the encoded data signal may be EPG information including broadcast information such as a start time and an end time of a broadcast program broadcasted on each channel.
On the other hand, the above-described digital device is an example according to the present invention, and each component can be integrated, added, or omitted according to specifications of a digital device actually implemented. That is, if necessary, two or more components may be combined into one component, or one component may be divided into two or more components. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and devices thereof do not limit the scope of rights of the present invention.
Meanwhile, the digital device may be a video signal processing device that performs signal processing on a video stored in the device or on an input video. Other examples of the video signal processing device include a set-top box (STB), a DVD player, a Blu-ray player, a game device, a computer, and the like.
FIG. 6 is a diagram illustrating input means coupled to the digital device of FIGS. 2 through 4 according to one embodiment of the present invention.
A front panel (not shown) or a control means (input means) provided on the
The control means includes a
The input means may employ, as needed, at least one of communication protocols such as Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, and DLNA (Digital Living Network Alliance) to communicate with the digital device.
The
The
Since the
On the other hand, the control means such as the
The digital device described in this specification uses a web OS as an OS and/or platform. Hereinafter, web OS-based processing, such as configurations or algorithms, can be performed in the control unit of the above-described digital device. Here, the term control unit is used in a broad sense, encompassing the control units of FIGS. 2 to 5 described above. Accordingly, the hardware and components, including related software, firmware, and the like, for processing web OS-related services, applications, and content in the digital device are hereinafter referred to and described as a controller.
Such a web OS-based platform can enhance development independence and functional extensibility by integrating services, applications, and the like over, for example, a Luna-service bus, and can increase application development productivity. Multi-tasking can also be supported by efficiently utilizing system resources through web OS process and resource management.
Meanwhile, the web OS platform described in this specification can be used not only in fixed devices such as a PC, a TV, and a set-top box (STB) but also in mobile devices such as mobile phones, smart phones, tablet PCs, notebooks, and wearable devices.
The software structure for digital devices has evolved from a monolithic structure that solved conventional problems, was market-dependent, and was based on a single process and closed products using multi-threading. It has since pursued new platform-based development, cost innovation through chip-set replacement, and efficiency in UI application and external application development, leading through layering and componentization to a layered structure with an add-on structure for add-ons, single-source products, and open applications. More recently, the software architecture has provided a modular architecture of functional units, a web open API (Application Programming Interface) for an eco-system, and a game-engine-oriented native open API, and is thus moving toward a multi-process structure based on a service structure.
FIG. 7 is a diagram illustrating a web OS architecture according to an embodiment of the present invention.
Referring to FIG. 7, the architecture of the Web OS platform will be described as follows.
The platform can be largely divided into a kernel, a system library based Web OS core platform, an application, and a service.
The architecture of the web OS platform is a layered structure, with the OS at the lowest layer, the system library(s) at the next layer, and applications at the top.
First, the lowest layer includes a Linux kernel as an OS layer, and can include Linux as an OS of the digital device.
Above the OS layer are provided a BSP (Board Support Package)/HAL (Hardware Abstraction Layer) layer, a web OS core modules layer, a service layer, a Luna-service bus layer, an Enyo framework/NDK (Native Developer's Kit)/QT layer, and, at the uppermost layer, an application layer.
Meanwhile, some layers of the above-described web OS layer structure may be omitted, and a plurality of layers may be one layer, or one layer may be a plurality of layer structures.
The web OS core module layer may include an LSM (Luna Surface Manager) for managing surface windows and the like, a SAM (System & Application Manager) for managing the execution and execution status of applications, and a WAM (Web Application Manager), based on WebKit, for managing web applications and the like.
The LSM manages the application windows displayed on the screen. The LSM is in charge of the display hardware (HW), provides a buffer for rendering the contents necessary for applications, and can composite and output the rendered results of a plurality of applications.
The SAM manages various conditional execution policies of the system and the application.
WAM, on the other hand, is based on the Enyo framework, which can be regarded as a basic framework for web applications.
An application uses a service via the Luna-service bus: a new service can be registered on the bus, and an application can find and use the service it needs.
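The registration-and-discovery model above can be sketched as follows. This is a conceptual Python sketch only — the `ServiceBus` class, its method names, and the service URI are hypothetical illustrations, not the actual luna-service bus API, which operates over IPC between processes:

```python
# Conceptual sketch of a luna-service-bus-style registry: services register
# under a URI-like name, and applications look them up and invoke methods.
class ServiceBus:
    def __init__(self):
        self._services = {}

    def register(self, name, handlers):
        """A new service registers its method handlers on the bus."""
        self._services[name] = handlers

    def call(self, name, method, payload=None):
        """An application finds the service it needs and uses it."""
        service = self._services.get(name)
        if service is None or method not in service:
            return {"returnValue": False, "errorText": f"unknown {name}/{method}"}
        return service[method](payload or {})


bus = ServiceBus()
bus.register("com.example.audio", {
    "getVolume": lambda p: {"returnValue": True, "volume": 30},
})

reply = bus.call("com.example.audio", "getVolume")
```

In this model the bus decouples callers from implementations: the application knows only the service name and method, not which process provides it.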
The service layer may include various services such as a TV service and web OS services. Meanwhile, the web OS services may include a media server, Node.js, and the like. In particular, the Node.js service supports JavaScript, for example.
Web OS services can communicate over the bus with a Linux process that implements the function logic, and can be divided into four parts: the TV process and services developed from the existing TV to the web OS, services differentiated by manufacturer, manufacturer-common services, and Node.js services written in JavaScript and used through Node.js.
The application layer may include all applications that can be supported in a digital device, such as a TV application, a showcase application, a native application, a web application, and the like.
An application on the Web OS can be classified into a web application, a PDK (Palm Development Kit) application, a QT (Qt Meta Language or Qt Modeling Language) application and the like according to an implementation method.
A web application is based on the WebKit engine and is executed on the WAM runtime. Such web applications can be based on the Enyo framework, or can be developed and executed on the basis of common HTML5, CSS (Cascading Style Sheets), and JavaScript.
A PDK application includes a native application developed in C/C++ based on a PDK provided for third-party or external developers. The PDK refers to a development library and tool set provided for third parties, such as game developers, to develop native applications (C/C++). For example, PDK applications can be used to develop applications whose performance is critical.
The QML application is a Qt-based native application and includes a basic application provided with a web OS platform such as a card view, a home dashboard, a virtual keyboard, and the like. Here, QML is a mark-up language in the form of a script instead of C ++.
In the meantime, a native application refers to an application that is developed in C/C++, compiled, and executed in binary form; such a native application has a high execution speed.
FIG. 8 is a diagram illustrating an architecture of a web OS device according to an embodiment of the present invention.
FIG. 8 is a block diagram based on the runtime of the Web OS device, which can be understood with reference to the layered structure of FIG.
The following description will be made with reference to FIGS. 7 and 8.
Referring to FIG. 8, services and applications and WebOS core modules are included on the system OS (Linux) and system libraries, and communication between them can be done via the Luna-Service bus.
Web OS services such as e-mail, contact, and calendar based on HTML5, CSS, and JavaScript; Node.js services such as logging, backup, file notify, database (DB), activity manager, system policy, audio daemon (AudioD), update, and media server; TV services such as EPG (Electronic Program Guide), PVR (Personal Video Recorder), and data broadcasting; CP services such as voice recognition, Now on, Notification, search, ACR (Auto Content Recognition), CBOX (Contents List Browser), wfdd, DMR, Remote Application, download, and SPDIF (Sony Philips Digital Interface Format); PDK applications; QML applications; and Enyo-framework-based TV UI-related applications and web applications are processed via the Luna-service bus through the aforementioned web OS core modules such as the SAM, WAM, and LSM. Meanwhile, in the above, the TV applications and web applications are not necessarily Enyo-framework-based or UI-related.
CBOX can manage the list and metadata of the contents of external devices such as USB, DLNA, and cloud storage connected to the TV. Meanwhile, the CBOX can output the content listings of various content containers, such as a USB, a DMS, a DVR, and a cloud, as an integrated view. In addition, CBOX can display various types of content listings, such as pictures, music, and video, and manage their metadata. The CBOX can also output the contents of an attached storage device in real time. For example, when a storage device such as a USB is plugged in, the CBOX must be able to immediately output the content list of the storage device. At this time, a standardized method for processing the content listing may be defined. In addition, CBOX can accommodate various connection protocols.
The SAM is intended to reduce implementation complexity and improve extensibility. For example, the existing system manager processed various functions such as the system UI, window management, web application runtime, and UX constraint processing in a single process; the SAM separates these main functions to resolve the large implementation complexity and clarifies the implementation interface.
The LSM supports independent development and integration of system UX implementations, such as the card view and the launcher, and makes it easy to respond to changes in product requirements. Meanwhile, the LSM enables multi-tasking by making full use of hardware (HW) resources when compositing a plurality of application screens, such as in app-on-app or 21:9 scenarios.
The LSM supports the implementation of a system UI based on QML and improves its development productivity. Based on MVC, QML UX can easily construct views for layouts and UI components, and code for handling user input can be developed easily. On the other hand, the interface between QML and the web OS components is made via a QML extension plug-in, and the graphic operations of an application may be based on the Wayland protocol, luna-service calls, and the like.
LSM is an abbreviation of Luna Surface Manager, as described above, which functions as an Application Window Compositor.
The LSM composites independently generated applications, UI components, and the like on the screen and outputs them. In this regard, when components such as recent applications, showcase applications, and launcher applications render their respective contents, the LSM, as a compositor, defines the output area, the interworking method, and so on. In other words, the compositor LSM handles graphics composition, focus management, input events, and so on. At this time, the LSM receives events, focus, and the like from an input manager. The input manager can include HIDs such as a remote controller, a mouse & keyboard, a joystick, a game pad, an application remote, and a pen touch.
As such, the LSM supports a multiple-window model, which, owing to the nature of the system UI, can be performed simultaneously in all applications. In this regard, the LSM can also support various functions such as the launcher, recents, settings, notification, the system keyboard, volume UI, search, finger gestures, voice recognition (STT (Speech to Text), TTS (Text to Speech), NLP (Natural Language Processing), etc.), pattern gestures (camera, MRCU (Mobile Radio Control Unit)), live menu, and ACR (Auto Content Recognition).
FIG. 9 is a diagram illustrating a graphic composition flow in a web OS device according to an embodiment of the present invention.
Referring to FIG. 9, the graphic composition is processed as follows.

When web-application-based graphic data (or a web application) is generated as a UI process in the web application manager, the rendered graphic data is passed to the LSM through WebKit unless the application is a full-screen application. The LSM receives the graphic data passed as described above through a Wayland surface and processes it appropriately for output on the screen.

On the other hand, a full-screen application is passed directly to the graphics manager (GM) without going through the LSM.
The graphics manager processes all the graphic data in the web OS device, including the data passed through the LSM GM surface described above and the data passed through the WAM GM surface, as well as GM-surface data such as that of a data broadcasting application or a caption application, and outputs the received graphic data appropriately on the screen. Here, the functions of the GM compositor are the same as or similar to those of the compositor described above.
FIG. 10 is a view for explaining a media server according to an embodiment of the present invention, FIG. 11 is a configuration block diagram of a media server according to an embodiment of the present invention, and FIG. 12 is a diagram illustrating the relationship between a media server and a TV service according to an embodiment of the present invention.
The media server supports the execution of various multimedia in the digital device and manages the necessary resources. The media server can efficiently use the hardware resources required for media playback. For example, the media server requires audio/video hardware resources for multimedia execution, and can manage the current resource usage status so as to utilize resources efficiently. In general, a fixed device having a larger screen than a mobile device needs more hardware resources to execute multimedia and requires fast encoding/decoding and transmission of large amounts of data. Meanwhile, in addition to streaming and file-based playback, the media server may perform tasks such as broadcasting, recording, and tuning, a task of recording simultaneously with viewing, and a task of simultaneously displaying the sender and recipient screens during a video call. However, the media server is limited in hardware resources such as encoders, decoders, tuners, and display engines, so it is difficult to execute a plurality of tasks at the same time; for example, the media server may restrict the usage scenario or process tasks by receiving a user selection.
The media server can provide system stability (robustness) because, for example, a pipeline in which an error occurs during media playback can be removed and restarted on a per-pipeline basis, so that the error does not affect the playback of other media. Such a pipeline is a chain that links unit functions such as decoding, analysis, and output when media playback is requested, and the necessary unit functions may vary according to the media type and the like.
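The per-pipeline fault isolation described above can be sketched as follows. This is a conceptual illustration only — the `MediaServer` and `Pipeline` classes and their methods are hypothetical names, not the actual media server implementation:

```python
# Sketch of per-pipeline fault isolation: an erroneous pipeline is torn down
# and restarted individually, leaving other playback pipelines untouched.
class Pipeline:
    def __init__(self, media_type):
        self.media_type = media_type
        self.state = "playing"
        self.restarts = 0

    def fail(self):
        self.state = "error"

    def restart(self):
        self.state = "playing"
        self.restarts += 1


class MediaServer:
    def __init__(self):
        self.pipelines = {}
        self._next_id = 0

    def open(self, media_type):
        self._next_id += 1
        self.pipelines[self._next_id] = Pipeline(media_type)
        return self._next_id

    def heal(self):
        """Remove-and-restart only the pipelines that are in error."""
        for p in self.pipelines.values():
            if p.state == "error":
                p.restart()


server = MediaServer()
a = server.open("file")
b = server.open("streaming")
server.pipelines[a].fail()   # an error occurs in one pipeline only
server.heal()                # that pipeline is restarted; the other is untouched
```

The point of the design is that the unit of recovery is the pipeline, not the whole media server process.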
The media server may have extensibility, for example, adding new types of pipelines without affecting existing implementations. As an example, the media server may accommodate a camera pipeline, a video conference (Skype) pipeline, a third-party pipeline, and the like.
The media server can process general media playback and TV task execution as separate services because the interface of the TV service differs from that of media playback. The media server supports operations such as 'setchannel', 'channelup', 'channeldown', 'channeltuning', and 'recordstart' in relation to the TV service, and operations such as 'play', 'pause', and 'stop' in relation to general media playback; it can thus support different operations for the two and process them as separate services.
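The split into two operation sets can be sketched as a simple dispatcher. The operation names follow the text above; the routing function and service names are hypothetical illustrations:

```python
# Sketch of supporting different operation sets for the TV service and general
# media playback as separate services behind one media server front end.
TV_OPS = {"setchannel", "channelup", "channeldown", "channeltuning", "recordstart"}
MEDIA_OPS = {"play", "pause", "stop"}

def route(op):
    """Return which service process should handle the given operation."""
    if op in TV_OPS:
        return "tv-service"
    if op in MEDIA_OPS:
        return "media-service"
    raise ValueError(f"unsupported operation: {op}")

tv_target = route("setchannel")
media_target = route("pause")
```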
The media server can control or integrally manage the resource management function. The allocation and recall of hardware resources in the device are performed integrally by the media server; in particular, the TV service process transfers its running tasks and resource allocation status to the media server. The media server secures resources and generates a pipeline each time a media is executed, and, based on the resource status occupied by each pipeline, permits execution according to the priority (e.g., policy) of the media execution request or performs resource recall from other pipelines. Here, the predefined execution priorities and the resource information necessary for a specific request are managed by a policy manager, and the resource manager can communicate with the policy manager to process resource allocation, recall, and the like.
The media server may hold an identifier (ID) for every playback-related operation. For example, the media server may issue a command indicating a particular pipeline based on the identifier. For the playback of two or more media, the media server may distinguish them by pipeline.
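Identifier-based control of pipelines can be sketched as follows. This is a conceptual model only — the class, the `load`/`command` methods, and the URIs are invented for illustration:

```python
# Sketch of identifier-based control: the media server issues an ID per
# playback pipeline and targets commands at a particular pipeline by that ID,
# so two simultaneously playing media are kept apart.
import itertools

class MediaServer:
    def __init__(self):
        self._ids = itertools.count(1)
        self._pipelines = {}          # id -> last command applied

    def load(self, uri):
        pid = next(self._ids)         # unique ID for all playback operations
        self._pipelines[pid] = "loaded:" + uri
        return pid

    def command(self, pid, op):
        """Issue a command indicating a particular pipeline by identifier."""
        self._pipelines[pid] = op
        return (pid, op)

server = MediaServer()
music = server.load("file:///a.mp3")
movie = server.load("file:///b.mp4")
server.command(movie, "play")        # only the movie pipeline is affected
```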
The media server may be responsible for playback of the HTML 5 standard media.
In addition, the media server may handle the TV pipeline as a separate service process depending on the scope of TV restructuring. The media server can be designed regardless of the TV restructuring scope; however, if the TV is not a separate service process, the entire TV may need to be re-executed when a problem occurs in a specific task.
The media server is also referred to as uMS, that is, a micro media server. Here, the media player corresponds to a media client, which may mean, for example, a WebKit for an HTML5 video tag, a camera, a TV, a Skype, a second screen, and the like.
In the media server, the management of micro resources, such as by a resource manager and a policy manager, is a core function. The media server also plays the role of playback control for web standard media content, and in this regard may manage pipeline controller resources as well.
Such a media server supports, for example, extensibility, reliability, efficient resource usage, and the like.
In other words, the uMS, that is, the micro media server, manages and controls, in an overall manner, the use of resources in the web OS device — such as for a cloud game, an MVPD (pay service), a camera preview, a second screen, and a Skype — as well as resources such as audio, video, and tuner, so as to enable appropriate processing and efficient use. On the other hand, each resource is used through, for example, a pipeline, and the media server can manage and control the generation, deletion, and use of pipelines for overall resource management.
Here, a pipeline refers to, for example, the chain generated when a media associated with a task starts a series of operations such as request parsing, decoding of the stream, and video output. For example, with respect to a TV service or application, watching, recording, channel tuning, and the like are each handled individually, under control of resource usage, through a pipeline generated according to the corresponding request.
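The notion of a pipeline as a per-request chain of unit functions can be sketched as follows. The stage names (parse, decode, output) follow the text; the chain table and data shapes are hypothetical illustrations, and the task types shown are made-up examples:

```python
# Sketch of a pipeline as a chain of unit functions assembled per task type;
# the necessary unit functions may change according to the media type.
def parse(req):   return {"stream": req["uri"]}
def decode(data): return {"frames": data["stream"] + ":decoded"}
def output(data): return data["frames"] + ":displayed"

CHAINS = {
    "media":  [parse, decode, output],
    "record": [parse, decode],       # e.g. no display stage while recording
}

def run_pipeline(task_type, request):
    """Run the request through each unit function of the chain in order."""
    data = request
    for stage in CHAINS[task_type]:
        data = stage(data)
    return data

result = run_pipeline("media", {"uri": "ch7"})
```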
The processing structure and the like of the media server will be described in more detail with reference to FIG.
Referring to FIG. 10, an application or service is connected to the media server 1020 via a luna-service bus 1010, and the media server 1020 is in turn connected, via the luna-service bus, to the generated pipelines and manages them.
The application or service can have various clients depending on its characteristics and can exchange data with the media server 1020 or the pipeline through it.
The client includes, for example, a uMedia client (WebKit) and an RM (resource manager) client (C/C++) for connection with the media server 1020.
The application including the uMedia client is connected to the media server 1020, as described above. More specifically, the uMedia client corresponds to, for example, a video object to be described later, and the client uses the media server 1020 for the operation of video by a request or the like.
Here, a video operation relates to the video state and may include state data on, for example, loading, unloading, play (playback, or reproduce), pause, and stop. Each such video operation or state can be processed through individual pipeline generation. Accordingly, the uMedia client sends the state data associated with the video operation to the media server 1020.
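The state handling above can be sketched as a small state machine. The operation names follow the text; the transition table itself is an assumption made for illustration and is not the actual uMedia state chart:

```python
# Sketch of video-state handling: each operation (load, play, pause, stop,
# unload) moves the client through states, and each new state would be
# reported to the media server as state data.
TRANSITIONS = {
    ("unloaded", "load"):   "loaded",
    ("loaded",   "play"):   "playing",
    ("playing",  "pause"):  "paused",
    ("paused",   "play"):   "playing",
    ("playing",  "stop"):   "loaded",
    ("paused",   "stop"):   "loaded",
    ("loaded",   "unload"): "unloaded",
}

class UMediaClient:
    def __init__(self):
        self.state = "unloaded"
        self.sent = []                 # state data sent to the media server

    def request(self, op):
        nxt = TRANSITIONS.get((self.state, op))
        if nxt is None:
            return False               # invalid operation in the current state
        self.state = nxt
        self.sent.append((op, nxt))    # notify the media server of the change
        return True

client = UMediaClient()
for op in ("load", "play", "pause", "play", "stop", "unload"):
    client.request(op)
```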
The
On the other hand, the
The
In addition, the pipeline may include, for example, a service-based pipeline (its own pipeline) and a URI-based pipeline (media pipeline).
Referring to FIG. 10, an application or service that includes an RM client may not be directly connected to the media server 1020. This is because the application or service may process the media directly; in other words, if the application or service processes the media directly, it need not pass through the media server. At this time, however, a uMS connector is needed to manage resources for pipeline generation and use. Meanwhile, upon receiving a resource management request for the direct media processing of the application or service, the uMS connector communicates with the media server 1020, including its resource manager, to manage the resources.
Accordingly, the application or service can respond to the request of the RM client by receiving resource management from the resource manager of the media server 1020 via the uMS connector.
On the other hand, a URI-based pipeline is performed through the media server 1020, instead of processing the media directly as the RM client does. Such URI-based pipelines may include a player factory, GStreamer, a streaming plug-in, a DRM (Digital Rights Management) plug-in pipeline, and the like.
On the other hand, the interface method between application and media services may be as follows.
First, there is a method of interfacing with a service in a web application: a method of making a Luna call using the PSB (Palm Service Bridge), or a method of using Cordova, which extends the display with video tags. In addition, there may be a method of using the HTML5 standard for video tags or media elements.
Second, there is a method of interfacing with a service in a PDK application.
Third, there is a method of using the service in an existing CP; for backward compatibility, existing platform plug-ins based on Luna can be extended and used.
Finally, there is a method of interfacing in a non-web OS case; in this case, the Luna bus can be called directly to interface.
Seamless change is handled by a separate module (for example, TVWIN), which is a process for first displaying the TV on the screen and processing it seamlessly, without the web OS. Because the boot time of the web OS is long, this module is used to provide the basic functions of the TV service first, for a quick response to the user's power-on request. In addition, the module is part of the TV service process and supports quick boot and seamless change, which provide basic TV functions, as well as a factory mode. The module may also perform switching from the non-web OS mode to the web OS mode.
Referring to FIG. 11, a processing structure of a media server is shown.
Referring to FIG. 11, a solid-line box represents a process configuration, and a dashed-line box represents an internal processing module within a process. A solid-line arrow represents an inter-process call, that is, a Luna service call, and a dashed-line arrow may represent a notification, such as register/notify, or a data flow.
A service or a web application or a PDK application (hereinafter 'application') is connected to various service processing components via a luna-service bus, through which an application is operated or controlled.
The data processing path depends on the type of application. For example, when the application handles image data related to a camera sensor, the image data is transmitted to the camera processing unit 1130 and processed there. At this time, the camera processing unit 1130 processes the image data of the received application, including through a gesture module, a face detection module, and the like. Here, if the user requires a pipeline or the like, the camera processing unit 1130 can generate a pipeline through the media server processing unit 1110 and process the corresponding data.
Alternatively, when the application includes audio data, the audio processing unit (AudioD) 1140 and the audio module (PulseAudio) 1150 can process the audio. For example, the audio processing unit 1140 can process the audio data received from the application and transmit it to the audio module 1150.
Alternatively, when the application includes or processes DRM-attached content, the content data is transmitted to the DRM service processing unit 1160, and the DRM service processing unit 1160 generates a DRM instance and processes the DRM-attached content data. Meanwhile, the DRM service processing unit 1160 may be connected, via the Luna-service bus, to a DRM pipeline in the media pipeline and process the DRM-applied content data.
Hereinafter, processing in the case where the application is media data or TV service data (e.g., broadcast data) will be described.
FIG. 12 shows only the media server processing unit and the TV service processing unit of FIG. 11 described above, in more detail.
Therefore, the following description will be made with reference to FIGS. 11 and 12.
First, when the application includes TV service data, it is processed in the TV service processing unit 1120/1220.
The TV service processing unit 1120 includes at least one of a DVR/channel manager, a broadcasting module, a TV pipeline manager, a TV resource manager, a data broadcasting module, an audio setting module, and a path manager. Referring to FIG. 12, the TV service processing unit 1220 may include a TV broadcast handler, a TV broadcast interface, a service processing unit, TV middleware (TV MW), a path manager, and a BSP (for example, NetCast). Here, the service processing unit may be a module including, for example, a TV pipeline manager, a TV resource manager, a TV policy manager, a USM connector, and the like.
In this specification, the TV service processing unit may have the configuration of FIG. 11 or FIG. 12, or may be implemented as a combination of both, in which some components are omitted or some components not shown are added.
Based on the attribute or type of the TV service data received from the application, the TV service processing unit 1120/1220 transmits the data to the DVR/channel manager in the case of DVR (Digital Video Recorder) or channel-related data, and then generates and processes a TV pipeline through the TV pipeline manager. On the other hand, when the attribute or type of the TV service data is broadcast content data, the TV service processing unit 1120 generates and processes a TV pipeline through the TV pipeline manager so that the corresponding data is processed via the broadcasting module.
Alternatively, a JSON (JavaScript Object Notation) file or a file written in C is processed by the TV broadcast handler and transmitted to the TV pipeline manager through the TV broadcast interface to generate and process a TV pipeline. In this case, the TV broadcast interface may transmit the data or file that has passed through the TV broadcast handler to the TV pipeline manager based on a TV service policy, which is referred to when creating the pipeline.
Hereinafter, the processing in the TV service processing unit 1220 of FIG. 12, specifically, the process of the TV broadcast interface and below will be described in more detail.
The TV broadcast interface may also function as a controller of the TV service processing unit 1220. The TV broadcast interface requests pipeline creation from the TV pipeline manager, and the TV pipeline manager creates a TV pipeline and requests resources from the TV resource manager. The TV resource manager makes a resource request to the media server via the uMS connector, obtains the resources, and returns them to the TV pipeline manager.
The TV pipeline manager arranges the returned resources in the generated TV pipeline and registers the pipeline information with the path manager. The path manager then returns the result to the TV pipeline manager, and the TV pipeline manager returns the pipeline to the TV broadcast interface.
The TV broadcast interface then communicates with the TV middleware (MW: MiddleWare) to request a channel change, and the result is returned from the TV middleware.
Through the above process, the TV service can be processed.
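The request chain just described (broadcast interface → pipeline manager → resource manager → uMS connector → media server, then registration with the path manager and a channel-change request to the TV middleware) can be traced conceptually as follows. All class and method names here are illustrative stand-ins, not webOS interfaces:

```python
# Conceptual trace of TV pipeline creation and channel change.
class MediaServer:
    def acquire(self, what):                 # grants e.g. a tuner resource
        return {"resource": what, "granted": True}

class TVResourceManager:
    def __init__(self, media_server):
        self.ums = media_server              # reached via the uMS connector
    def request(self, what):
        return self.ums.acquire(what)

class TVPipelineManager:
    def __init__(self, resource_mgr, path_mgr):
        self.rm, self.path_mgr = resource_mgr, path_mgr
    def create(self):
        pipeline = {"resources": self.rm.request("tuner")}
        self.path_mgr.append(pipeline)       # register pipeline info
        return pipeline

class TVBroadcastInterface:
    def __init__(self, pipeline_mgr, middleware):
        self.pm, self.mw = pipeline_mgr, middleware
    def watch(self, channel):
        pipeline = self.pm.create()
        return self.mw(channel, pipeline)    # request channel change to TV MW

path_manager = []
iface = TVBroadcastInterface(
    TVPipelineManager(TVResourceManager(MediaServer()), path_manager),
    lambda ch, p: {"channel": ch, "tuned": p["resources"]["granted"]},
)
result = iface.watch(7)
```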
The TV pipeline manager can be controlled by the TV resource manager when generating one or more pipelines in response to a TV pipeline creation request from a processing module or manager in the TV service. Meanwhile, the TV resource manager can be controlled by the TV policy manager to request the status and allocation of the resources allocated to the TV service according to the TV pipeline creation request of the TV pipeline manager, and communicates with the media server processing unit 1110/1210 via the uMS connector. The resource manager in the media server processing unit 1110/1210 transmits the resource status for the current TV service, allocation permission, and the like, according to the request of the TV resource manager. For example, if the resource manager in the media server processing unit 1110/1210 confirms that all the resources for the TV service have already been allocated, the TV resource manager can be notified that all the resources are currently in use. At this time, the resource manager in the media server processing unit may, together with the notification, remove a certain TV pipeline according to a priority or a predetermined criterion among the TV pipelines pre-allocated for the TV service, and may request or permit generation of a TV pipeline for the requested TV service. Alternatively, the TV resource manager can appropriately remove, add, or control TV pipelines according to the status report of the resource manager in the media server processing unit 1110/1210.
Meanwhile, the BSP supports, for example, backward compatibility with existing digital devices.
The TV pipelines thus generated can be appropriately operated according to the control of the path manager in the process. The path manager can determine and control the processing path or process of the pipelines by considering not only the TV pipeline but also the operation of the pipeline generated by the media server processing unit 1110/1210.
Next, when the application includes media data rather than TV service data, it is processed by the media server processing unit 1110/1210. Here, the media server processing unit 1110/1210 includes a resource manager, a policy manager, a media pipeline manager, a media pipeline controller, and the like. Meanwhile, the pipelines generated under the control of the media pipeline manager and the media pipeline controller may be of various kinds, such as a camera preview pipeline, a cloud game pipeline, and a media pipeline. The media pipeline may include a streaming protocol, an auto/static GStreamer, DRM, and the like, which can be determined according to the control of the path manager. The specific processing in the media server processing unit 1110 and/or 1210 follows the description of FIG. 10 above and is not repeated here.
In this specification, the resource manager in the media server processing unit 1110/1210 can perform resource management on a counter basis, for example.
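Counter-based resource management can be sketched as follows: each hardware resource has a fixed unit count, allocation decrements the counter, and recall increments it. The class, the capacities, and the resource names are made-up examples for illustration:

```python
# Sketch of counter-based resource management in a resource manager.
class CounterResourceManager:
    def __init__(self, capacities):
        self.free = dict(capacities)     # resource name -> free unit count

    def allocate(self, resource, units=1):
        if self.free.get(resource, 0) < units:
            return False                 # exhausted: would need policy recall
        self.free[resource] -= units
        return True

    def recall(self, resource, units=1):
        """Return units to the counter when a pipeline is removed."""
        self.free[resource] += units

rm = CounterResourceManager({"tuner": 2, "decoder": 1})
ok1 = rm.allocate("tuner")
ok2 = rm.allocate("decoder")
ok3 = rm.allocate("decoder")   # denied: the decoder counter is exhausted
rm.recall("decoder")           # a pipeline releases its decoder
```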
The media server design on the Web OS platform will be described in more detail as follows.
The media server is a media framework that supports interfacing third-party multimedia pipeline(s) with the web OS platform. The media server may control, manage, isolate, and deconflict resources so that the third-party multimedia pipeline(s) can comply with the platform. Such a media server can be regarded as a platform module that provides a generalized API for applications to play media, and that manages hardware resources and policies consistently. Meanwhile, the design goal of the media server is to generalize media processing and reduce complexity by separating related modules.
The core of such a media server is, for example, the integration of a service interface and the web OS UI. To this end, the media server controls the resource manager, the policy manager, and the pipeline manager, and provides API access according to resource manager queries.
The uMS connector is the main API and SDK for interfacing client media pipeline processes with the media server. The uMS connector handles the events and messages of this interface. The client media pipelines implement client media pipeline state events to enable load, play, pause, seek, stop, unload, release_resource, acquire_resource, and the like.
The uMedia API provides the C and C++ APIs of the media server.
The media resource manager provides a way to describe the media hardware resources and pipeline client resource usage in a single, simple configuration file. The media resource manager provides all the capability and information needed to implement default or third-party media policy management.
The media policy manager functions when the resource manager rejects a media pipeline due to a resource conflict. The policy manager can provide a consistent API and SDK to enable third-party policy manager implementations. The policy manager supports matching media pipelines by LRU (Least Recently Used) and may act on one or more of the conflicting resources.
The pipeline manager tracks and maintains client media pipelines. The pipeline controller provides a consistent API to the pipeline manager to control and manage the client media pipelines.
The media server communicates with the resource manager through library calls, and the resource manager can communicate with the TV services and the media pipelines through the Luna service bus.
The media resource manager may use an overall configurable configuration file to describe the media hardware and media client pipelines, detect resource conflicts, and collect all the information needed to implement media policy management.
The media policy manager reads the policy_select and policy_action fields of the resource configuration file; on resource contention, it attempts to select an active pipeline as described by the policy_select field and issues actions to the outgoing/selected pipelines based on the policy_action field. The selection function may use parameters supported by the pipeline configuration setting entries. The policy actions are unload and release. All pipelines support the unload command, which releases all allocated resources. A pipeline can additionally support the release command to release specific resources. Here, the release command is for fast switching between pipelines competing for common resources, since unloading all resources may not be required to deconflict for the incoming pipeline.
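The interplay of policy_select (here, LRU, which the text names as a supported selection) and policy_action (unload vs. release) can be sketched as follows. The configuration-file shape, the pipeline records, and the function name are invented for illustration:

```python
# Sketch of policy-driven conflict resolution: policy_select picks the victim
# pipeline (least recently used) and policy_action decides whether to unload
# it entirely or release only the contested resource.
config = {"policy_select": "LRU", "policy_action": "release"}

def resolve_conflict(pipelines, resource):
    """pipelines: dicts with a 'last_used' tick and a 'resources' set."""
    contenders = [p for p in pipelines if resource in p["resources"]]
    victim = min(contenders, key=lambda p: p["last_used"])     # LRU select
    if config["policy_action"] == "unload":
        victim["resources"].clear()            # unload releases everything
    else:
        victim["resources"].discard(resource)  # release frees only this one
    return victim

pipes = [
    {"name": "dvr",  "last_used": 5, "resources": {"tuner", "encoder"}},
    {"name": "view", "last_used": 9, "resources": {"tuner", "decoder"}},
]
victim = resolve_conflict(pipes, "tuner")
```

With policy_action set to release, the fast-switch case in the text, the victim keeps its other resources instead of being torn down.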
The pipeline manager manages the pipeline controller. The pipeline manager maintains a running queue of the pipeline controller and provides unique indexing for incoming messages from the application (s) via the media server.
The pipeline controller maintains the relationship with its associated media client pipeline process. The pipeline controller maintains all relevant states and provides a media client pipeline control interface to the pipeline manager. The pipeline client process is a separate process that uses the uMS connector to provide a control interface to the media server, and so on. The pipeline (client) media technology (GStreamer, Stage Fright) can be kept completely separate and decoupled from the media server management and services.
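The pipeline manager's running queue of controllers and its unique indexing of incoming messages can be sketched as follows. The class and method names are illustrative only:

```python
# Sketch of the pipeline manager keeping a running queue of pipeline
# controllers and handing out unique indexes, so each incoming application
# message is routed to the right client pipeline.
from collections import OrderedDict

class PipelineController:
    def __init__(self):
        self.inbox = []
    def control(self, message):
        self.inbox.append(message)     # forward to the client pipeline process

class PipelineManager:
    def __init__(self):
        self.running = OrderedDict()   # unique index -> controller (queue)
        self._next = 0

    def track(self):
        """Start tracking a new client media pipeline; return its index."""
        self._next += 1
        self.running[self._next] = PipelineController()
        return self._next

    def dispatch(self, index, message):
        self.running[index].control(message)

mgr = PipelineManager()
first = mgr.track()
second = mgr.track()
mgr.dispatch(second, "pause")          # reaches only the second pipeline
```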
Hereinafter, a digital device according to the present invention and a content processing method in the digital device will be described in more detail.
In particular, the present specification seeks to provide convenience of use by overlaying and outputting, through a simple operation, content associated with the content being provided on the screen; to switch the already-provided content to the background and bring the requested related content to the foreground (a multi-tasking function) without the need to switch screens; and to improve user convenience and product satisfaction by quickly processing service switching according to the multi-tasking function. As described above, the present invention can be used or executed without requesting a separate menu screen, and even if the content in use is switched to the background, it continues to be output on the screen, so that services can be switched continuously. Meanwhile, an MHEG application and/or a CRB (Connected Red Button) application, which will be described later, may be additionally provided according to a user's selection after the content is provided; if a corresponding channel or broadcast program is identified as providing an MHEG and/or CRB application, the application may be provided at the same time as the content of the corresponding channel when the channel is reached through scheduled viewing, immediate playback, or passive channel switching, according to the user's settings. However, for ease of understanding and convenience of explanation, the description below mainly assumes that an MHEG/CRB application is provided according to a user's choice while a live broadcast is being watched as first content. The present invention is not limited thereto, however; the first content may include various contents that can be provided by digital devices, such as NRT contents, applications, and web services (web pages).
On the other hand, the second content associated with the first content, provided according to the user's request, may or may not be of the same format as the first content. Here, regarding format: if the first content is a broadcast program, the second content is of the same format if it is also a broadcast program, and of a different format if it is an application or a web service rather than a broadcast program. Meanwhile, the present invention can also support a mixed format in addition to the same and different formats; a mixed format means that content of the same format and content of a different format are supported together.
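The same/different/mixed distinction above can be sketched as a simple classification. This is an illustrative sketch only; the content descriptors and field names (`kind`, `title`) are assumptions for illustration, not part of the specification.

```javascript
// Sketch of the format relationship described above: the second content is of
// the "same" format when all related items are broadcast programs like the
// first content, of a "different" format when none are, and "mixed" when
// items of both kinds are provided together. Field names are illustrative.
function formatRelation(firstContent, relatedContents) {
  const kinds = new Set(relatedContents.map((c) => c.kind));
  const allSame = relatedContents.every((c) => c.kind === firstContent.kind);
  if (allSame) return "same";
  if (kinds.has(firstContent.kind)) return "mixed"; // both formats together
  return "different";
}

const live = { kind: "broadcast", title: "live program" };
console.log(formatRelation(live, [{ kind: "broadcast" }]));                  // same
console.log(formatRelation(live, [{ kind: "web-service" }]));                // different
console.log(formatRelation(live, [{ kind: "broadcast" }, { kind: "app" }])); // mixed
```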
FIG. 13 is a view for explaining a content processing method according to an embodiment of the present invention.
Referring to FIG. 13A, the digital device can output a live program, that is, a first content, on the screen. Meanwhile, although not shown, in the context of the present invention, when there is an MHEG/CRB application associated with the live broadcast being watched as the first content, the digital device may inform the user of the existence of such an application on the screen.
Referring to FIG. 13B, as described above, when the user accesses the MHEG application, for example by pressing the red button on the remote controller while watching the live broadcast, an MHEG application execution screen may be output on the live broadcast screen.
Referring to FIG. 13C, when the user accesses the CRB menu provided through the MHEG application, an application or content corresponding to the CRB menu may be output.
Referring to FIG. 13D, the screen is, for example, a screen on which a CRB application is executed over the live broadcast screen: the CRB application may be overlaid on the live broadcast screen and executed on an upper layer. As shown in FIG. 13D, the CRB application execution screen is provided simultaneously with the live broadcast, so the user can access the CRB application while still enjoying the live broadcast screen. Meanwhile, when the CRB application is executed while a live broadcast is being watched in the foreground, according to a user's request or the like, the live broadcast in the foreground is switched to the background and the CRB application to be executed is placed in the foreground. In particular, referring to FIG. 13B, broadcast station information and service category information may be output in the form of a bar in one region of the screen.
Referring to FIG. 13E, when the user selects one content item on the CRB application execution screen, the selected content may be provided on the screen, for example, through a streaming service.
In FIG. 13, the CRB application has been described as being accessed only after it is selected from the running MHEG application. However, the present invention is not limited thereto. For example, the screen of FIG. 13D may be provided directly from the screen of FIG. 13A without going through the steps of FIGS. 13B and 13C. Alternatively, the screens of FIGS. 13A to 13C can be provided immediately; in that case, a live broadcast screen may be provided in the background in the screen configuration of FIG. 13C. FIG. 13E shows a scene provided after the execution of the CRB application overlaid on the live broadcast screen; however, even when the live broadcast is executed in the background, the screen may be formed as a full screen of only the selected CRB application.
The MHEG application and/or the CRB application described herein may be a dedicated application associated with, or providing convenience for, services provided by a server such as, for example, a broadcast station and/or a channel. In this case, the MHEG application and/or the CRB application may or may not be associated with the currently provided content, even if it is an application provided by the server. In particular, through the CRB application, the user can perform reservation and/or scheduled recording of the contents provided by the corresponding broadcasting station or channel, and can perform various communication, such as leaving comments or providing feedback, through the web service provided by the broadcasting station or channel. When a user requests, for example, a live broadcast of a BBC channel, a digital device on which the Web OS platform is installed may provide, in addition to the red button on the remote controller described above, an application list item through which the MHEG and/or CRB application is directly accessible even when a launcher (menu) is called.
Further, when the CRB application is executed, the MHEG application can be terminated. Then, when the execution of the CRB application ends, the controller of the digital device can re-execute and activate the MHEG application whose execution was terminated.
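The application lifecycle described above (launching the CRB application terminates the MHEG application; closing the CRB application re-activates it) can be sketched as a minimal state machine. The class and method names here are illustrative assumptions, not the actual Web OS platform API.

```javascript
// Minimal sketch of the lifecycle described above: the MHEG application is
// terminated when the CRB application launches, and re-executed when the
// CRB application exits. All names are illustrative assumptions.
class AppLifecycle {
  constructor() {
    this.mhegActive = false;
    this.crbActive = false;
  }
  launchMheg() { this.mhegActive = true; }
  launchCrb() {
    this.mhegActive = false; // MHEG application is terminated on CRB launch
    this.crbActive = true;   // CRB application comes to the foreground
  }
  exitCrb() {
    this.crbActive = false;
    this.mhegActive = true;  // controller re-executes the MHEG application
  }
}

const apps = new AppLifecycle();
apps.launchMheg();
apps.launchCrb();
console.log(apps.mhegActive, apps.crbActive); // false true
apps.exitCrb();
console.log(apps.mhegActive, apps.crbActive); // true false
```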
FIG. 14 is a view for explaining a content processing method according to another embodiment of the present invention.
FIG. 14 is similar to FIG. 13 described above. For FIGS. 14A and 14B, reference is made to the description of FIGS. 13A and 13B, and the following description focuses on the differences.
In FIG. 13 described above, the screen of FIG. 13D is not entered directly from the screen of FIG. 13B. FIG. 14, however, shows that it is possible to enter FIG. 14C directly from FIG. 14B. In other words, when execution of the CRB application is requested on the MHEG application execution screen of FIG. 14B, the digital device can immediately provide the CRB application execution screen of FIG. 14C.
Meanwhile, in the present specification, when a CRB application is executed or a web page is provided on the screen and a live broadcast is switched to the background, a time machine mode may be automatically executed for the live broadcast so that the corresponding content is time-shifted. In addition, when the CRB application or its web page is displayed on the screen, the web page or the like may be directly linked with an SNS service according to the user's selection. For example, if the user wants to upload a web page corresponding to one article on the screen to an SNS service, the web page of the article can simply be shared with the SNS service of the user's own account. In this case, the SNS type, such as Twitter or Facebook, can be set in advance, or the page can be uploaded only to the service chosen at the time of selection. When sharing with the SNS service in this way, not only the link but also a voice or text message of the user can be attached and shared.
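The sharing step above (a link to the article plus an optional user text or voice message, targeted at the selected SNS service) can be sketched as follows. The payload shape, service list, and URL are assumptions for illustration only.

```javascript
// Illustrative sketch of the SNS sharing described above: the shared item
// carries the article's link plus an optional text or voice message, and is
// targeted at the SNS service the user selected (e.g. Twitter, Facebook).
function buildSharePayload(articleUrl, service, message = null) {
  const allowed = ["twitter", "facebook"]; // illustrative service list
  if (!allowed.includes(service)) {
    throw new Error(`unsupported SNS service: ${service}`);
  }
  return {
    service,
    link: articleUrl,
    attachments: message ? [message] : [], // optional text or voice message
  };
}

const payload = buildSharePayload(
  "http://example.com/article/1", // hypothetical article URL
  "twitter",
  { type: "text", body: "worth reading" }
);
console.log(payload.service, payload.attachments.length); // twitter 1
```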
FIG. 15 is a diagram for explaining the scenario of FIG. 13 in more detail, and FIG. 16 is a diagram for explaining the scenario of FIG. 14 in more detail. These scenarios are described in more detail below with reference to FIGS. 15 and 16.
As shown in FIG. 15A, when the user presses the red button while a live broadcast is provided, the MHEG application is provided as shown in FIG. 15B. At this time, the MHEG application includes a CRB menu.
When the user selects the CRB menu, an application or content corresponding to the CRB menu is output.
Referring to FIG. 16, when a red button key signal is input during live broadcast viewing, an MHEG application including a CRB menu is provided; an application or content of the CRB menu is output according to the user's selection of the CRB menu; and a predetermined button key can then be pressed to return to the live broadcast.
Meanwhile, the execution or provision of the CRB application in FIGS. 15 and 16 described above can be handled as follows.
First, the CRB application can be executed through the CRB address data (URL) extracted by the MHEG application. Here, the CRB application is made, for example, by the maker of the digital device as a Web OS-based web application (index.html), and the target CRB URL associated with it is linked with, for example, the content provider. Meanwhile, the CRB content accessible by the CRB application is produced by the corresponding content provider. The application information of the CRB application and a json option are preferably applied in common to the accessible CRB content.
In the digital device, when requesting the URL, a user agent including the model number may be included in the request.
In the CRB application, the URL is extracted by parsing the json-type parameter transmitted from the MHEG application. A deep link for this can, for example, use window.launchParams.
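The URL hand-off above can be sketched as follows: the MHEG side passes a json-type parameter, the CRB web application parses it to extract the target URL (on Web OS the deep link is exposed via `window.launchParams`, as noted above), and the request carries a user agent that includes the device model number. The parameter field names, the user-agent format, and the URL are illustrative assumptions; the simulated object stands in for the real `window.launchParams`.

```javascript
// Sketch of extracting the CRB URL from the json-type launch parameter and
// building a user agent that includes the model number. Field names and
// formats are illustrative assumptions.
function extractCrbUrl(launchParamsJson) {
  const params = typeof launchParamsJson === "string"
    ? JSON.parse(launchParamsJson)
    : launchParamsJson;
  if (!params || !params.url) throw new Error("no CRB url in launch params");
  return params.url;
}

function buildUserAgent(baseAgent, modelNumber) {
  // the digital device includes its model number in the user agent
  return `${baseAgent} Model/${modelNumber}`;
}

// simulated window.launchParams content (json-type parameter from MHEG)
const launchParams = '{"url": "http://example.com/crb/index.html"}';
console.log(extractCrbUrl(launchParams)); // http://example.com/crb/index.html
console.log(buildUserAgent("Mozilla/5.0 (WebOS)", "XX-1234"));
```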
Meanwhile, according to an embodiment of the present invention, the CRB application to be executed is overlaid on the live broadcast and output, as described above.
It is also desirable that at least two applications are stackable on top of the live broadcast (z-order). For example, the live broadcast may be output on the lowest layer, a CRB application on the layer above it, and a setting menu on the uppermost layer; alternatively, a ribbon may be output in the uppermost layer instead of the setting menu. In this connection, a key event may be executable only in the uppermost (parent) application; for example, key event propagation to the live broadcast may or may not occur during the execution of the CRB application. Each layer may correspond to one application.
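The z-order stacking above can be sketched as a simple layer stack in which a key event is delivered only to the topmost application. The class and application names are illustrative assumptions.

```javascript
// Minimal sketch of the z-order stacking described above: live broadcast on
// the lowest layer, CRB application above it, settings menu (or ribbon) on
// top; a key event reaches only the topmost application, so propagation to
// the live broadcast underneath may be suppressed.
class LayerStack {
  constructor() { this.layers = []; } // index 0 = lowest z-order
  push(app) { this.layers.push(app); }
  pop() { return this.layers.pop(); }
  dispatchKey(key) {
    const top = this.layers[this.layers.length - 1];
    return top ? `${top}:${key}` : null; // only the top layer handles the key
  }
}

const stack = new LayerStack();
stack.push("live-broadcast");
stack.push("crb-app");
stack.push("settings-menu");
console.log(stack.dispatchKey("OK")); // settings-menu:OK
stack.pop();
console.log(stack.dispatchKey("OK")); // crb-app:OK
```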
On the other hand, when the CRB application is terminated, the program returns to the live broadcast that was being watched. At this time, the MHEG engine can be activated and the MHEG application can be executed as needed.
In addition, a recent list or a launcher may not be called up, that is, displayed on the screen, while the CRB application is running. This is because, for example, the recent list or the launcher can only be executed or terminated and cannot be switched to the background.
Finally, even while the CRB application is executed, the platform UI/UX originally available on the digital device, such as Home, Exit, Input, Info, and Settings, can be accessed and operated as usual.
Hereinafter, various scenarios for the launch sequence of the CRB application according to the present invention will be described.
FIG. 17 is a flowchart of a content processing method according to an embodiment of the present invention.
Referring to FIG. 17, in steps S1702 to S1722, the CRB application is launched through the SAM (System & Application Manager) and the WAM (Web Application Manager) supporting the Web OS platform and is loaded via a plug-in; in this embodiment, a custom plug-in is used in the launch sequence.
In the above, when the custom plug-in is used and the URL changes, the plug-in injection can be made in the frame, in the WAM, or by the corresponding CRB content provider.
FIG. 18 is a flowchart of a content processing method according to another embodiment of the present invention.
In FIG. 18, steps S1802 to S1822 are similar to steps S1702 to S1722 of FIG. 17 described above. A detailed description thereof is therefore omitted, and the description above applies.
Hereinafter, a description will be given mainly of a portion different from that of FIG. 17 described above.
Referring to FIG. 18, no custom plug-in is used, unlike in FIG. 17. Accordingly, unlike FIG. 17 described above, in step S1820 the CRB application is loaded without plug-in injection. The remaining operations of FIG. 18 are performed in the same manner as in FIG. 17.
FIG. 19 is a flowchart of a content processing method according to another embodiment of the present invention.
In FIG. 19, steps S1902 to S1922 are similar to steps S1802 to S1822 of FIG. 18 described above. A detailed description thereof is therefore omitted, and the description above applies.
Hereinafter, portions different from those of FIG. 18 described above will be mainly described.
Unlike the above, FIG. 19 does not notify the launch of the CRB application to the live broadcast application. The remaining operations are performed in the same manner as in FIG. 18.
According to the present invention described above, content associated with the content being provided on the screen can be overlaid and output with only a simple operation, thereby providing convenience of use; the content already being provided is switched to the background so that a multi-tasking function is supported without switching the screen; and service switching and the like are quickly processed according to the multi-tasking support, thereby improving user convenience and product satisfaction.
The digital device and the content processing method in the digital device according to the present invention are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.
Meanwhile, the operation method of the digital device of the present invention can be implemented as processor-readable code on a recording medium readable by a processor included in the digital device. The processor-readable recording medium includes all kinds of recording devices in which data readable by the processor is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that the processor-readable code can be stored and executed in a distributed fashion.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed exemplary embodiments; on the contrary, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the present invention.
201: network interface unit 202: TCP / IP manager
203: service delivery manager 204: SI decoder
205: Demultiplexer 206: Audio decoder
207: Video decoder 208:
209: Service Control Manager 210: Service Discovery Manager
211: SI & Metadata Database 212: Metadata Manager
213: service manager 214: UI manager
Claims (20)
A method of processing content in a digital device, the method comprising: outputting a live broadcast;
Receiving a first request from a user;
Outputting a first application execution screen on the live broadcast screen in response to the first request;
Receiving a second request from the user;
Outputting a second application execution screen on the live broadcast screen from the first application execution screen in response to the second request;
Receiving a third request from the user; And
Switching to a streaming service for the selected content in response to the third request, and outputting the streaming service,
In the above, the first application is terminated in accordance with the second request.
Wherein the first application is a Multimedia and Hypermedia Information Coding Experts Group (MHEG) application and the second application is a Connected Red Button (CRB) web application.
And when the second application is terminated, activating the first application and outputting a live broadcast screen in the foreground.
Wherein each content item constituting the second application is linked with a URL (Uniform Resource Locator).
Wherein when requesting a URL for a predetermined content item in the second application, a user agent including a model number is included.
Wherein the platform UI or the UX (platform User Interface or User experience) of the digital device is accessed regardless of the execution of the second application.
Wherein during the first request and the second request, a live broadcast screen is continuously provided in the background.
Wherein the second application is launched by a System & Application Manager (SAM) and a WAM (Web Application Manager) that support the Web OS platform and is loaded via a plug-in.
Wherein the live broadcast is served in the form of a web application.
Wherein the live broadcast application controls the on / off of the first application processing engine according to the launch and termination for execution of the second application.
A digital device comprising: a user interface unit for receiving first to third requests from a user;
A control unit configured to output a first application execution screen on the live broadcast screen in response to the first request, to output a second application execution screen from the first application execution screen on the live broadcast screen in response to the second request, and to switch to a streaming service for the content selected in response to the third request and output the streaming service; And
And an output unit for outputting the live broadcast screen, the first application execution screen, the second application execution screen, and the streaming service screen according to the third request,
And the control unit controls execution of the first application in accordance with the second request.
Wherein the first application is an MHEG application and the second application is a CRB web application.
Wherein,
And controls to activate the first application and output a live broadcast screen to the foreground when the second application is terminated.
Wherein each content item constituting the second application is linked to a URL.
Wherein,
Wherein the control unit refers to a user agent including a model number when requesting a URL for a predetermined content item in the second application.
Wherein,
Wherein the platform UI or UX of the digital device is controlled to be accessed regardless of the execution of the second application.
Wherein,
And controls to continuously provide a live broadcast screen in the background during the first request and the second request.
Wherein,
Controlling the launch of the second application via a SAM and WAM supporting the web OS platform and being loaded via a plug-in.
Wherein the live broadcast is serviced in the form of a web application.
Wherein,
And controlling on / off of a first application processing engine according to the launch and termination for execution of the second application through the live broadcast application.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140143245A KR20160047179A (en) | 2014-10-22 | 2014-10-22 | Digital device and method of processing content thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20160047179A true KR20160047179A (en) | 2016-05-02 |
Family
ID=56021490
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020140143245A KR20160047179A (en) | 2014-10-22 | 2014-10-22 | Digital device and method of processing content thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20160047179A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WITN | Withdrawal due to no request for examination |