KR20160080041A - Flexible digital device and method of processing data the same - Google Patents

Flexible digital device and method of processing data the same

Info

Publication number
KR20160080041A
Authority
KR
South Korea
Prior art keywords
screen
variable
signal
application
digital device
Prior art date
Application number
KR1020150045320A
Other languages
Korean (ko)
Inventor
최은정
허우범
정윤석
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Publication of KR20160080041A publication Critical patent/KR20160080041A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F 9/00 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
    • G09F 9/30 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements
    • G09F 9/301 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements, the elements being flexible, foldable or roll-able electronic displays, e.g. thin LCD, OLED

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Various embodiment(s) of a digital device and of a method of processing an application in the digital device are disclosed herein. A digital device according to one embodiment of the present invention includes a memory; a display unit for displaying a first application execution screen and displaying a requested menu on the first application execution screen; a user interface for receiving a first signal requesting the menu display and a second signal selecting one menu item from one or more menu items, in the displayed menu, for varying between a flat screen and a curved screen; and a controller for controlling the screen to vary to the curvature corresponding to the selected menu item. The controller controls first guide data identifying the variation to be displayed on the first application execution screen before the variation ends, and controls second guide data identifying the end of the variation to be displayed on the first application execution screen after the variation ends.

Description

TECHNICAL FIELD: The present invention relates to a variable-type digital device and a method of processing data in the variable-type digital device.

BACKGROUND OF THE INVENTION: The present invention relates to a variable digital device and, more particularly, to variable mode operation and data processing therefor in a variable digital device equipped with a Web OS platform.

Mobile devices such as smart phones and tablet PCs have attracted attention alongside standing devices such as personal computers (PCs) and televisions (TVs). Fixed devices and mobile devices originally developed in their respective domains, but the boundary between the two has blurred with the recent boom of digital convergence.

In addition, as digital devices develop and their usage environment changes, users' expectations rise, and various kinds of high-speed services and applications are demanded.

Conventional digital devices generally employ a flat screen, supported by a fixed panel whose shape is largely invariable. Recently, however, a variable type capable of changing curvature, that is, a flexible panel, has been developed, and digital devices having a curved screen using such panels are gradually being developed as well.

Recently, a hybrid type digital device supporting both a planar mode and a variable mode has been developed.

Disclosed herein is a variable digital device and a method for processing data in the variable digital device.

It is an object of the present invention to provide an intuitive user interface for variable mode operation control so that the variable digital device can be controlled easily and conveniently.

Another object of the present invention is to maximize the convenience of use of the variable digital device by referring to and utilizing various factors.

Another object of the present invention is to improve satisfaction with the variable-type digital device and, through the foregoing, promote the desire to purchase the product.

The technical problems to be solved by the present invention are not limited to the above-described technical problems, and other technical problems not mentioned herein will be clearly understood by those skilled in the art from the following description.

This document discloses various embodiments (s) of digital devices and processing methods in the digital devices.

A method of processing data in a digital device according to an embodiment of the present invention includes: displaying a first application execution screen on a screen; receiving a first signal requesting a menu display; displaying the requested menu on the first application execution screen; receiving a second signal selecting one menu item from one or more menu items, in the displayed menu, for varying between a flat screen and a curved screen; and varying the screen to the curvature corresponding to the menu item selected according to the second signal. Here, the varying includes displaying, before the variation ends, first guide data identifying the variation on the first application execution screen, and displaying, after the variation ends, second guide data identifying the end of the variation on the first application execution screen.
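
The menu-driven flow above (first signal requests the menu, second signal selects an item, guide data is shown before and after the variation) can be sketched as follows. This is an illustrative model only; the class, menu items, and curvature values are assumptions and not part of the claimed device.

```python
# Illustrative sketch of the claimed flow: menu display -> curvature
# selection -> variation, with guide data shown before and after the change.
# All names (VariableScreen, MENU_ITEMS) and values are hypothetical.

MENU_ITEMS = {"flat": 0.0, "mild curve": 0.5, "full curve": 1.0}  # assumed levels

class VariableScreen:
    def __init__(self):
        self.curvature = 0.0   # 0.0 models the flat screen
        self.guide_log = []    # records guide data shown to the user

    def show_menu(self):
        # first signal: menu requested over the running application screen
        return list(MENU_ITEMS)

    def select(self, item):
        # second signal: a menu item is chosen; vary to its curvature
        target = MENU_ITEMS[item]
        self.guide_log.append(f"varying to {item}")   # first guide data (before end)
        self.curvature = target                       # the physical variation step
        self.guide_log.append("variation complete")   # second guide data (after end)
        return self.curvature

screen = VariableScreen()
assert "full curve" in screen.show_menu()
screen.select("full curve")
```

Both guide messages are displayed on the first application execution screen in the claimed method; the log above merely stands in for that display step.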

A method of processing data in a digital device according to another embodiment of the present invention includes: receiving a setting signal for automatically varying between a flat screen and a curved screen; receiving an application execution request signal; collecting reference data; determining whether to vary the screen based on the collected reference data; varying the screen based on the determination result; and displaying an execution screen of the application requested to be executed on the varied screen.

A digital device according to an embodiment of the present invention includes: a memory; a display unit for displaying a first application execution screen and displaying a requested menu on the first application execution screen; a user interface for receiving a first signal requesting a menu display and a second signal selecting one menu item from one or more menu items, in the displayed menu, for varying between a flat screen and a curved screen; and a controller for controlling the screen to vary to the curvature corresponding to the selected menu item. The controller controls first guide data identifying the variation to be displayed on the first application execution screen before the variation ends, and controls second guide data identifying the end of the variation to be displayed on the first application execution screen after the variation ends.

A digital device according to another embodiment of the present invention includes: a memory; a user interface for receiving a setting signal for automatically varying between a flat screen and a curved screen and an application execution request signal; a control unit for collecting reference data, determining whether to vary the screen based on the collected reference data, and controlling the screen variation based on the determination result; and a display unit for displaying an execution screen of the application requested to be executed on the varied screen.

The technical solutions obtained by the present invention are not limited to the above-mentioned solutions, and other solutions not mentioned herein will be clearly understood by those skilled in the art from the following description.

The effects of the present invention are as follows.

According to one of the various embodiments of the present invention, the variable digital device can be controlled easily and conveniently by providing an intuitive user interface for variable mode operation control.

According to another embodiment of the various embodiments of the present invention, the convenience of using the variable-type digital device can be maximized by referring to and using various factors.

According to another of the various embodiments of the present invention, satisfaction with the variable digital device can be improved and the desire to purchase the product can be promoted.

The effects obtainable by the present invention are not limited to the above-mentioned effects, and other effects not mentioned herein will be clearly understood by those skilled in the art from the following description.

FIG. 1 is a schematic diagram illustrating a service system including a digital device according to an embodiment of the present invention;
FIG. 2 is a block diagram illustrating a digital device according to an embodiment of the present invention;
FIG. 3 is a block diagram illustrating a digital device according to another embodiment of the present invention;
FIG. 4 is a block diagram illustrating a digital device according to another embodiment of the present invention;
FIG. 5 is a block diagram illustrating a detailed configuration of the control unit of FIGS. 2 to 4 according to an embodiment of the present invention;
FIG. 6 illustrates input means coupled to the digital devices of FIGS. 2 to 4 according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a Web OS architecture according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating an architecture of a Web OS device according to an embodiment of the present invention;
FIG. 9 is a diagram illustrating a graphic composition flow in a Web OS device according to an embodiment of the present invention;
FIG. 10 is a diagram illustrating a media server according to an embodiment of the present invention;
FIG. 11 is a block diagram illustrating a configuration of a media server according to an embodiment of the present invention;
FIG. 12 is a diagram illustrating a relationship between a media server and a TV service according to an embodiment of the present invention;
FIGS. 13 and 16 show digital devices having a flat screen and a curved screen;
FIG. 14 illustrates a variable mode menu according to an embodiment of the present invention;
FIG. 15 illustrates input means for variable mode menu access and variable mode control according to an embodiment of the present invention;
FIG. 17 illustrates a variable mode menu configuration according to an embodiment of the present invention;
FIG. 18 is a view for explaining reference data in the configuration of a variable mode menu according to an embodiment of the present invention;
FIG. 19 is a diagram illustrating a variable mode configuration similar to that of FIG. 18;
FIGS. 20 and 28 are diagrams showing a screen configuration in a variable mode operation according to an embodiment of the present invention;
FIG. 21 is a diagram illustrating a screen configuration upon completion of a variable mode according to an embodiment of the present invention;
FIG. 22 is a diagram illustrating a screen configuration of a variable digital device 2200 according to an embodiment of the present invention;
FIGS. 23 and 24 are diagrams for explaining a variable mode menu configuration according to an embodiment of the present invention;
FIGS. 25 to 27 are diagrams for explaining a method of processing an application of a variable digital device;
FIG. 29 is a block diagram illustrating a configuration of a variable digital device according to another embodiment of the present invention;
FIG. 30 is a flowchart for explaining a data processing method in a variable digital device according to an embodiment of the present invention; and
FIG. 31 is a flowchart for explaining a data processing method in a variable digital device according to another embodiment of the present invention.

Hereinafter, a variable digital device according to the present invention and various embodiments (s) of a data processing method in the variable digital device will be described with reference to the drawings.

The suffixes "module" and "part" for components used in this specification are given only for ease of description, and the two may be used interchangeably as needed. Also, even when components are described with ordinal numbers such as "first" and "second", they are not limited by such terms or ordinal numbers.

In addition, although the terms used in this specification have been selected, in consideration of the functions according to the technical idea of the present invention, from general terms that are currently widely used, they may vary depending on the intentions or customs of those skilled in the art or the emergence of new technology. In certain cases, some terms have been arbitrarily selected by the applicant, and their meanings are described in the relevant description sections. Accordingly, each term should be interpreted based not merely on its name but on its practical meaning and the contents described throughout this specification.

It is to be noted that the contents of the present specification and/or drawings are not intended to limit the scope of the present invention.

The term "variable digital device" as used herein refers to a flexible digital device that supports both a flat mode, in which a flat screen is provided, and a variable (curved) mode, in which a flexible screen is provided.

As used herein, a "digital device" includes any device that performs at least one of, for example, transmitting/receiving, processing, and outputting data. The data includes all types of data for content, a service, an application, and the like. The digital device may be connected to another digital device, an external server, or the like through a wired/wireless network to transmit/receive the data, and the data may be converted before transmission/reception as needed. Examples of such digital devices include standing devices such as a network TV, an HBBTV (Hybrid Broadcast Broadband TV), a smart TV, an IPTV (Internet Protocol TV), and a PC (Personal Computer), and mobile or handheld devices such as a PDA (Personal Digital Assistant), a smart phone, a tablet PC, and a notebook. To facilitate understanding of the present invention, a digital TV is shown in FIG. 2 and a mobile device is described in FIG. 3 as examples of the variable digital device. In addition, the digital device may be a configuration having only a display panel, or a set configuration together with, e.g., a set-top box (STB).

In the above, the wired/wireless network collectively refers to networks for connection and data transmission/reception between a client and a server. Such wired/wireless networks include all communication networks supported by standards now or in the future, and are capable of supporting one or more communication protocols therefor. These include, for example, communication standards or protocols for wired connection such as USB (Universal Serial Bus), CVBS (Composite Video Banking Sync), Component, S-Video (Analog), DVI (Digital Visual Interface), HDMI (High Definition Multimedia Interface), RGB, and D-SUB, and communication standards or protocols for wireless connection such as Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, DLNA (Digital Living Network Alliance), WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), LTE/LTE-Advanced (Long Term Evolution/LTE-Advanced), and Wi-Fi Direct.

In addition, when a digital device is referred to simply in this specification, the term may mean a fixed device or a mobile device depending on the context, and means both unless specifically stated otherwise.

On the other hand, the digital device is an intelligent device that supports, for example, a broadcast receiving function, a computer function or support, at least one external input, and the like, and can support e-mail, web browsing, banking, games, applications, and so on through a wired/wireless network. In addition, the digital device may include an interface for supporting at least one input or control means (hereinafter, 'input means') such as a handwriting input device, a touch screen, and the like.

In addition, the digital device can use a standardized general-purpose OS (Operating System), but the digital device described in this specification uses a Web OS as an embodiment. Therefore, the digital device can handle adding, deleting, amending, and updating various services or applications on a general-purpose OS kernel or a Linux kernel, through which a more user-friendly environment can be constructed and provided.

Meanwhile, the above-described digital device can receive and process an external input. Here, the external input refers to an external input device, that is, any input means or digital device that is connected to the above-described digital device through a wired/wireless network and can transmit/receive and process data therewith. Examples of the external input include an HDMI (High Definition Multimedia Interface) device, a game device such as a PlayStation or an X-Box, a smart phone, a tablet PC, a pocket photo device such as a digital camera, a printing device, a smart TV, a Blu-ray device, and the like.

In addition, the term "server" as used herein refers to a digital device or system that supplies data to or receives data from a digital device, that is, a client, and may also be referred to as a processor. The server includes, for example, a web server or portal server providing a web page, web content, or web service, an advertising server providing advertising data, a content server providing content, an SNS server providing a social network service (SNS), a service server provided by a manufacturer, an MVPD (Multichannel Video Programming Distributor) providing a VoD (Video on Demand) or streaming service, a service server providing a pay service, and the like.

Also, even where only an application is mentioned herein for convenience, the meaning may include content, a service, and the like, such as a broadcast program, based on the context. Meanwhile, an application may refer to a web application according to the Web OS platform.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, the present invention will be described in more detail with reference to the accompanying drawings.

FIG. 1 is a schematic diagram illustrating a service system including a digital device according to an exemplary embodiment of the present invention.

Referring to FIG. 1, a service system includes a content provider 10, a service provider 20, a network provider 30, and an HNED (Home Network End User) (customer) 40. Here, the HNED 40 includes, for example, a client 100, that is, a digital device according to the present invention.

The content provider 10 produces and provides various contents. As shown in FIG. 1, the content provider 10 may include a terrestrial broadcaster, a cable SO (System Operator) or MSO (Multiple SO), a satellite broadcaster, various Internet broadcasters, personal content providers, and the like. Meanwhile, the content provider 10 can produce and provide various services, applications, and the like in addition to broadcast content.

The service provider 20 service-packetizes the content produced by the content provider 10 and provides it to the HNED 40. For example, the service provider 20 packetizes, for service, at least one of the contents produced by a first terrestrial broadcaster, a second terrestrial broadcaster, a cable MSO, a satellite broadcaster, various Internet broadcasters, applications, and the like, and provides it to the HNED 40.

The service provider 20 provides services to the client 100 in a uni-cast or multi-cast manner. Meanwhile, the service provider 20 can transmit data to a plurality of pre-registered clients 100 at once, and the IGMP (Internet Group Management Protocol) may be used for this purpose.

The above-described content provider 10 and service provider 20 may be the same entity. For example, the content produced by the content provider 10 may be service-packetized and provided to the HNED 40 by the content provider 10 itself, which thereby also performs the function of the service provider 20, or vice versa.

The network provider 30 provides a network for exchanging data between the content provider 10 and / or the service provider 20 and the client 100.

The client 100 is a consumer belonging to the HNED 40, and may, for example, construct a home network through the network provider 30 to receive data and transmit/receive data for various services and applications such as VoD and streaming.

On the other hand, the content provider 10 and/or the service provider 20 in the service system may use conditional access or content protection means for protecting transmitted content. In that case, the client 100 can use processing means such as a CableCARD (or POD: Point of Deployment), a DCAS (Downloadable CAS), or the like in response to the conditional access or content protection.

In addition, the client 100 can use a two-way service through the network. Accordingly, the client 100 may rather perform the function or role of a content provider, and the service provider 20 may receive content from the client and transmit it to another client or the like.

In FIG. 1, the content provider 10 and/or the service provider 20 may be a server that provides a service described later in this specification. In this case, the server may also own or include the network provider 30 as needed. The service or service data includes not only services or applications received from the outside but also internal services or applications, and such services or applications may be service or application data for the Web OS-based client 100.

A digital device according to an embodiment of the present invention includes: a memory; a display unit for displaying a first application execution screen and displaying a requested menu on the first application execution screen; a user interface for receiving a first signal requesting a menu display and a second signal selecting one menu item from one or more menu items, in the displayed menu, for varying between a flat screen and a curved screen; and a controller for controlling the screen to vary to the curvature corresponding to the selected menu item. The controller controls first guide data identifying the variation to be displayed on the first application execution screen before the variation ends, and controls second guide data identifying the end of the variation to be displayed on the first application execution screen after the variation ends.

The controller may control at least one of image quality and sound quality set in advance corresponding to the selected menu item to be adjusted at the time of the variation; may collect reference data and control a menu item corresponding to a screen of optimum curvature among the menu items to be displayed based on the collected reference data; and may calculate the optimum curvature based on reference data including the screen size of the digital device and the viewing distance. In addition, the controller may collect, as the reference data, data on at least one of the attribute of the application, the genre of the application, the user setting, the user, the ambient illuminance, the ambient noise, the screen size of the digital device, and the number of viewers, and may control current curvature information and recommended/optimum curvature information to be displayed.
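
The paragraph above names the reference data but no formula, so the following is a minimal sketch under assumed weightings: closer viewing and larger screens favor more curvature, while multiple off-axis viewers favor a flat screen. The function name, thresholds, and weights are all hypothetical.

```python
# Hypothetical optimum-curvature calculation from reference data
# (screen size, viewing distance, number of viewers). The weighting
# below is an assumption for illustration, not the patented method.

def optimum_curvature(screen_size_in, viewing_distance_m, viewers=1):
    """Return a recommended curvature in [0, 1], where 0.0 means flat."""
    if viewers > 2:
        return 0.0  # off-axis viewers see a distorted curved image
    # closer viewing benefits more from curvature (assumed 4 m cutoff)
    closeness = max(0.0, 1.0 - viewing_distance_m / 4.0)
    # larger screens benefit more from curvature (assumed 65 in reference)
    size_factor = min(1.0, screen_size_in / 65.0)
    return round(closeness * size_factor, 2)
```

In the claimed device this value would drive which menu item is highlighted as the recommended/optimum curvature.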

In the above, the user interface may further receive a third signal requesting power on or off after the variation, and the controller may, according to reception of the third signal, control third guide data to be displayed on the screen for changing from the current screen state to the pre-variation screen state or for maintaining the current screen state, variably control the screen according to the selection on the third guide data, and control power on or off according to the third signal. Alternatively, the user interface may receive a fourth signal requesting variable control according to a wheel movement or a pointer movement of the input means, and the controller may variably control the current screen state according to the fourth signal; when the screen is changed from the flat screen to the curved screen, the controller may control an application or menu screen other than the first application to be displayed in at least one of the corner areas produced by the variation.
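
The "fourth signal" above maps a continuous input (wheel or pointer movement) onto the screen curvature. A minimal sketch, assuming a fixed step per wheel tick and clamping to the panel's physical range (the step size and [0, 1] range are assumptions):

```python
# Hypothetical mapping of a wheel movement (fourth signal) to curvature.
# Each tick nudges the curvature by `step`, clamped to the panel's limits.

def apply_wheel(curvature, wheel_delta, step=0.1, lo=0.0, hi=1.0):
    """Return the new curvature after `wheel_delta` ticks (sign = direction)."""
    return min(hi, max(lo, curvature + wheel_delta * step))
```

A pointer-movement variant would compute `wheel_delta` from the drag distance instead of discrete ticks.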

A digital device according to another embodiment of the present invention includes: a memory; a user interface for receiving a setting signal for automatically varying between a flat screen and a curved screen and an application execution request signal; a control unit for collecting reference data, determining whether to vary the screen based on the collected reference data, and controlling the screen variation based on the determination result; and a display unit for displaying an execution screen of the application requested to be executed on the varied screen.

2 is a block diagram illustrating a digital device according to an exemplary embodiment of the present invention.

The digital device of FIG. 2 corresponds to the client 100 of FIG. 1 described above.

The digital device 200 includes a network interface unit 201, a TCP/IP manager 202, a service delivery manager 203, an SI decoder 204, a demultiplexer (demux) 205, an audio decoder 206, a video decoder 207, a display unit (A/V and OSD module) 208, a service control manager 209, a service discovery manager 210, an SI & metadata database 211, a metadata manager 212, a service manager 213, a UI manager 214, and the like.

The network interface unit 201 transmits/receives IP packets or IP datagrams (hereinafter, IP packet(s)) through a network. For example, the network interface unit 201 can receive services, applications, content, and the like from the service provider 20 of FIG. 1 through the network.

The TCP/IP manager 202 takes part in packet delivery between a source and a destination for the IP packets received by the digital device 200 and the IP packets transmitted by the digital device 200. The TCP/IP manager 202 classifies the received packet(s) according to an appropriate protocol and outputs the classified packet(s) to the service delivery manager 203, the service discovery manager 210, the service control manager 209, the metadata manager 212, and the like.

The service delivery manager 203 is responsible for controlling received service data. For example, the service delivery manager 203 may use RTP/RTCP when controlling real-time streaming data. When the real-time streaming data is transmitted using RTP, the service delivery manager 203 parses the received data packets according to RTP and transmits them to the demultiplexer 205, or stores them in the SI & metadata database 211 under the control of the service manager 213. Then, the service delivery manager 203 feeds back network reception information to the server providing the service using RTCP.

The demultiplexer 205 demultiplexes the received packets into audio data, video data, and SI (System Information) data, and transmits them to the audio/video decoders 206/207 and the SI decoder 204, respectively.
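
The demultiplexing step above can be sketched as routing packets by their PID (Packet Identifier) into audio, video, and SI queues. This is a simplified illustration; real transport-stream packets carry headers, continuity counters, and section framing that are omitted here, and the PID values are arbitrary.

```python
# Simplified sketch of the demultiplexer 205: packets tagged with a PID are
# routed to the audio decoder, video decoder, or SI decoder queues.
# Packet layout (pid, payload) and PID values are illustrative only.

def demultiplex(packets, audio_pid, video_pid, si_pids):
    out = {"audio": [], "video": [], "si": [], "dropped": []}
    for pid, payload in packets:
        if pid == audio_pid:
            out["audio"].append(payload)
        elif pid == video_pid:
            out["video"].append(payload)
        elif pid in si_pids:
            out["si"].append(payload)
        else:
            out["dropped"].append(payload)  # PID filtering discards the rest
    return out
```

The three output queues correspond to the audio decoder 206, the video decoder 207, and the SI decoder 204 in FIG. 2.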

The SI decoder 204 decodes the demultiplexed SI data, that is, service information such as PSI (Program Specific Information), PSIP (Program and System Information Protocol), DVB-SI (Digital Video Broadcasting-Service Information), and DTMB/CMMB (Digital Television Terrestrial Multimedia Broadcasting / Coding Mobile Multimedia Broadcasting). Also, the SI decoder 204 may store the decoded service information in the SI & metadata database 211. The stored service information can be read out and used by a corresponding configuration, for example, at a user's request.

The audio/video decoders 206/207 decode the demultiplexed audio data and video data, respectively. The decoded audio data and video data are provided to the user through the display unit 208.

The application manager may include, for example, the UI manager 214 and the service manager 213, and may perform the function of a controller of the digital device 200. In other words, the application manager can manage the overall state of the digital device 200, provide a user interface (UI), and manage other managers.

The UI manager 214 provides a GUI (Graphic User Interface) / UI for a user using an OSD (On Screen Display) or the like, and receives a key input from a user to perform a device operation according to the input. For example, the UI manager 214 receives the key input regarding the channel selection from the user, and transmits the key input signal to the service manager 213.

The service manager 213 controls the manager associated with the service such as the service delivery manager 203, the service discovery manager 210, the service control manager 209, and the metadata manager 212.

In addition, the service manager 213 generates a channel map and controls selection of a channel using the generated channel map according to a key input received from the UI manager 214. The service manager 213 receives service information from the SI decoder 204 and sets the audio/video PID (Packet Identifier) of the selected channel in the demultiplexer 205. The PID thus set can be used in the demultiplexing process described above; accordingly, the demultiplexer 205 filters (PID or section filtering) the audio data, video data, and SI data using the PID.
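
The channel-selection step above can be sketched as follows: the service manager looks the selected channel up in its channel map and programs the demultiplexer with that channel's audio/video PIDs. The channel-map contents and PID values are illustrative; in the actual device they are derived from the service information decoded by the SI decoder 204.

```python
# Illustrative sketch of channel selection by the service manager 213:
# look up the channel in the channel map, then set the A/V PIDs in the demux.
# CHANNEL_MAP contents and PID values are hypothetical.

CHANNEL_MAP = {7: {"audio_pid": 0x1100, "video_pid": 0x1011},
               9: {"audio_pid": 0x1200, "video_pid": 0x1021}}

class Demux:
    def __init__(self):
        self.audio_pid = None
        self.video_pid = None

    def set_pids(self, audio_pid, video_pid):
        # PIDs programmed here drive the PID filtering described above
        self.audio_pid, self.video_pid = audio_pid, video_pid

def select_channel(demux, channel):
    info = CHANNEL_MAP[channel]  # derived from decoded SI in practice
    demux.set_pids(info["audio_pid"], info["video_pid"])

d = Demux()
select_channel(d, 7)
```

After this step the demultiplexer passes only the programmed PIDs to the audio and video decoders.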

The service discovery manager 210 provides information necessary for selecting a service provider that provides the service. Upon receiving a signal regarding channel selection from the service manager 213, the service discovery manager 210 searches for the service using the information.

The service control manager 209 is responsible for the selection and control of services. For example, the service control manager 209 uses IGMP or RTSP when the user selects a live broadcasting service of an existing broadcasting system, and uses RTSP when the user selects a service such as VOD (Video on Demand). The RTSP protocol may provide a trick mode for real-time streaming. In addition, the service control manager 209 can initialize and manage a session through the IMS gateway 250 using an IP Multimedia Subsystem (IMS) and a Session Initiation Protocol (SIP). These protocols are one embodiment, and other protocols may be used depending on the implementation.
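One simple selection policy consistent with the description above (IGMP for live multicast services, RTSP for on-demand services needing trick modes) can be sketched as follows. The service-type names and the function itself are illustrative assumptions.

```python
def select_service_protocol(service_type: str) -> str:
    """Pick a control protocol following the rule described above:
    IGMP for live (multicast) broadcasting services, RTSP for
    on-demand services such as VOD, where trick modes such as
    pause and seek are needed."""
    live_services = {"live_broadcast", "channel"}
    on_demand_services = {"vod", "timeshift"}
    if service_type in live_services:
        return "IGMP"
    if service_type in on_demand_services:
        return "RTSP"
    raise ValueError(f"unknown service type: {service_type}")
```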

The metadata manager 212 manages the metadata associated with the service and stores the metadata in the SI & metadata database 211.

The SI & metadata database 211 stores the service information decoded by the SI decoder 204, the metadata managed by the metadata manager 212, and the information necessary for selecting a service provider, which is provided by the service discovery manager 210. In addition, the SI & metadata database 211 may store set-up data for the system and the like.

The SI & metadata database 211 may be implemented using a non-volatile RAM (NVRAM), a flash memory, or the like.

Meanwhile, the IMS gateway 250 is a gateway that collects functions necessary for accessing the IMS-based IPTV service.

FIG. 3 is a block diagram illustrating a digital device according to another embodiment of the present invention.

While FIG. 2 described above illustrates a standing (fixed) device as an example of the digital device, FIG. 3 shows a mobile device as another embodiment of the digital device.

Referring to FIG. 3, the mobile device 300 includes a wireless communication unit 310, an A/V (Audio/Video) input unit 320, a user input unit 330, a sensing unit 340, an output unit 350, a memory 360, an interface unit 370, a control unit 380, a power supply unit 390, and the like.

Hereinafter, each component will be described in detail.

The wireless communication unit 310 may include one or more modules that enable wireless communication between the mobile device 300 and a wireless communication system, or between the mobile device and a network in which the mobile device is located. For example, the wireless communication unit 310 may include a broadcast receiving module 311, a mobile communication module 312, a wireless Internet module 313, a short-range communication module 314, and a location information module 315.

The broadcast receiving module 311 receives broadcast signals and/or broadcast-related information from an external broadcast management server through a broadcast channel. Here, the broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast-related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 312.

The broadcast-related information may exist in various forms, for example, in the form of an EPG (Electronic Program Guide) or an ESG (Electronic Service Guide).

The broadcast receiving module 311 may receive digital broadcast signals using digital broadcasting systems such as ATSC, Digital Video Broadcasting-Terrestrial (DVB-T), Digital Video Broadcasting-Satellite (DVB-S), Media Forward Link Only (MediaFLO), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 311 may be adapted not only to the above-described digital broadcasting systems but also to other broadcasting systems.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 311 may be stored in the memory 360.

The mobile communication module 312 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The wireless Internet module 313 refers to a module for wireless Internet access, and may be internal or external to the mobile device 300. WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like can be used as wireless Internet technologies.

The short-range communication module 314 is a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, RS-232, RS-485, and the like can be used as short-range communication technologies.

The location information module 315 is a module for acquiring location information of the mobile device 300, and may be, for example, a Global Positioning System (GPS) module.

The A/V input unit 320 is for inputting audio and/or video signals, and may include a camera 321, a microphone 322, and the like. The camera 321 processes image frames such as still images or moving images obtained by an image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 351.

The image frames processed by the camera 321 may be stored in the memory 360 or transmitted to the outside via the wireless communication unit 310. At least two cameras 321 may be provided depending on the use environment.

The microphone 322 receives an external sound signal by a microphone in a communication mode, a recording mode, a voice recognition mode, or the like, and processes it as electrical voice data. The processed voice data can be converted into a form that can be transmitted to the mobile communication base station through the mobile communication module 312 in the case of the communication mode, and output. The microphone 322 may be implemented with various noise reduction algorithms for eliminating noise generated in receiving an external sound signal.

The user input unit 330 generates input data for a user to control the operation of the terminal. The user input unit 330 may include a key pad, a dome switch, a touch pad (static pressure / static electricity), a jog wheel, a jog switch, and the like.

The sensing unit 340 senses the current state of the mobile device 300, such as the open/closed state of the mobile device 300, the position of the mobile device 300, the presence or absence of user contact, and the orientation and acceleration/deceleration of the mobile device, and generates a sensing signal for controlling the operation of the mobile device 300. For example, when the mobile device 300 is moved or tilted, the sensing unit 340 may sense the position, slope, and the like of the mobile device. It may also sense whether power is supplied by the power supply unit 390, whether the interface unit 370 is connected to an external device, and the like. Meanwhile, the sensing unit 340 may include a proximity sensor 341 including NFC (Near Field Communication).

The output unit 350 is for generating output related to the visual, auditory, or tactile senses, and may include a display unit 351, a sound output module 352, an alarm unit 353, and a haptic module 354.

The display unit 351 displays (outputs) information processed by the mobile device 300. For example, if the mobile device is in a call mode, the display unit 351 displays a UI or GUI associated with the call. When the mobile device 300 is in a video call mode or a photographing mode, it displays the photographed and/or received images, or the corresponding UI and GUI.

The display unit 351 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display.

Some of these displays may be of a transparent or light-transmissive type so that the outside can be seen through them. Such a display can be referred to as a transparent display, and a typical example of the transparent display is the TOLED (Transparent OLED). The rear structure of the display unit 351 may also be of a light-transmissive type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 351 of the terminal body.

There may be two or more display units 351 depending on the implementation of the mobile device 300. For example, in the mobile device 300, a plurality of display units may be spaced apart from one another or arranged integrally on one surface, or may be disposed on different surfaces, respectively.

When the display unit 351 and a sensor for sensing a touch operation (hereinafter referred to as a 'touch sensor') form a mutual layer structure (hereinafter referred to as a 'touch screen'), the display unit 351 can also be used as an input device in addition to an output device. The touch sensor may have the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display portion 351 or a capacitance generated in a specific portion of the display portion 351 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

When there is a touch input to the touch sensor, the corresponding signal(s) is sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the control unit 380. Thus, the control unit 380 can know which area of the display unit 351 has been touched.

A proximity sensor 341 may be disposed in an inner area of the mobile device covered by the touch screen, or near the touch screen. The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface, or an object existing nearby, without mechanical contact, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer life span than a contact sensor and its utilization is also higher.

Examples of the proximity sensor include a transmission-type photoelectric sensor, a direct reflection-type photoelectric sensor, a mirror reflection-type photoelectric sensor, a high-frequency oscillation-type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is of an electrostatic type, it is configured to detect the proximity of the pointer by a change in the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned over the touch screen without the pointer contacting the touch screen is referred to as a "proximity touch," and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch." The position at which the pointer is proximity-touched on the touch screen means the position at which the pointer corresponds vertically to the touch screen when the pointer is proximity-touched.
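The distinction above can be sketched as a simple classification over the pointer's vertical distance from the screen. The threshold values below are illustrative assumptions (the patent does not specify them), and the function name is hypothetical.

```python
def classify_touch(distance_mm: float,
                   contact_threshold_mm: float = 0.0,
                   proximity_range_mm: float = 30.0) -> str:
    """Classify a pointer position reported by the proximity sensor.

    distance_mm is the vertical distance between the pointer and the
    touch screen; thresholds are illustrative, not from the patent.
    """
    if distance_mm <= contact_threshold_mm:
        return "contact touch"    # pointer actually touches the screen
    if distance_mm <= proximity_range_mm:
        return "proximity touch"  # pointer hovers within sensing range
    return "no touch"
```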

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The sound output module 352 can output audio data received from the wireless communication unit 310 or stored in the memory 360 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 352 also outputs sound signals associated with functions performed on the mobile device 300 (e.g., a call signal reception tone, a message reception tone, etc.). The sound output module 352 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 353 outputs a signal for notifying the occurrence of an event of the mobile device 300. Examples of events occurring in the mobile device include reception of a call signal, reception of a message, input of a key signal, and a touch input. The alarm unit 353 may output a signal for notifying the occurrence of an event in a form other than a video signal or an audio signal, for example, by vibration. Since the video signal or the audio signal may also be output through the display unit 351 or the sound output module 352, these units may be classified as a part of the alarm unit 353.

The haptic module 354 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 354 is vibration. The intensity and pattern of the vibration generated by the haptic module 354 are controllable. For example, different vibrations may be synthesized and output, or output sequentially. In addition to vibration, the haptic module 354 can generate various other effects, such as the effect of a pin arrangement moving vertically against the contacted skin surface, the spraying or suction force of air through an injection port or a suction port, a graze against the skin surface, contact with an electrode, stimulus by an electrostatic force, and the effect of reproducing a cool or warm feeling using an element capable of absorbing or generating heat. The haptic module 354 can be implemented not only to transmit a tactile effect through direct contact but also to allow the user to feel a tactile effect through the muscular sense of a finger or an arm. More than one haptic module 354 may be provided depending on the configuration of the mobile device 300.

The memory 360 may store a program for the operation of the control unit 380 and temporarily store input / output data (e.g., phone book, message, still image, moving picture, etc.). The memory 360 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 360 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The mobile device 300 may also operate in association with a web storage that performs the storage function of the memory 360 on the Internet.

The interface unit 370 serves as a pathway to all external devices connected to the mobile device 300. The interface unit 370 receives data or power from an external device and transfers it to each component in the mobile device 300, or transmits data in the mobile device 300 to an external device. For example, the interface unit 370 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio I/O port, a video I/O port, an earphone port, and the like.

The identification module is a chip that stores various information for authenticating the usage right of the mobile device 300, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter referred to as an 'identification device') can be manufactured in a smart card format. Accordingly, the identification device can be connected to the mobile device 300 through the port.

When the mobile device 300 is connected to an external cradle, the interface unit 370 may be a path through which power from the cradle is supplied to the mobile device 300, or a path through which various command signals input by the user at the cradle are transferred to the mobile device. The various command signals or the power input from the cradle may operate as a signal for recognizing that the mobile device has been correctly mounted on the cradle.

The control unit 380 typically controls the overall operation of the mobile device 300. For example, the control unit 380 performs control and processing related to voice calls, data communication, video calls, and the like. The control unit 380 may include a multimedia module 381 for multimedia playback. The multimedia module 381 may be implemented within the control unit 380 or separately from the control unit 380. The control unit 380 can also perform pattern recognition processing for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 390 receives external power and internal power under the control of the controller 380 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the embodiments described herein may be implemented by the control unit 380 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. The software code may be implemented as a software application written in a suitable programming language. The software code may be stored in the memory 360 and executed by the control unit 380.

FIG. 4 is a block diagram illustrating a digital device according to another embodiment of the present invention.

Another example of the digital device 400 includes a broadcast receiving unit 405, an external device interface unit 435, a storage unit 440, a user input interface unit 450, a control unit 470, a display unit 480, an audio output unit 485, a power supply unit 490, and a photographing unit (not shown). The broadcast receiving unit 405 may include at least one tuner 410, a demodulator 420, and a network interface unit 430. In some cases, the broadcast receiving unit 405 may include the tuner 410 and the demodulator 420 but not the network interface unit 430, or vice versa. Although not shown, the broadcast receiving unit 405 may include a multiplexer to multiplex a signal demodulated by the demodulator 420 via the tuner 410 with a signal received via the network interface unit 430. Also, although not shown, the broadcast receiving unit 405 may include a demultiplexer to demultiplex the multiplexed signal, the demodulated signal, or the signal that has passed through the network interface unit 430.

The tuner 410 tunes a channel selected by the user or all pre-stored channels of an RF (Radio Frequency) broadcast signal received through the antenna, and receives the RF broadcast signal. In addition, the tuner 410 converts the received RF broadcast signal into an intermediate frequency (IF) signal or a baseband signal.

For example, if the received RF broadcast signal is a digital broadcast signal, the tuner 410 converts it into a digital IF signal (DIF); if the received RF broadcast signal is an analog broadcast signal, the tuner 410 converts it into an analog baseband video or audio signal (CVBS/SIF). That is, the tuner 410 can process both digital broadcast signals and analog broadcast signals. The analog baseband video or audio signal (CVBS/SIF) output from the tuner 410 can be directly input to the control unit 470.
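The tuner's branch between digital and analog output formats can be modeled as a simple sketch. The dictionary keys and function name are illustrative assumptions; real tuner hardware operates on RF signals, not dictionaries.

```python
def tuner_convert(rf_signal: dict) -> dict:
    """Model the branch above: digital RF broadcast signals become a
    digital IF signal (DIF); analog ones become an analog baseband
    video/audio signal (CVBS/SIF)."""
    if rf_signal["modulation"] == "digital":
        return {"type": "DIF", "payload": rf_signal["payload"]}
    return {"type": "CVBS/SIF", "payload": rf_signal["payload"]}
```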

In addition, the tuner 410 can receive RF broadcast signals of a single carrier or of multiple carriers. Meanwhile, the tuner 410 may sequentially tune to and receive the RF broadcast signals of all broadcast channels stored through a channel storage function among the RF broadcast signals received through the antenna, and convert them into intermediate frequency signals or baseband signals (DIF: Digital Intermediate Frequency or baseband signal).

The demodulator 420 may receive and demodulate the digital IF signal (DIF) converted by the tuner 410 and perform channel decoding. To this end, the demodulator 420 may include, for example, a trellis decoder, a de-interleaver, and a Reed-Solomon decoder, or a convolution decoder, a de-interleaver, and a Reed-Solomon decoder.

The demodulator 420 may perform demodulation and channel decoding, and then output a stream signal (TS). At this time, the stream signal may be a signal in which a video signal, an audio signal, or a data signal is multiplexed. For example, the stream signal may be an MPEG-2 TS (Transport Stream) in which an MPEG-2 standard video signal, a Dolby AC-3 standard audio signal, and the like are multiplexed.

The stream signal output from the demodulator 420 may be input to the control unit 470. The control unit 470 performs demultiplexing, video/audio signal processing, and the like, and then outputs video through the display unit 480 and audio through the audio output unit 485.

The external device interface unit 435 provides an interface environment between the digital device 400 and various external devices. To this end, the external device interface unit 435 may include an A/V input/output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 435 can be connected by wire or wirelessly to an external device such as a digital versatile disk (DVD) player, a Blu-ray player, a game device, a camera, a camcorder, a computer (notebook), a tablet PC, a smartphone, a Bluetooth device, or a cloud. The external device interface unit 435 transmits a signal including data such as images, video, and voice, input through the connected external device, to the control unit 470 of the digital device. The control unit 470 can control the processed image, video, voice, and the like to be output to the connected external device.

The A/V input/output unit may include a USB terminal, a CVBS (Composite Video Banking Sync) terminal, a component terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like.

The wireless communication unit can perform short-range wireless communication with another digital device. The digital device 400 may be networked with other digital devices according to communication protocols such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Digital Living Network Alliance (DLNA).

Also, the external device interface unit 435 may be connected to the set-top box STB through at least one of the various terminals described above to perform input / output operations with the set-top box STB.

Meanwhile, the external device interface unit 435 may receive an application or an application list in an adjacent external device, and may transmit the received application or application list to the control unit 470 or the storage unit 440.

The network interface unit 430 provides an interface for connecting the digital device 400 to a wired/wireless network including the Internet. The network interface unit 430 may include an Ethernet terminal or the like for connection with a wired network, and may use communication standards such as WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access) for connection with a wireless network.

The network interface unit 430 can transmit or receive data to or from another user or another digital device via the connected network, or via another network linked to the connected network. In particular, some of the content data stored in the digital device 400 can be transmitted to a user or digital device selected from among other users or other digital devices previously registered with the digital device 400.

Meanwhile, the network interface unit 430 can access a predetermined web page through the connected network or another network linked to the connected network. That is, it is possible to access a predetermined web page through a network and transmit or receive data with the server. In addition, content or data provided by a content provider or a network operator may be received. That is, it can receive content and related information such as movies, advertisements, games, VOD, broadcasting signals, etc., provided from a content provider or a network provider through a network. In addition, it can receive update information and an update file of firmware provided by the network operator. It may also transmit data to the Internet or to a content provider or network operator.

In addition, the network interface unit 430 can select and receive a desired application from the open applications through the network.

The storage unit 440 may store a program for each signal processing and control in the control unit 470 or may store a signal-processed video, audio, or data signal.

The storage unit 440 may also perform a function of temporarily storing video, audio, or data signals input from the external device interface unit 435 or the network interface unit 430. The storage unit 440 can also store information on a predetermined broadcast channel through a channel memory function.

The storage unit 440 may store an application or an application list input from the external device interface unit 435 or the network interface unit 430.

In addition, the storage unit 440 may store various platforms described later.

The storage unit 440 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), a RAM, and a ROM (e.g., EEPROM). The digital device 400 may reproduce a content file (a moving image file, a still image file, a music file, a document file, an application file, etc.) stored in the storage unit 440 and provide it to the user.

Although FIG. 4 illustrates an embodiment in which the storage unit 440 is provided separately from the control unit 470, the present invention is not limited thereto. In other words, the storage unit 440 may be included in the control unit 470.

The user input interface unit 450 transfers a signal input by the user to the controller 470 or a signal from the controller 470 to the user.

For example, the user input interface unit 450 may receive and process control signals for power on/off, channel selection, screen setting, and the like from the remote control device 500, or may transmit control signals of the control unit 470 to the remote control device 500, according to various communication methods such as an RF communication method and an infrared (IR) communication method.

In addition, the user input interface unit 450 can transmit control signals input from local keys (not shown), such as a power key, a channel key, a volume key, and a setting key, to the control unit 470.

The user input interface unit 450 can transmit a control signal input from a sensing unit (not shown) that senses a gesture of the user to the control unit 470, or transmit a signal of the control unit 470 to the sensing unit (not shown). Here, the sensing unit (not shown) may include a touch sensor, a voice sensor, a position sensor, a motion sensor, and the like.

The control unit 470 may demultiplex the stream input through the tuner 410, the demodulator 420, or the external device interface unit 435, or process the demultiplexed signals, to generate and output signals for video or audio output.

The video signal processed by the control unit 470 may be input to the display unit 480 and displayed as an image corresponding to the video signal. The video signal processed by the control unit 470 may also be input to an external output device through the external device interface unit 435.

The audio signal processed by the control unit 470 may be output as audio to the audio output unit 485. The audio signal processed by the control unit 470 may also be input to an external output device through the external device interface unit 435.

Although not shown in FIG. 4, the control unit 470 may include a demultiplexing unit, an image processing unit, and the like.

The control unit 470 can control the overall operation of the digital device 400. For example, the control unit 470 may control the tuner 410 to tune to an RF broadcast corresponding to a channel selected by the user or to a previously stored channel.

The control unit 470 can control the digital device 400 by a user command input through the user input interface unit 450 or by an internal program. In particular, the control unit 470 can access the network and allow the user to download a desired application or application list into the digital device 400.

For example, the control unit 470 controls the tuner 410 so that a signal of the channel selected according to a predetermined channel selection command received through the user input interface unit 450 is input, and processes the video, audio, or data signals of the selected channel. The control unit 470 controls the display unit 480 or the audio output unit 485 to output the processed video or audio signal together with the channel information selected by the user.

In addition, the control unit 470 can control, according to an external device image playback command received through the user input interface unit 450, a video signal or an audio signal input from an external device, such as a camera or a camcorder, through the external device interface unit 435 to be output through the display unit 480 or the audio output unit 485.

Meanwhile, the control unit 470 can control the display unit 480 to display an image. For example, the control unit 470 can control a broadcast image input through the tuner 410, an external input image input through the external device interface unit 435, an image input through the network interface unit, or an image stored in the storage unit 440 to be displayed on the display unit 480. At this time, the image displayed on the display unit 480 may be a still image or a moving image, and may be a 2D image or a 3D image.

In addition, the control unit 470 can control to reproduce the content. The content at this time may be the content stored in the digital device 400, or the received broadcast content, or an external input content input from the outside. The content may be at least one of a broadcast image, an external input image, an audio file, a still image, a connected web screen, and a document file.

Meanwhile, when entering an application view item, the control unit 470 can control display of applications or a list of applications that are in the digital device 400 or downloadable from an external network.

In addition to various user interfaces, the control unit 470 can control installation and execution of an application downloaded from an external network. Further, upon the user's selection, the control unit 470 can control the display unit 480 to display an image related to the executed application.

Although not shown in the drawing, a channel browsing processing unit for generating a thumbnail image corresponding to a channel signal or an external input signal may be further provided.

The channel browsing processing unit receives the stream signal TS output from the demodulation unit 320 or the stream signal output from the external device interface unit 335, and extracts an image from the input stream signal to generate a thumbnail image. The generated thumbnail image may be input to the control unit 470 as it is, or may be encoded before being input. In addition, the generated thumbnail image may be encoded in a stream form and input to the control unit 470. The control unit 470 can display a thumbnail list having a plurality of thumbnail images on the display unit 480 using the input thumbnail images. Meanwhile, the thumbnail images in this thumbnail list can be updated sequentially or simultaneously. Accordingly, the user can easily grasp the contents of a plurality of broadcast channels.
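For illustration, the thumbnail workflow above can be sketched as follows. This is a minimal model, not the patent's implementation; the names `ChannelBrowser` and `extract_thumbnail` are assumptions, and a real channel browsing processor would decode an actual frame from the stream.

```python
def extract_thumbnail(stream):
    """Pick a representative image from a decoded stream (here simply the
    first frame); a real processor would decode, e.g., an I-frame."""
    return stream[0] if stream else None

class ChannelBrowser:
    """Keeps one thumbnail per channel so the control unit can display a
    thumbnail list; entries can be updated sequentially or all at once."""
    def __init__(self):
        self.thumbnails = {}

    def update(self, channel, stream):
        thumb = extract_thumbnail(stream)
        if thumb is not None:
            self.thumbnails[channel] = thumb

    def thumbnail_list(self):
        # Ordered list handed to the control unit for on-screen display.
        return [self.thumbnails[ch] for ch in sorted(self.thumbnails)]
```

Updating a single channel's entry leaves the other thumbnails untouched, which is what allows the sequential update mentioned above.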

The display unit 480 converts the image signal, the data signal, or the OSD signal processed by the control unit 470, or the video signal and the data signal received from the external device interface unit 435, into R, G, and B signals, respectively, thereby generating a driving signal.

The display unit 480 may be a PDP, an LCD, an OLED, a flexible display, a 3D display, or the like.

Meanwhile, the display unit 480 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 485 receives a signal processed by the control unit 470, for example, a stereo signal, a 3.1-channel signal, or a 5.1-channel signal, and outputs it as audio. The audio output unit 485 may be implemented as various types of speakers.

In order to detect a gesture of the user, a sensing unit (not shown) having at least one of a touch sensor, a voice sensor, a position sensor, and a motion sensor may be further provided in the digital device 400. A signal sensed by the sensing unit (not shown) may be transmitted to the control unit 470 through the user input interface unit 450.

On the other hand, a photographing unit (not shown) for photographing a user may be further provided. The image information photographed by the photographing unit (not shown) may be input to the control unit 470.

The control unit 470 may detect the gesture of the user from the image photographed by the photographing unit (not shown) or the signal sensed by the sensing unit (not shown), individually or in combination.

The power supply unit 490 supplies the corresponding power to the digital device 400.

In particular, the power supply unit 490 can supply power to the control unit 470, which can be implemented in the form of a system on chip (SoC), to the display unit 480 for displaying an image, and to the audio output unit 485 for outputting audio.

To this end, the power supply unit 490 may include a converter (not shown) for converting AC power into DC power. Meanwhile, for example, when the display unit 480 is implemented as a liquid crystal panel having a plurality of backlight lamps, the power supply unit 490 may further include an inverter (not shown) capable of PWM (Pulse Width Modulation) operation for variable luminance or dimming driving.
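The relationship between a requested luminance and the PWM duty cycle used for dimming can be sketched as below. The linear model is an illustrative simplification; real backlight drivers apply calibration curves, and the function name is an assumption.

```python
def pwm_duty_for_luminance(target, maximum):
    """Duty cycle (0.0-1.0) a PWM-driven inverter would use to dim a
    backlight to the requested luminance, assuming a purely linear
    luminance-vs-duty model for illustration."""
    if maximum <= 0 or not 0 <= target <= maximum:
        raise ValueError("target must lie within [0, maximum]")
    return target / maximum
```

For example, half the maximum luminance corresponds to a 50% duty cycle under this linear assumption.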

The remote control device 500 transmits the user input to the user input interface unit 450. To this end, the remote control device 500 can use Bluetooth, RF (radio frequency) communication, infrared (IR) communication, UWB (Ultra Wideband), ZigBee, or the like.

Also, the remote control device 500 can receive the video, audio, or data signal output from the user input interface unit 450 and display it on the remote control device 500 or output sound or vibration.

The digital device 400 may be a digital broadcast receiver capable of processing digital broadcast signals of a fixed or mobile ATSC scheme or a DVB scheme.

In addition, the digital device according to the present invention may omit some of the components shown in the figure, or may further include components not shown. Meanwhile, unlike the above, the digital device may not include a tuner and a demodulator, and may receive and reproduce content through the network interface unit or the external device interface unit.

FIG. 5 is a block diagram illustrating a detailed configuration of the control unit of FIGS. 2 to 4 according to an embodiment of the present invention.

An example of the control unit may include a demultiplexer 510, an image processor 520, an OSD generator 540, a mixer 550, a frame rate converter (FRC) 555, and a formatter 560. The control unit may further include a voice processing unit (not shown) and a data processing unit (not shown).

The demultiplexer 510 demultiplexes an input stream. For example, the demultiplexer 510 can demultiplex an input MPEG-2 TS into video, audio, and data signals. Here, the stream signal input to the demultiplexer 510 may be a stream signal output from a tuner, a demodulator, or an external device interface unit.
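The demultiplexing step above can be sketched as follows. In a real MPEG-2 TS, elementary streams are identified by PIDs carried in each 188-byte packet header; here each packet is simplified to a (pid, payload) pair, and the PID table is an illustrative assumption.

```python
def demultiplex(packets, pid_map):
    """Split a TS-like sequence of (pid, payload) packets into elementary
    streams according to a PID -> stream-name table; packets with unknown
    PIDs are dropped, as a demultiplexer ignores unselected PIDs."""
    streams = {name: [] for name in pid_map.values()}
    for pid, payload in packets:
        name = pid_map.get(pid)
        if name is not None:
            streams[name].append(payload)
    return streams
```

The separated video, audio, and data streams would then be handed to the image processor, voice processing unit, and data processing unit, respectively.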

The image processor 520 performs image processing of the demultiplexed image signal. To this end, the image processor 520 may include a video decoder 525 and a scaler 535.

The video decoder 525 decodes the demultiplexed video signal, and the scaler 535 scales the resolution of the decoded video signal so that it can be output on the display unit.

The video decoder 525 can support various standards. For example, the video decoder 525 performs the function of an MPEG-2 decoder when the video signal is encoded in the MPEG-2 standard, and performs the function of an H.264 decoder when the video signal is encoded in the H.264 standard, as in DMB (Digital Multimedia Broadcasting).

Meanwhile, the video signal decoded by the image processor 520 is input to the mixer 550.

The OSD generator 540 generates OSD data according to a user input or by itself. For example, the OSD generator 540 generates data for displaying various data in graphic or text form on the screen of the display unit 480 based on a control signal of the user input interface unit. The generated OSD data includes various data such as a user interface screen of the digital device, various menu screens, widgets, icons, and viewing rate information. The OSD generator 540 can also generate data for displaying captions of a broadcast image or broadcast information based on an EPG.

The mixer 550 mixes the OSD data generated by the OSD generator 540 with the image signal processed by the image processor, and provides the mixed image to the formatter 560. Since the decoded video signal and the OSD data are mixed, the OSD is displayed overlaid on the broadcast image or the external input image.
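The overlay performed by the mixer can be sketched as a per-pixel operation. This is a deliberately simple model: frames are flat pixel lists, and `None` in the OSD plane stands for a transparent pixel; a real mixer would blend with alpha values.

```python
def mix(video_pixels, osd_pixels):
    """Overlay OSD data on a decoded video frame: wherever the OSD plane
    has a pixel, it wins; None in the OSD plane means 'transparent, show
    the underlying video pixel'."""
    if len(video_pixels) != len(osd_pixels):
        raise ValueError("frames must have the same size")
    return [osd if osd is not None else vid
            for vid, osd in zip(video_pixels, osd_pixels)]
```

The mixed frame is what the formatter then adapts to the display's output format.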

The frame rate converter (FRC) 555 converts the frame rate of an input image. For example, the frame rate converter 555 can convert the frame rate of an input 60 Hz image into a frame rate of 120 Hz or 240 Hz in accordance with the output frequency of the display unit. There are various methods for converting the frame rate. For example, when converting the frame rate from 60 Hz to 120 Hz, the frame rate converter 555 can insert the same first frame between the first frame and the second frame, or insert a third frame predicted from the first frame and the second frame. As another example, when converting the frame rate from 60 Hz to 240 Hz, the frame rate converter 555 can insert three further identical or predicted frames between existing frames. Meanwhile, when no frame conversion is performed, the frame rate converter 555 may be bypassed.
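The frame-repetition variant of the conversion above can be sketched as follows; the function name is illustrative, and a real FRC may insert motion-predicted frames instead of repeating identical ones.

```python
def convert_frame_rate(frames, in_hz, out_hz):
    """Frame-repetition FRC: each input frame is emitted out_hz/in_hz
    times (e.g. twice for 60 -> 120 Hz, four times for 60 -> 240 Hz).
    When the rates already match, the converter is bypassed."""
    if out_hz == in_hz:
        return list(frames)  # bypass: no conversion performed
    if out_hz % in_hz != 0:
        raise ValueError("sketch handles integer rate multiples only")
    factor = out_hz // in_hz
    out = []
    for frame in frames:
        out.extend([frame] * factor)  # repeat each frame 'factor' times
    return out
```

Doubling 60 Hz input yields two copies of every frame, matching the 120 Hz case described above.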

The formatter 560 changes the output of the frame rate converter 555 to match the output format of the display unit. For example, the formatter 560 can output R, G, and B data signals, and these R, G, and B data signals can be output as low voltage differential signaling (LVDS) or mini-LVDS. If the output of the frame rate converter 555 is a 3D video signal, the formatter 560 can configure and output a 3D format according to the output format of the display unit, thereby supporting a 3D service through the display unit.

Meanwhile, the voice processing unit (not shown) in the control unit can perform voice processing of the demultiplexed audio signal. Such a voice processing unit (not shown) can support processing of various audio formats. For example, even when the audio signal is encoded in a format such as MPEG-2, MPEG-4, AAC, HE-AAC, AC-3, or BSAC, a corresponding decoder can be provided.

In addition, the voice processing unit (not shown) in the control unit can process bass, treble, volume control, and the like.

The data processing unit (not shown) in the control unit can perform data processing of the demultiplexed data signal. For example, the data processing unit can decode the demultiplexed data signal when it is encoded. Here, the encoded data signal may be EPG information including broadcast information such as the start time and end time of broadcast programs broadcast on each channel.

On the other hand, the above-described digital device is an example according to the present invention, and each component can be integrated, added, or omitted according to specifications of a digital device actually implemented. That is, if necessary, two or more components may be combined into one component, or one component may be divided into two or more components. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and devices thereof do not limit the scope of rights of the present invention.

Meanwhile, the digital device may be an image signal processing device that performs signal processing on an image stored in the device or an input image. Other examples of the image signal processing device include a set-top box (STB), a DVD player, a Blu-ray player, a game device, a computer, and the like.

FIG. 6 is a diagram illustrating input means coupled to the digital device of FIGS. 2 through 4 according to one embodiment of the present invention.

A front panel (not shown) provided on the digital device 600 or a separate control means (input means) is used to control the digital device 600.

The control means includes a remote controller 610, a keyboard 630, a pointing device 620, a touch pad, and the like, which are implemented mainly for the purpose of controlling the digital device 600 as user interface devices (UIDs), but may also include a control means dedicated to an external input device connected to the digital device 600. In addition, the control means may include a mobile device such as a smart phone or a tablet PC that controls the digital device 600 through mode switching or the like, even though its primary purpose is not control of the digital device 600. In the following description, a pointing device is described as an example, but the present invention is not limited thereto.

The input means can employ at least one of communication protocols such as Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, and DLNA (Digital Living Network Alliance), as needed, to communicate with the digital device.

The remote controller 610 is a conventional input means provided with various key buttons necessary for controlling the digital device 600.

The pointing device 620 may include a gyro sensor or the like, implements a pointer on the screen of the digital device 600 corresponding to the user's motion, and transmits a control command based on that motion. Such a pointing device 620 may be called by various names such as a magic remote controller or a magic controller.

Since the digital device 600 is an intelligent integrated digital device that provides various services such as a web browser, applications, and SNS (Social Network Service) beyond the conventional broadcasting-only device, it is not easy to control it with a conventional remote controller alone; the keyboard 630 is therefore implemented similarly to a PC keyboard to complement the input means and improve input convenience for text and the like.

Meanwhile, the control means such as the remote controller 610, the pointing device 620, and the keyboard 630 may be provided with a touch pad as needed, and may thereby be used for more convenient and various control purposes such as text input and pointer movement.

The digital device described in this specification uses a Web OS as its OS and/or platform. Hereinafter, processing such as Web OS-based configurations or algorithms can be performed by the control unit of the above-described digital device. Here, the control unit is used as a broad concept that includes the control units of FIGS. 2 to 5 described above. Accordingly, in the following, the hardware and components, including related software, firmware, and the like, for processing Web OS-related services, applications, and content in the digital device are referred to as a controller.

Such a Web OS-based platform can enhance development independence and functional expandability by integrating services, applications, and the like based on, for example, a Luna-service bus, and can thereby increase development productivity. Also, multi-tasking can be supported by efficiently utilizing system resources and the like through Web OS processes and resource management.

Meanwhile, the Web OS platform described in this specification can be used not only in fixed devices such as a PC, a TV, and a set-top box (STB) but also in mobile devices such as mobile phones, smart phones, tablet PCs, notebooks, and wearable devices.

The structure of software for digital devices has evolved from a monolithic structure that solved problems on a market-dependent, per-product basis, with a single process and closed products based on multi-threading, which made it difficult to accommodate external applications. Since then, development has pursued a new platform-based approach, cost innovation through chipset replacement, and efficient development of UIs and external applications; through layering and componentization, a three-layer structure was achieved, along with an add-on structure for add-ons, single-source products, and open applications. More recently, the software structure has provided a modular architecture of functional units, a Web Open API (Application Programming Interface) for an ecosystem, and a Native Open API for a game engine, and is thus being created as a multi-process structure based on a service structure.

FIG. 7 is a diagram illustrating a Web OS architecture according to an embodiment of the present invention.

Referring to FIG. 7, the architecture of the Web OS platform will be described as follows.

The platform can be largely divided into a kernel, a system library based Web OS core platform, an application, and a service.

The architecture of the Web OS platform is a layered structure, with the OS at the lowest layer, the system library (or libraries) at the next layer, and the applications at the top.

First, the lowest layer includes a Linux kernel as an OS layer, and can include Linux as an OS of the digital device.

Above the OS layer are provided a BSP (Board Support Package)/HAL (Hardware Abstraction Layer) layer, a Web OS core modules layer, a service layer, a Luna-service bus layer, and an Enyo framework/NDK (Native Developer's Kit)/QT layer, with an application layer at the uppermost layer.

Meanwhile, some layers of the above-described web OS layer structure may be omitted, and a plurality of layers may be one layer, or one layer may be a plurality of layer structures.

The Web OS core modules layer may be composed of an LSM (Luna Surface Manager) for managing surface windows and the like, a SAM (System & Application Manager) for managing the execution and execution states of applications, and a WAM (Web Application Manager) for managing web applications and the like based on WebKit.

The LSM manages the application windows displayed on the screen. The LSM manages the display hardware (Display HW), provides a buffer for rendering the content necessary for applications, and composes and outputs the rendering results of a plurality of applications on the screen.

The SAM manages various conditional execution policies of the system and the application.

The WAM, meanwhile, is based on the Enyo framework, and can be regarded as the basic runtime for web applications.

An application uses services via the Luna-service bus; a new service can be registered on the bus, and an application can find and use the service it needs.
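The register-then-lookup pattern above can be sketched with a toy bus. The class name, service name, and call shape below are simplifications for illustration, not the actual Luna-service API.

```python
class ServiceBus:
    """Toy model of the Luna-service bus: services register under a
    URI-like name, and applications look them up and invoke methods."""
    def __init__(self):
        self._services = {}

    def register(self, name, handler):
        # A new service becomes discoverable as soon as it registers.
        self._services[name] = handler

    def call(self, name, method, payload=None):
        handler = self._services.get(name)
        if handler is None:
            raise LookupError("no service registered as " + name)
        return handler(method, payload)
```

An application never holds a direct reference to a service process; it only knows the name under which the service registered, which is what gives the bus its decoupling.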

The service layer may include various service level services such as TV service and Web OS service. Meanwhile, the web OS service may include a media server, a Node.JS, and the like. In particular, the Node.JS service supports, for example, javascript.

Web OS services can communicate over the bus with a Linux process that implements their function logic. They can be divided into four parts: the TV process and services migrated from the existing TV to the Web OS, services differentiated by manufacturer, Web OS common services, and Node.js services that are developed in JavaScript and used through Node.js.

The application layer may include all applications that can be supported in a digital device, such as a TV application, a showcase application, a native application, a web application, and the like.

Applications on the Web OS can be divided into a web application, a PDK (Palm Development Kit) application, a QML (Qt Meta Language or Qt Modeling Language) application, and the like, depending on the implementation method.

A web application is based on the WebKit engine and is executed on the WAM runtime. Such web applications can be based on the Enyo framework, or can be developed and executed based on ordinary HTML5, CSS (Cascading Style Sheets), and JavaScript.

A PDK application includes a native application developed in C/C++ based on the PDK provided for third-party or external developers. The PDK refers to a development library and tool set provided so that a third party, such as a game developer, can develop a native application (C/C++). For example, PDK applications can be used to develop applications where performance is critical.

A QML application is a Qt-based native application, and includes basic applications provided with the Web OS platform, such as the card view, home dashboard, and virtual keyboard. Here, QML is a script-style markup language rather than C++.

Meanwhile, a native application is an application that is developed and compiled in C/C++ and executed in binary form. Such a native application has a high execution speed.

FIG. 8 is a diagram illustrating the architecture of a Web OS device according to an embodiment of the present invention.

FIG. 8 is a block diagram based on the runtime of the Web OS device, and can be understood with reference to the layered structure of FIG. 7.

The following description will be made with reference to FIGS. 7 and 8.

Referring to FIG. 8, services, applications, and Web OS core modules are located above the system OS (Linux) and the system libraries, and communication between them can take place via the Luna-service bus.

HTML5-, CSS-, and JavaScript-based Node.js services such as e-mail, contacts, and calendar; Web OS services such as logging, backup, file notify, database (DB), activity manager, system policy, audio daemon (AudioD), update, and media server; TV services such as EPG (Electronic Program Guide), PVR (Personal Video Recorder), and data broadcasting; CP services such as voice recognition, Now on, Notification, search, and ACR (Auto Content Recognition); services such as CBOX (Contents List Browser), wfdd, DMR, Remote Application, download, and SPDIF (Sony Philips Digital Interface Format); native applications such as PDK applications and QML applications; and Enyo framework-based TV UI-related applications and web applications are all processed through the aforementioned Web OS core modules such as the SAM, WAM, and LSM via the Luna-service bus. Meanwhile, in the above, the TV applications and web applications may not necessarily be Enyo framework-based or UI-related.

The CBOX can manage the list and metadata of the contents of external devices connected to the TV, such as USB, DLNA, and cloud devices. Meanwhile, the CBOX can output the content listings of various content containers, such as USB, DMS, DVR, and cloud, as an integrated view. In addition, the CBOX can display various types of content listings, such as pictures, music, and video, and can manage their metadata. The CBOX can also output the contents of attached storage in real time. For example, when a storage device such as a USB drive is plugged in, the CBOX should be able to immediately output the content list of the storage device. At this time, a standardized method for processing the content listing may be defined. In addition, the CBOX can accommodate various connection protocols.
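The integrated view described above can be sketched as a merge over per-container listings. The function and field names are illustrative assumptions; a real CBOX would also carry per-item metadata and content types.

```python
def integrated_view(containers):
    """Merge per-container content lists (USB, DMS, DVR, cloud, ...) into
    a single listing, tagging each entry with its source container so one
    standardized view can present heterogeneous containers."""
    view = []
    for source, titles in containers.items():
        for title in titles:
            view.append({"source": source, "title": title})
    # A stable, title-first ordering gives the user one unified list.
    return sorted(view, key=lambda e: (e["title"], e["source"]))
```

Plugging in a new container (e.g. a newly attached USB device) only adds one more entry to the input mapping, which mirrors the real-time attach behaviour described above.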

The SAM is intended to address module complexity and improve scalability. For example, the existing system manager handled various functions such as the system UI, window management, the web application runtime, and UX constraint processing in a single process; to reduce this large implementation complexity, the SAM separates the main functions and clarifies the interfaces between them.

The LSM supports independent development and integration of system UX implementations such as the card view and launcher, and supports easy response to changes in product requirements. When synthesizing a plurality of application screens, such as in an app-in-app scenario, the LSM can make the most of hardware resources (HW resources) to enable multi-tasking, and can provide a window management mechanism.

The LSM supports implementation of the system UI based on QML and improves development productivity. Based on MVC, QML UX can easily construct views for layouts and UI components, and code for handling user input can be developed easily. Meanwhile, the interface between QML and the Web OS components is achieved via a QML extension plug-in, and the graphic operations of an application can be based on the Wayland protocol, Luna-service calls, and the like.

As described above, LSM is an abbreviation of Luna Surface Manager, and functions as an application window compositor.

The LSM synthesizes independently generated applications, UI components, and the like on the screen and outputs the result. In this regard, when components such as a recents application, a showcase application, or a launcher application render their own contents, the LSM, as the compositor, defines the output area, the interworking method, and the like. In other words, the compositor LSM handles graphics synthesis, focus management, and input events. At this time, the LSM receives events, focus, and the like from an input manager; such an input manager may include a remote controller, an HID such as a mouse and keyboard, a joystick, a game pad, an application remote, a pen touch, and the like.

In this way, the LSM supports multiple window models, which, owing to the nature of the system UI, can apply simultaneously to all applications. In this regard, the LSM can also support various functions such as the launcher, recents, settings, notification, the system keyboard, the volume UI, search, finger gestures, voice recognition (STT (Sound to Text), TTS (Text to Speech), NLP (Natural Language Processing), etc.), pattern gestures (camera, MRCU (Mobile Radio Control Unit)), the live menu, and ACR (Auto Content Recognition).

FIG. 9 is a diagram illustrating a graphic composition flow in a Web OS device according to an embodiment of the present invention.

Referring to FIG. 9, graphic composition processing can be performed through a web application manager 910 responsible for the UI process, a WebKit 920 responsible for the web process, an LSM 930, and a graphic manager (GM) 940.

When web application-based graphic data (or an application) is generated as the UI process in the web application manager 910, the generated graphic data is transferred to the LSM 930 if it is not a full-screen application. Meanwhile, the web application manager 910 receives an application generated in the WebKit 920, for sharing the GPU (Graphic Processing Unit) memory for graphic management between the UI process and the web process, and transfers it to the LSM 930 if it is not a full-screen application as described above. In the case of a full-screen application, the LSM 930 can be bypassed and the application transmitted directly to the graphic manager 940.

The LSM 930 transmits the received UI application to a Wayland compositor via a Wayland surface, and the Wayland compositor appropriately processes it and transfers it to the graphic manager. The graphic data transferred from the LSM 930 is transferred to the graphic manager compositor via, for example, the LSM GM surface of the graphic manager 940.

Meanwhile, the full-screen application is passed directly to the graphic manager 940 without going through the LSM 930, as described above, and such an application is processed in the graphic manager compositor via the WAM GM surface.
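The two composition paths above can be sketched as a routing decision. This is a simplification for illustration: the function name is an assumption, and the compositors are modeled as plain callables.

```python
def route_to_screen(app, lsm_compose, gm_compose):
    """Route an application's frame through the composition chain: a
    full-screen app bypasses the LSM and goes straight to the graphic
    manager (WAM GM surface path), while any other app is first composed
    by the LSM and then handed over via the LSM GM surface path."""
    if app["fullscreen"]:
        return gm_compose(app["frame"])
    return gm_compose(lsm_compose(app["frame"]))
```

The bypass saves one composition pass for full-screen content, where nothing else shares the screen anyway.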

The graphic manager processes all graphic data in the Web OS device, including the data through the LSM GM surface described above, the data through the WAM GM surface, and data through GM surfaces such as those of a data broadcasting application or a caption application, and appropriately outputs the received graphic data on the screen. Here, the function of the GM compositor is the same as or similar to that of the compositor described above.

FIG. 10 is a diagram for explaining a media server according to an embodiment of the present invention, FIG. 11 is a configuration block diagram of a media server according to an embodiment of the present invention, and FIG. 12 is a diagram illustrating the relationship between a media server and a TV service according to an embodiment of the present invention.

The media server supports the execution of various multimedia in the digital device and manages the necessary resources. The media server can efficiently use the hardware resources required for media playback. For example, the media server needs audio/video hardware resources to execute multimedia, and can efficiently utilize them by managing the current resource usage status. In general, a fixed device having a larger screen than a mobile device requires more hardware resources when executing multimedia, and must encode/decode and transmit a large amount of data at high speed. Meanwhile, in addition to streaming and file-based playback, the media server handles tasks such as broadcasting, recording and tuning, recording simultaneously with viewing, and simultaneously displaying the sender and recipient screens during a video call. However, hardware resources such as encoders, decoders, tuners, and display engines exist in limited numbers per chipset, so it is difficult for the media server to execute a plurality of tasks at the same time.

The media server can be robust in terms of system stability because, for example, a pipeline in which an error occurs during media playback can be removed and restarted on a per-pipeline basis, without affecting other media playback. Such a pipeline is a chain connecting unit functions such as decoding, analysis, and output when media playback is requested, and the required unit functions may vary according to the media type and the like.
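The per-pipeline restart behaviour above can be sketched as follows. The class name and state bookkeeping are illustrative assumptions; the point is only that an error in one pipeline never touches the others.

```python
class MediaServerSketch:
    """Each playback runs in its own pipeline; an error removes and
    restarts only that pipeline, leaving all other pipelines untouched."""
    def __init__(self):
        self.pipelines = {}  # pipeline id -> state
        self.restarts = {}   # pipeline id -> restart count

    def start(self, pid):
        self.pipelines[pid] = "playing"
        self.restarts.setdefault(pid, 0)

    def report_error(self, pid):
        # Remove the failed pipeline, then restart it from scratch.
        del self.pipelines[pid]
        self.restarts[pid] += 1
        self.pipelines[pid] = "playing"
```

Because the failure domain is a single pipeline, a crash in, say, a TV pipeline leaves a concurrent music pipeline playing.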

The media server may have extensibility, for example, adding new types of pipelines without affecting existing implementations. As an example, the media server may accommodate a camera pipeline, a video conference (Skype) pipeline, a third-party pipeline, and the like.

The media server can process general media playback and TV task execution as separate services, because the interface of a TV service differs from that of media playback. The media server supports operations such as 'setchannel', 'channelup', 'channeldown', 'channeltuning', and 'recordstart' in relation to the TV service, and supports operations such as 'play', 'pause', and 'stop' in relation to general media playback; it can thus support different operations for the two and process them as separate services.

The media server can control, or integrally manage, the resource management function. The allocation and recall of hardware resources in the device are performed integrally by the media server; in particular, the TV service process transfers its running tasks and resource allocation status to the media server. The media server secures resources and executes a pipeline each time media is executed, and, based on the resource status occupied by each pipeline, permits execution according to the priority (e.g., policy) of each media execution request, performing resource recall from other pipelines when necessary. Here, the predefined execution priorities and the resource information necessary for a specific request are managed by a policy manager, and the resource manager can communicate with the policy manager to process resource allocation, recall, and the like.
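The priority-based allocation and recall described above can be sketched with a single shared resource. This is a minimal model under stated assumptions: one resource (say, a decoder), strict numeric priorities, and a recall policy folded into the same class rather than a separate policy manager.

```python
class ResourceManagerSketch:
    """One shared hardware resource: a request is granted immediately
    when free; otherwise the policy decision compares priorities and
    recalls the resource from a lower-priority holder."""
    def __init__(self):
        self.holder = None  # (pipeline id, priority) or None

    def acquire(self, pid, priority):
        if self.holder is None:
            self.holder = (pid, priority)
            return True
        held_pid, held_priority = self.holder
        if held_priority < priority:
            # Policy decision: recall from the lower-priority pipeline.
            self.holder = (pid, priority)
            return True
        return False  # request denied; existing holder has priority
```

In the real design the priority comparison lives in the policy manager, and the resource manager consults it over the bus; here the two roles are merged for brevity.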

The media server can have an identifier (ID) for every playback-related operation. For example, the media server can issue a command indicating a particular pipeline based on its identifier. When two or more media are played back, the media server can distinguish them as separate pipelines.

The media server may be responsible for playback of the HTML 5 standard media.

In addition, the media server can handle the TV pipeline as a separate service process, regardless of the scope of TV restructuring. If the TV is not a separate service process, the entire TV may need to be re-executed when a problem occurs in a specific task.

The media server is also referred to as uMS, that is, a micro media server. Here, the media player is a media client, which can mean, for example, a WebKit for an HTML5 video tag, a camera, a TV, a Skype, a second screen, and the like.

In the media server, micro resource management using a resource manager, a policy manager, and the like is a core function. In this regard, the media server also controls playback of web standard media content. The media server can also manage pipeline controller resources.

Such a media server supports, for example, extensibility, reliability, efficient resource usage, and the like.

In other words, the uMS, or media server, comprehensively manages and controls the use of resources for appropriate processing in the Web OS device, such as cloud games, MVPD (pay services), camera previews, second screens, and Skype, as well as resources such as audio, video, and tuner, thereby enabling efficient use. Meanwhile, each resource is used through, for example, a pipeline, and the media server can comprehensively manage and control the generation, deletion, and use of pipelines for resource management.

Here, a pipeline is generated, for example, when media related to a task starts a series of operations such as parsing of a request, decoding of a stream, and video output. For example, with respect to a TV service or application, watching, recording, channel tuning, and the like are each processed individually under control of resource usage through a pipeline generated according to the corresponding request.

The processing structure and the like of the media server will be described in more detail with reference to FIG.

Referring to FIG. 10, an application or service is connected to a media server 1020 via a Luna-service bus 1010, and the media server 1020 is connected, again via the Luna-service bus, to the pipelines it generates, and manages them.

The application or service can have various clients depending on its characteristics and can exchange data with the media server 1020 or the pipeline through these clients.

The clients include, for example, a uMedia client (WebKit) and an RM (resource manager) client (C/C++) for connection with the media server 1020.

The application including the uMedia client is connected to the media server 1020, as described above. More specifically, the uMedia client corresponds to, for example, a video object to be described later, and the client uses the media server 1020 for video operations by issuing requests and the like.

Here, a video operation relates to the video state, and may include state data on loading, unloading, play (playback or reproduce), pause, stop, and the like. Each such video operation or state can be processed through individual pipeline generation. Accordingly, the uMedia client sends the state data associated with the video operation to the pipeline manager 1022 in the media server.

The pipeline manager 1022 obtains information on the resources of the current device through data communication with the resource manager 1024 and requests resource allocation corresponding to the state data of the uMedia client. At this time, the pipeline manager 1022 or the resource manager 1024 controls resource allocation through data communication with the policy manager 1026 when necessary. For example, when the resource manager 1024 lacks resources to allocate in response to a request of the pipeline manager 1022, an appropriate resource allocation may be performed according to a priority comparison by the policy manager 1026.
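The allocation flow just described — the pipeline manager asks the resource manager for resources, and the policy manager's priority comparison decides the outcome when none are free — can be sketched as follows. This is only an illustrative model under assumed semantics; the class names, method names, and the lowest-priority-wins criterion are hypothetical, not the actual webOS implementation.

```python
# Illustrative sketch of pipeline-manager / resource-manager / policy-manager
# interaction. All names and the arbitration criterion are hypothetical.
class PolicyManager:
    def arbitrate(self, requested_priority, holder_priorities):
        # Grant only if the request outranks at least one current holder.
        return any(requested_priority > p for p in holder_priorities)

class ResourceManager:
    def __init__(self, total, policy):
        self.total, self.policy = total, policy
        self.holders = {}  # client -> priority

    def request_allocation(self, client, priority):
        if len(self.holders) < self.total:
            self.holders[client] = priority
            return True
        # Resources exhausted: defer to the policy manager's priority comparison.
        if self.policy.arbitrate(priority, list(self.holders.values())):
            victim = min(self.holders, key=self.holders.get)
            del self.holders[victim]  # reclaim from the lowest-priority holder
            self.holders[client] = priority
            return True
        return False

class PipelineManager:
    def __init__(self, resource_manager):
        self.rm = resource_manager
        self.pipelines = []

    def create_pipeline(self, client, state_data, priority):
        # Resource allocation precedes pipeline generation.
        if not self.rm.request_allocation(client, priority):
            return None
        self.pipelines.append({"client": client, "op": state_data["op"]})
        return self.pipelines[-1]

pm = PipelineManager(ResourceManager(total=1, policy=PolicyManager()))
p1 = pm.create_pipeline("clientA", {"op": "play"}, priority=1)
p2 = pm.create_pipeline("clientB", {"op": "play"}, priority=5)  # preempts clientA
p3 = pm.create_pipeline("clientC", {"op": "play"}, priority=2)  # denied
```

The sketch models only the decision logic; in the architecture described above, the actual pipeline generation is further delegated to the media pipeline controller 1028.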

On the other hand, the pipeline manager 1022 requests the media pipeline controller 1028 to generate a pipeline for the operation requested by the uMedia client, with respect to the resources allocated by the resource manager 1024.

The media pipeline controller 1028 generates the necessary pipelines under the control of the pipeline manager 1022. The generated pipelines can relate to play, pause, suspend, and the like, in addition to the media pipeline and camera pipeline shown. The pipelines may include pipelines for HTML5, Web CP, smartshare playback, thumbnail extraction, NDK, Cinema, MHEG (Multimedia and Hypermedia Information Coding Experts Group), and the like.

In addition, the pipeline may include, for example, a service-based pipeline (its own pipeline) and a URI-based pipeline (media pipeline).

Referring to FIG. 10, an application or service including an RM client may not be directly connected to the media server 1020. This is because the application or service may process the media directly, in which case it need not pass through the media server. However, in that case, a uMS connector is needed to manage resources for pipeline creation and use. When it receives a resource management request for direct media processing by the application or service, the uMS connector communicates with the media server 1020 including the resource manager 1024. To this end, the media server 1020 also needs to have a uMS connector.

Accordingly, the application or service can respond to the request of the RM client by receiving the resource management of the resource manager 1024 through the uMS connector. Such RM clients can handle services such as native CP, TV service, second screen, Flash player, YouTube Media Source Extensions (MSE), cloud gaming, and Skype. In this case, as described above, the resource manager 1024 can appropriately manage resources through data communication with the policy manager 1026 when necessary.

On the other hand, the URI-based pipeline is performed through the media server 1020, instead of processing the media directly as with the RM client described above. Such URI-based pipelines may include a player factory, GStreamer, a streaming plug-in, a DRM (Digital Rights Management) plug-in pipeline, and the like.

On the other hand, the interface method between application and media services may be as follows.

First, there is a method of interfacing with a service in a web application. This uses a Luna call via the Palm Service Bridge (PSB), or uses Cordova, which extends the display with video tags. In addition, there may be a method using the HTML5 standard video tag or media element.

Second, there is a method of interfacing using a service in the PDK.

Alternatively, there is a method of using the service in the existing CP. This can extend existing platform plug-ins based on Luna for backward compatibility.

Finally, there is a method of interfacing in the non-web OS case. In this case, the Luna bus can be called directly.

Seamless change is handled by a separate module (e.g., TVWIN), which is a process for displaying the TV on the screen first, without web OS, before or during web OS boot. Because the boot time of web OS is long, this module is used to provide the basic functions of the TV service first, for a quick response to the user's power-on request. The module is part of the TV service process and supports quick boot and seamless change, which provide basic TV functions, as well as factory mode. The module may also switch from the non-web OS mode to the web OS mode.

Referring to FIG. 11, a processing structure of a media server is shown.

In FIG. 11, the solid-line boxes represent process configurations, and the dashed boxes represent internal processing modules within a process. The solid arrows represent inter-process calls, that is, Luna service calls, and the dashed arrows may represent notifications such as register/notify, or data flows.

A service or a web application or a PDK application (hereinafter 'application') is connected to various service processing components via a luna-service bus, through which an application is operated or controlled.

The data processing path depends on the type of application. For example, when the application relates to image data from the camera sensor, the image data is transmitted to the camera processing unit 1130 and processed. At this time, the camera processing unit 1130, which includes gesture and face detection modules and the like, processes the received image data of the application. The camera processing unit 1130 can generate a pipeline through the media server processing unit 1110 and process the corresponding data when the user desires to use a pipeline or the like.

Alternatively, when the application includes audio data, the audio processing unit (AudioD) 1140 and the audio module (PulseAudio) 1150 process the audio. For example, the audio processing unit 1140 processes the audio data received from the application and transmits it to the audio module 1150. At this time, the audio processing unit 1140 may include an audio policy manager to determine how the audio data is processed. The processed audio data is then handled by the audio module 1150. Meanwhile, the application can notify the audio processing unit 1140 of data related to the audio data processing, which in turn can notify the audio module 1150 through the associated pipeline. The audio module 1150 includes the Advanced Linux Sound Architecture (ALSA).

Alternatively, when the application includes or processes DRM-applied content, the content data is transmitted to the DRM service processing unit 1160, and the DRM service processing unit 1160 generates a DRM instance and processes the DRM-applied content data. The DRM service processing unit 1160 may connect, through the Luna-service bus, to the DRM pipeline in the media pipeline to process the DRM-applied content data.

Hereinafter, processing in the case where the application is media data or TV service data (e.g., broadcast data) will be described.

FIG. 12 shows only the media server processing unit and the TV service processing unit of FIG. 11, in more detail.

Therefore, the following description will be made with reference to FIGS. 11 and 12.

First, when the application includes TV service data, it is processed in the TV service processing unit 1120/1220.

The TV service processing unit 1120 of FIG. 11 includes at least one of a DVR/channel manager, a broadcasting module, a TV pipeline manager, a TV resource manager, a data broadcasting module, an audio setting module, and a path manager. In FIG. 12, the TV service processing unit 1220 may include a TV broadcast handler, a TV broadcast interface, a service processing unit, TV middleware (TV MW), a path manager, and a BSP (for example, NetCast). Here, the service processing unit may be a module including, for example, a TV pipeline manager, a TV resource manager, a TV policy manager, a uMS connector, and the like.

In this specification, the TV service processing unit may have the configuration shown in FIG. 11 or FIG. 12, or a combination of both, in which some configurations may be omitted or some configurations not shown may be added.

Based on the attribute or type of the TV service data received from the application, the TV service processing unit 1120/1220 transmits the data to the DVR/channel manager in the case of DVR (Digital Video Recorder) or channel-related data, which in turn generates and processes a TV pipeline through the TV pipeline manager. On the other hand, when the attribute or type of the TV service data is broadcast content data, the TV service processing unit 1120 generates and processes a TV pipeline through the TV pipeline manager for processing the corresponding data via the broadcasting module.

Alternatively, a JSON (JavaScript Object Notation) file or a file written in C is processed by the TV broadcast handler and transmitted to the TV pipeline manager through the TV broadcast interface to generate and process a TV pipeline. In this case, the TV broadcast interface may transmit the data or file that has passed through the TV broadcast handler to the TV pipeline manager based on a TV service policy, to be referred to when creating the pipeline.

On the other hand, the TV pipeline manager can be controlled by the TV resource manager when generating one or more pipelines in response to a TV pipeline creation request from a processing module or manager in the TV service. Meanwhile, the TV resource manager can be controlled by the TV policy manager when requesting the status and allocation of the resources allocated for the TV service according to the TV pipeline creation request of the TV pipeline manager, and communicates with the media server processing unit 1110/1210 through the uMS connector. The resource manager in the media server processing unit 1110/1210 transmits the status of the resources for the current TV service, allocation permissions, and the like, according to the request of the TV resource manager. For example, if the resource manager in the media server processing unit 1110/1210 confirms that all the resources for the TV service have already been allocated, it can notify the TV resource manager that all the resources are currently allocated. At this time, together with the notify, the resource manager may remove a certain TV pipeline according to a priority or a predetermined criterion among the TV pipelines previously allocated for the TV service, and may request or permit generation of a TV pipeline for the requested TV service. Alternatively, the TV resource manager can itself appropriately remove, add, or control TV pipelines according to the status report of the resource manager in the media server processing unit 1110/1210.
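The notify-and-evict behavior described above — when all resources are already allocated, a previously allocated TV pipeline is removed by priority or by a predetermined criterion to make room for the new request — can be sketched as follows. The function name, the data layout, and the lowest-priority eviction rule are hypothetical illustrations, not the actual TV service code.

```python
# Hedged sketch of the eviction step described above; all names are illustrative.
def handle_tv_pipeline_request(active_pipelines, capacity, new_pipeline):
    """Create a TV pipeline, evicting the lowest-priority one when the
    resource manager notifies that all resources are already allocated."""
    if len(active_pipelines) >= capacity:
        # notify: all resources allocated -> remove by priority/predetermined criterion
        lowest = min(active_pipelines, key=lambda p: p["priority"])
        if lowest["priority"] >= new_pipeline["priority"]:
            return False  # no pipeline may be removed in favor of this request
        active_pipelines.remove(lowest)
    active_pipelines.append(new_pipeline)
    return True

pipelines = [{"name": "record", "priority": 2}, {"name": "watch", "priority": 5}]
ok = handle_tv_pipeline_request(
    pipelines, capacity=2, new_pipeline={"name": "channel-tune", "priority": 4})
# "record" (priority 2) is evicted to make room for "channel-tune"
```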

Meanwhile, the BSP supports, for example, backward compatibility with existing digital devices.

The TV pipelines thus generated can be appropriately operated under the control of the path manager during processing. The path manager can determine and control the processing path or processing of the pipelines by considering the operation of the pipelines generated by the media server processing unit 1110/1210 as well as the TV pipelines.

Next, when the application includes media data rather than TV service data, it is processed by the media server processing unit 1110/1210. Here, the media server processing units 1110 and 1210 include a resource manager, a policy manager, a media pipeline manager, a media pipeline controller, and the like. The pipelines generated under the control of the media pipeline manager and the media pipeline controller can be of various kinds, such as a camera preview pipeline, a cloud game pipeline, and a media pipeline. The media pipeline may include a streaming protocol, an auto/static GStreamer, a DRM, and the like, which can be determined under the control of the path manager. For the specific processing in the media server processing units 1110 and/or 1210, the description of FIG. 10 above applies and is not repeated here.

In this specification, the resource manager in the media server processing unit 1110/1210 can perform resource management on a counter basis, for example.

Hereinafter, a data processing method in a variable digital device according to the present invention will be described in detail with reference to the accompanying drawings. As used herein, a variable digital device refers to a digital device operated in a planar mode with a flat screen and in a variable mode in which the screen varies between a flat screen and a curved screen. In the present specification, the variable mode may mean a curved screen state. In other words, the variable digital device refers to a digital device that supports both a flat screen and a curved screen.

Various embodiments of the variable mode, and of the UI/UX and operations for it, are described in this specification so that a user can use the variable digital device more easily and conveniently. The variable mode operation can be performed manually at the user's request, or automatically by the digital device according to the settings.

FIGS. 13 and 16 are diagrams showing a digital device with a flat screen and a curved screen, respectively.

As shown in FIGS. 13 and 16, the digital device 1300 provides an application execution screen on the screen.

The digital device 1300 may be varied from the flat screen state shown in FIG. 13 to a curved screen of a predetermined curvature as shown in FIG. 16, or vice versa, according to a variable mode request. The content being output on the screen during the variation is generally not changed, but may be changed in some cases. On the other hand, attributes such as the resolution, volume, and size of the application execution screen provided on the screen may be changed according to the screen state change, that is, the variation. This will be described in detail below and is omitted here.

On the other hand, when a predetermined area of the screen of FIG. 13 is changed to a curved screen according to a variable mode request, or when a variable mode or variable function is executed, an identifier 1320 identifying the curved screen may be output. The same applies to the variable mode operation on the curved screen of FIG. 16. Although the variable identifier 1320 has been described in the form of an icon, it may be text data, and it may help distinguish the variable mode operation in various forms. In addition, the variable identifier 1320 is not only for identifying a variable mode operation, but may also serve as an interface for activating the variable mode operation or for entering the variable mode menu.

Referring to FIG. 16, parts of the left and right edges of the flat screen are curved (A) at a predetermined curvature. When the curved screen is used in this way, the curved left and right edge portions face the user's viewing direction, thereby enhancing the overall immersion of digital device use compared with the flat screen.

FIG. 14 illustrates a variable mode menu according to an embodiment of the present invention, and FIG. 15 illustrates input means for the variable mode menu access and variable mode control according to an embodiment of the present invention.

For example, the variable mode menu shown in FIG. 14 may be called through the remote control 1500 of FIG. 15. When the variable key button (Flex) 1510 of the remote control 1500 of FIG. 15 is pressed, a menu screen as shown in FIG. 14 can be called and provided on the screen. Here, the remote controller is merely one embodiment of the input means according to the present invention, for convenience of explanation. Therefore, the input means may include, in addition to the remote controller shown, various digital devices capable of exchanging data with the variable digital device, such as a PC, a smart phone, and a tablet PC. The data may be various data including data corresponding to the key button signal of the remote controller. The menu of FIG. 14 may also be invoked via a variable key button provided on the front panel of the variable digital device, or by, for example, a voice, a gesture, a pointer, or the like, or a combination thereof, in addition to the variable key button of the remote controller 1500 of FIG. 15.

Referring to FIG. 14, when a control signal according to access to the variable mode key button 1510 of the remote control 1500 of FIG. 15 is received while the application execution screen 1410 is being provided, the digital device displays the menu screen 1420. Here, a flat screen is shown for convenience, but the same applies on a curved screen.

In FIG. 14, the menu called according to the input means of FIG. 15 constitutes a general menu, that is, the entire menu screen of the digital device. Here, when a selection signal for the flex mode menu item 1430 is received, the digital device displays submenu items: an Auto mode item 1432 and a Manual mode item 1434.

As described above, in FIG. 14, the entire menu 1420 is provided on the screen, and the setting and function call for the variable mode are performed through several operations. Instead of the entire menu, when the variable mode key button access signal of FIG. 15 is received, the digital device may directly provide only the menu items for setting the variable mode or executing its function.

FIG. 17 is a diagram illustrating a variable mode menu configuration according to an embodiment of the present invention.

The variable mode menu 1720 is provided overlaid on the application execution screen 1710 being provided on the screen, in accordance with the access signal of the variable key button 1510 of the remote control of FIG. 15.

In FIG. 17, the variable mode menu 1720 provides five menu items. Each menu item is, for example, an icon corresponding to a degree of curvature change. For example, the leftmost menu item is an icon indicating a flat screen, that is, a curvature of zero, and the rest are icons indicating curved screens, with the curvature becoming larger toward the right. Therefore, the rightmost menu item means a curved screen to which the maximum curvature is applied. In FIG. 17, a total of five icons are shown, from the flat screen icon to the maximum-curvature curved screen icon, but the present invention is not limited thereto.

In FIG. 17, when a signal selecting the second icon 1722 is received, the digital device changes to the curved screen of the curvature that the second icon 1722 indicates.

FIG. 18 is a view for explaining reference data for the configuration of a variable mode menu according to an embodiment of the present invention.

Each menu item constituting the variable mode menu 1720 shown in FIG. 17, that is, an icon, may mean a level. Here, the level refers to a recommended curvature according to the number of viewers and / or viewing distance of the digital device, for example.

FIG. 18A shows levels according to the number of viewers, and FIG. 18B shows levels according to viewing distance. However, the present invention is not limited thereto. Either of FIGS. 18A and 18B, or both in combination, can be referred to in configuring the variable mode menu.

Referring to FIG. 18A, a total of five levels from level 0 to level 4 are shown, and each level refers to an optimal curvature recommended in the variable mode. Level 0 represents, for example, a standard mode, that is, the planar mode. Level 1 indicates a variable mode of the curvature recommended when the number of viewers is 3 to 6, level 2 when the number of viewers is 2 to 5, level 3 when the number of viewers is up to about 3, and level 4 when the number of viewers is 1 or 2. In summary, the lower the level, the closer to the flat screen, and the higher the level, the greater the curvature. This is because, with a large curvature, the area of the screen that can be comfortably watched becomes smaller, which may interfere with viewing by a larger number of people.

Referring to FIG. 18B, the number of viewers can be calculated assuming that the viewing distance is, for example, 3.2 m. For example, level 0 provides 9140R, that is, a flat screen; this can be regarded as the optimal level recommended when five people watch at a distance of about 3.2 m. At level 1, the screen is varied to a curvature of 7655R, the optimal level recommended when four people watch at that distance. At level 2, the screen is varied to a curvature of 6170R, recommended when three people watch at a distance of about 3.2 m. At level 3, the screen is varied to a curvature of 4685R, recommended when two people watch at a distance of about 3.2 m. At level 4, the screen is varied to a curvature of 3200R to 4000R, recommended when one person watches at a distance of about 3.2 m.

In the above, the viewing distance may be adjusted, for example, by adding or subtracting 1 m from the 3.2 m reference. In that case, level 1 may correspond to about 6 persons at 2.2 m and about 3 persons at 4.2 m. Likewise for the remaining levels, the appropriate number of viewers can be adjusted according to the adjusted viewing distance.
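The level-to-curvature mapping of FIG. 18B can be represented as a simple lookup table. The curvature and viewer-count values below are taken from the description above; the table structure and the `recommended_level` selection rule are illustrative assumptions, not part of the original specification.

```python
# Recommended curvature per level at a reference viewing distance of ~3.2 m
# (values from the description of FIG. 18B; level 4 is given as a 3200R-4000R
# range and is represented here by its lower bound).
LEVELS = {
    0: {"curvature_r": 9140, "viewers": 5},   # effectively a flat screen
    1: {"curvature_r": 7655, "viewers": 4},
    2: {"curvature_r": 6170, "viewers": 3},
    3: {"curvature_r": 4685, "viewers": 2},
    4: {"curvature_r": 3200, "viewers": 1},
}

def recommended_level(viewers):
    """Pick the highest (most curved) level whose recommended viewer count
    still covers the given number of viewers."""
    for level in sorted(LEVELS, reverse=True):
        if viewers <= LEVELS[level]["viewers"]:
            return level
    return 0  # many viewers: stay close to the flat screen

assert recommended_level(1) == 4   # one viewer: maximum curvature
assert recommended_level(3) == 2
assert recommended_level(5) == 0   # five viewers: flat screen
```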

FIG. 19 is a view for explaining a variable mode configuration similar to FIG. 18.

FIG. 19A assumes a screen size of 1,218 mm (55 inches) when the screen of the digital device is flat, and an average viewing distance of 2.5 to 3.5 m.

FIG. 19B shows that, referring to FIG. 19A, the optimal curvature of the curved screen in the variable mode can be determined with reference to the number of viewers and the viewing distance for a given screen size. For example, if the screen size is 42 inches, the number of viewers is about 2.4, and the viewing distance is about 2.2 m, a curvature of about 4700R can be provided as the optimum. Here, the optimal curvature may be calculated as viewing distance * (screen width + interval between viewers) / screen width. For example, 2560 mm * (1218 mm + (2.4 persons − 1) intervals) / 1218 mm gives about 4848R, so a curvature of about 5000R can be taken as the optimum. In the above, the average viewing distance is assumed based on THX (a sound-related standard), SMPTE (a broadcasting standards body), the average of the recommended distances of the Japan Ergonomics Society, and the like. The interval between viewers is assumed to be about 800 mm, based on typical audience spacing in a movie theater, and the average number of viewers is assumed to be 2.4.
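The optimal-curvature formula quoted above can be written directly in code. This is only a sketch of the stated relationship: the 800 mm viewer interval and the 2.4 average viewers are the assumptions given in the text, the `(viewers − 1) * interval` spacing term is our reading of the worked example, and the function name is illustrative.

```python
def optimal_curvature_r(viewing_distance_mm, screen_width_mm,
                        viewers=2.4, viewer_interval_mm=800):
    """viewing distance * (screen width + spacing between viewers) / screen width.

    The spacing term assumes the (viewers - 1) * interval reading of the
    worked example in the text; defaults follow its stated assumptions.
    """
    spread = (viewers - 1) * viewer_interval_mm
    return viewing_distance_mm * (screen_width_mm + spread) / screen_width_mm

# 55-inch screen (1218 mm wide) viewed from 2560 mm by ~2.4 viewers:
r = optimal_curvature_r(2560, 1218)
# r comes out to roughly 4900R, in the same ballpark as the ~4848R-5000R
# figures quoted above; a single viewer gives a radius equal to the distance.
```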

FIGS. 20 and 28 are views showing screen configurations in a variable mode operation according to an embodiment of the present invention.

FIG. 20 may be a screen configuration upon changing the curvature according to the selected level.

Referring to FIG. 20, a UI for changing a mode from a flat screen to a curved screen with a specific curvature is configured. The UI is configured in various ways, for example, to guide the user to recognize that the screen is variable.

The first UIs 2012 and 2014 are provided in a curved shape so that the left and right side portions that actually vary according to the variable request can be identified on the screen. The sizes and widths of the UIs 2012 and 2014 may differ depending on the applied degree of curvature.

The second UIs 2022 and 2024 are configured in the form of arrows to indicate that the screen is variable. At this time, the sizes, the widths, the colors, and the like of the arrows 2022 and 2024 may be provided differently depending on the applied curvature, as described above.

The third UI 2030 provides guide data identifying the current variation in a predetermined area of the screen. The guide data is generally provided in the form of text data, such as 'varying' or 'varying to level x', but may also be provided so that the variation status can be identified through a color change or a size change.

On the other hand, in FIG. 28A, the variable mode operation can be displayed with arrow shapes 2812 and 2814, similarly to FIG. 20. At this time, the sizes and numbers of the arrows 2812 and 2814 may differ depending on the degree of curvature. Alternatively, FIG. 28B may provide an identifier 2824, such as 'smart curve', at the bottom center of the screen to indicate that the variable mode is in operation. The identifier 2824 may disappear upon completion of the variable mode operation.

In the above description, each UI has been described individually, but two or more of them may be appropriately combined. Also, although not shown or described, the guide data identifying the variation or its degree may be configured in various forms and combined with the above-described contents as appropriate.

FIG. 21 is a diagram illustrating a screen configuration upon completing the variable mode according to an embodiment of the present invention.

FIG. 21 shows a UI for indicating the varied curvature when the screen has finally changed to the requested curvature.

In FIG. 21A, a guide UI reading 'Smart Curved' is provided at the lower center 2110 of the screen, and in FIG. 21B a guide UI is provided at the upper right 2120 of the screen.

Here, the text data 'Smart Curved' may be replaced with various text data indicating the end of the variation. In addition, the variable UI may be provided in various forms different from those shown.

Although FIGS. 21A and 21B have been described in terms of the UI provided at the completion or end of the variable mode, such a UI may alternatively be provided before completion, to indicate that the variable mode operation is in progress.

In FIGS. 21A and 21B, the variable UI is provided at the bottom center and the upper right of the screen, respectively, but it is not fixed to those areas or positions and may be provided at another area or position, or may continue to move.

In addition, when the variable UI shown in FIGS. 21A and 21B indicates completion of the variable mode, it is automatically dismissed from the screen when an arbitrary confirmation key signal is received or a predetermined time elapses.

FIG. 22 is a diagram illustrating a screen configuration of a variable digital device 2200 according to an embodiment of the present invention.

Referring to FIG. 22, the screen 2210 can be largely composed of three parts. The first part 2214 and the third part 2216 indicate the portions to be curved, that is, the portions to which curvature is applied when changing from the flat screen to the curved screen, while the second part 2212 has no change in curvature.

In the above, the widths of the first part 2214, the second part 2212, and the third part 2216 are B2, B1, and B3, respectively, which can be determined by the manufacturer. For example, if the digital device 2200 has a 55-inch or larger screen, the manufacturer may determine B1 and B3 so that the width B1 of the second part 2212 has an aspect ratio of at least 16:9. The widths may be determined by considering not only the screen aspect but also various factors such as the immersion effect and user satisfaction when implementing the curved screen according to the curvature change. This can also affect the curvature determination.

If the digital device 2200 has a screen to which local dimming is applied, the degree of curvature and the width of each part may be determined in units of local dimming blocks.

On the other hand, as shown in FIG. 22, the screen is divided into parts because the screen configuration can be made different depending on the mode. A more detailed description thereof will be given later and is omitted here.

FIGS. 23 and 24 are diagrams for explaining a variable mode menu configuration according to an embodiment of the present invention.

FIGS. 23 and 24 can replace the menu 1720 of FIG. 17, for example. FIG. 24 may also show the provision of additional menus by accessing menu items of the primary menu after the primary menu is provided, as in FIG. 23.

Any one of FIGS. 23A to 23J, or any combination thereof, may be provided in place of the menu 1720 of FIG. 17, as described above.

FIG. 23A provides menu items ranging from the plane level to the maximum curved level. Between the plane level and the maximum curved level, menu items indicating one or more intermediate curved levels may be provided.

FIGS. 23B and 23C are similar to FIG. 23A, for example, but display the curved levels between the plane level and the maximum curved level in numerical form, providing selection convenience for the user.

In FIGS. 23A to 23C, three curved levels are provided between the plane level and the maximum curved level for convenience, similarly to FIG. 17, but the present invention is not limited thereto. For example, as shown in FIG. 23G, only one half curved level menu item may exist, or, as shown in FIG. 23H, only the plane and maximum curved levels may exist without any other curved level. On the other hand, as shown in FIG. 23E, an optimal curved level menu item may be provided between the plane level and the maximum curved level instead of the half curved level. Alternatively, in addition to the predetermined levels of FIGS. 23A to 23C, 23E, 23G, and 23H, a bar shape as shown in FIG. 23J may be configured so that any level between the plane and the maximum curved level can be selected, for variable mode convenience.

FIG. 23D provides genre level menu items, for example between the plane level and the maximum curved level, or without the plane and maximum curved levels, so that when a specific genre level menu item is selected, a predetermined curved level is applied. For example, in the case of a personalized variable mode, user-level menu items are provided, and when a preset user-level menu item is selected, a predetermined curved level may be applied for that user. In this case, the user can be recognized by referring to the user's log-in data, or to sensing information such as gesture, voice, fingerprint, and camera data.

In addition, the variable mode need not be applied such that the first part 2214 and the third part 2216 of FIG. 22 are always curved together, or always with the same curvature level. Accordingly, as shown in FIG. 23F, a menu item (Left curved) for curving only the left area corresponding to the first part and a menu item (Right curved) for curving only the right area may be provided. Although not shown, a menu item that can control both areas simultaneously, as described above, may also be provided.

Fig. 24 will be described with reference to Fig. 22 for convenience.

FIG. 24A shows a menu screen of the variable mode UI from which the left curved area, the center area, and the right curved area can be selected.

Basically, FIG. 24 shows menu items configured for individually controlling the left curved area, the center area, and the right curved area of FIG. 22. Here, individual control may mean that a different application is executed and provided in each area.

When a selection signal for the right curved menu item 2410 is received, the digital device arranges one or more sub-menu items around the selected menu item (e.g., adjacent to its right side), as shown in FIG. 24B. Here, the sub-menu items may be, for example, a list of applications to be provided in the right curved area. For example, FIG. 24B provides, as sub-menu items, the TV application 2412 for the preferred channel xx, the web browser application 2414 for Internet use, and App1 2416.

FIG. 24C is identical in content to FIG. 24B described above, but differs in the manner of providing the menu items and sub-menu items. For example, in FIG. 24B, the menu item 2410 and sub-menu items 2412, 2414, and 2416 are highlighted after selection, but are provided with the same size and the like. On the other hand, referring to FIG. 24C, the selected menu item 2420 is enlarged compared to the other menu items, and the sub-menu items 2422, 2424, and 2426 of the menu item 2420 may be provided with different sizes and colors.

On the other hand, referring to FIG. 24D, the width of the left curved menu item 2430 may be changed upon selection, and the sub-menu items 2432, 2434, and 2436 may be provided in the form of icons on its left.

In FIGS. 23 and 24, the sub-menu items are not limited to selecting an application to be provided in the corresponding area; although not shown, various contents available in the digital device, such as related applications or functions, may also be provided.

FIGS. 25 to 27 are diagrams for explaining an application processing method of the variable digital device.

As described with reference to FIG. 22, the screen of the digital device, that is, the flat or curved screen, can be divided into a first part 2214, a second part 2212, and a third part 2216, where the first part 2214 and the third part 2216 may be parts to which curvature is applied and varied.

Here, through the menus of FIGS. 23 and 24, the applications output to the first to third parts can be selected. Embodiments thereof are shown in FIGS. 25 to 27.

Referring to FIG. 25, an EPG is provided to the first part 2514 of the digital device 2500, an existing application execution screen is provided to the second part 2512, and a web browser is provided to the third part.

Referring to FIG. 26, the first part 2614 of the digital device 2600 is provided with a news application or TV application execution screen, the second part 2612 is provided with a game application execution screen, and finally the third part 2616 is provided with a TV application execution screen associated with sports. Here, the first part 2614 and the third part 2616 may be the same TV application execution screen, but may provide content for different channels, for example.

When a menu request signal is received during execution of an application on the flat or curved screen, the digital device provides a menu launcher (or web launcher), overlaid on a lower portion of the screen, which includes a recent part containing history data and an application part.

However, in the case where the executed application is provided through the entire screen, a menu launcher in which the recent part and the application part are provided in parallel at the bottom of the screen may be provided, as described above. This can be inconvenient when each part of the screen runs a different application. Therefore, in this case, as shown in FIG. 27, the recent part and the application part of the menu launcher can be divided and provided to different parts of the screen. In FIG. 27, for example, the recent part is provided to the first part 2714, and the application part is provided to the third part 2716.

On the other hand, although FIG. 27 shows the recent part and the application part occupying the entire corresponding part of the screen, they may instead be provided in only a portion of each part.

In the above-described FIGS. 25 to 27, the first to third parts are all provided with different applications or menu screens. Alternatively, a different application or menu screen may be provided only to specific parts, while the remaining parts provide the same screen.

FIG. 29 is a block diagram illustrating a configuration of a variable digital device according to another embodiment of the present invention.

Here, FIG. 29 specifically shows only the configuration blocks of the digital device related to the variable mode operation, but one or more configurations other than those illustrated may be added or omitted. The added configurations may be, for example, those shown in FIGS. 2 to 5.

The digital device includes a controller 2930, a curve driving unit 2940, a light source driving unit 2950, a storage unit 2960, and the like.

The controller 2930 controls all of the operations described herein with respect to the variable mode.

The curve driving unit 2940 performs curve control on the parts 2924 and 2926 to be curved, in accordance with a control command of the controller 2930, with predetermined curvatures R1 and R2, respectively. Here, the curvatures R1 and R2 may or may not be the same. It is also apparent that the curve driving unit 2940 can perform restoration control from the curved state to the flat state under the control of the controller 2930.

The light source driver 2950 provides a light source to the screen 2920. At this time, under the control of the controller 2930, the light source driver 2950 can control the intensity, amount of light, and the like of the light source for the first part 2924 and the third part 2926 differently from the second part 2922. For example, the light source control for the first part 2924 and the third part 2926 may differ between the flat screen state and the curved screen state.

The storage unit 2960 stores various data and control contents required for the digital device. The data thus stored includes UI data or related data related to the various variable mode operations provided in Figs. 13 to 28 described above.

The controller 2930 can change and control at least one of picture quality and sound quality according to the variable mode. For example, the controller 2930 can apply different picture quality/sound quality settings when the screen is flat and when it is curved. In addition, when the curvature is changed according to the levels described above, the controller 2930 may apply optimized picture quality/sound quality calculated in advance for each curvature.
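A minimal sketch of such per-curvature quality presets, assuming three discrete levels (0 = flat, 2 = maximum curved); the preset values and names are invented for illustration:

```python
# Assumed precomputed picture/sound presets, keyed by curvature level.
QUALITY_BY_LEVEL = {
    0: {"contrast": 50, "surround": False},  # flat screen
    1: {"contrast": 55, "surround": True},   # half curved level
    2: {"contrast": 60, "surround": True},   # maximum curved level
}

def quality_for_level(level: int) -> dict:
    """Return the quality preset for a curvature level, falling back to flat."""
    return QUALITY_BY_LEVEL.get(level, QUALITY_BY_LEVEL[0])
```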

The controller 2930 can control the image quality / sound quality described above by referring to various data such as genre, application attribute, user setting, ambient illuminance, ambient noise, temperature and the like in consideration of the immersion feeling of the user according to the variable mode.

The controller 2930 can also automatically switch between the flat screen and the curved screen at the time of executing an application, by referring to various data such as genre, application attribute, user setting, ambient illuminance, ambient noise, and temperature in consideration of the user's sense of immersion. In this case, when switching to the curved screen, the controller 2930 can perform the variable mode control based on a preset curvature or a calculated optimized curvature.

The controller 2930 can control the variable mode based on a signal from an input means such as a remote controller. For example, the controller 2930 can provide the variable mode menu UI upon receiving a variable key button signal from the input means. In a state where the variable mode menu UI is provided, when the variable key button signal of the remote controller is received again, the controller 2930 can provide information on the current curvature; when the signal is received yet again, the controller 2930 can remove the variable mode menu UI from the screen. Alternatively, when the variable key button signal is first received from the input means, the controller 2930 may not provide the variable mode menu UI described above, but may immediately perform the variable mode by applying the preset curvature or the optimized curvature. The controller 2930 can also control a curvature change in the area where a pointer is located when the variable key button signal is received after the pointer is provided. However, when the pointer is located in an area to which curvature is not applied, the variable mode menu UI or variable mode related reference data including current curvature data may be provided instead. If the pointer is located in only one of the areas whose curvature can be changed, the controller 2930 may apply curvature only to that area or to all such areas. In addition, the controller 2930 may adjust the current curvature according to movement on a touch pad or wheel of the input means while the variable mode is active. In this case, by providing the user with the curvature change control data according to the touch pad or wheel movement in numeric or graphic form, the degree of curvature change control can be recognized intuitively.
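The repeated-press behavior of the variable key button described above (first press shows the menu UI, second shows current curvature information, third removes the UI) can be sketched as a tiny state machine; the class and action names are assumptions, not the actual firmware interface.

```python
# Hedged sketch of the cyclic variable-key-button behavior. The returned
# strings stand in for whatever UI actions the controller would dispatch.

class VariableKeyHandler:
    def __init__(self):
        self.presses = 0  # position within the 3-step press cycle

    def on_variable_key(self) -> str:
        self.presses = (self.presses % 3) + 1
        if self.presses == 1:
            return "show_variable_menu_ui"        # first press
        if self.presses == 2:
            return "show_current_curvature_info"  # second press
        return "remove_variable_menu_ui"          # third press; cycle restarts
```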

In addition, the controller 2930 can control whether or not the variable mode control contents in effect immediately before the power of the digital device is turned off are retained at the next power-on. For example, when the screen was changed from flat to curved before the power was turned off, the controller 2930 can keep the curved state after power-on, or, conversely, restore the flat screen. This may vary depending on the setting, but restoring the flat screen may be useful, for example, for a plurality of users, or to prepare for cases such as newspaper reading, web browsing, still images, or screen division. Therefore, the controller 2930 may control the change to the flat screen in advance at power-on for the above-described cases, based on sensing data collected in the standby state before the power is turned on. Further, the controller 2930 may provide guide data on the screen at power-off time or on the initial page at power-on time, so that the user can make a selection through the guide data.
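The power-on decision above could be sketched as follows, assuming the sensing data is reduced to a set of context flags; the flag names and the keep/revert rule are assumptions for illustration.

```python
def screen_state_at_power_on(prev_state: str, keep_setting: bool,
                             context_flags: set) -> str:
    """
    Decide the screen state at power-on.
    prev_state: "flat" or "curved" (state just before power-off).
    keep_setting: True keeps the pre-power-off state; False reverts to flat.
    context_flags: assumed sensing hints that force a flat screen regardless
    of the setting (multiple viewers, web browsing, still image, split screen).
    """
    flat_cases = {"multiple_viewers", "web_browsing", "still_image", "split_screen"}
    if context_flags & flat_cases:
        return "flat"  # prepare a flat screen for these cases in advance
    return prev_state if keep_setting else "flat"
```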

FIG. 30 is a flowchart for explaining a data processing method in a variable digital device according to an embodiment of the present invention, and FIG. 31 is a flowchart for explaining a data processing method in a variable digital device according to another embodiment of the present invention.

Referring to FIG. 30, the digital device displays a first application execution screen on a screen (S3002).

When the first signal requesting menu display is received (S3004), the requested menu is displayed on the first application execution screen according to the first signal (S3006).

The digital device receives, from the displayed menu, a second signal for selecting one of two or more menu items for varying between the flat screen and the curved screen (S3008), and varies the screen to a curvature corresponding to the menu item selected according to the second signal (S3010). In step S3010, first guide data identifying that the variation is in progress is displayed on the first application execution screen before the variation ends, and second guide data identifying that the variation has ended is displayed on the first application execution screen after the variation ends.

Step S3010 may further include adjusting at least one of picture quality and sound quality set in advance corresponding to the selected menu item.

The method may further include collecting reference data.

Any one of the menu items may be a menu item corresponding to a screen of an optimal curvature based on the collected reference data, and the optimal curvature may be calculated based on the screen size of the digital device, the number of viewers, the viewing distance, and inter-viewer interval data. The reference data may include at least one of an application attribute, an application genre, a user setting, a user, ambient illuminance, ambient noise, the screen size of the digital device, the number of viewers, the viewing distance, and inter-viewer interval data.
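The disclosure does not give the actual formula, but as a hedged illustration, an optimal radius of curvature could grow with both viewing distance and audience spread; the heuristic, function name, and units below are assumptions, not the patented calculation.

```python
import math

def optimal_curvature_radius(viewing_distance_mm: float, num_viewers: int,
                             viewer_spacing_mm: float) -> float:
    """
    Illustrative heuristic only: for one centered viewer, use the viewing
    distance as the radius of curvature; with several viewers, enlarge the
    radius with the half-width of the audience so that off-center viewers
    are not over-angled by the curved screen.
    """
    spread = (num_viewers - 1) * viewer_spacing_mm / 2.0  # half-width of audience
    return math.hypot(viewing_distance_mm, spread)
```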

The method may further include displaying current curvature information, and may further include displaying recommended or optimal curvature information.

The method may further include: receiving a third signal requesting power on or off after the variation; displaying, on the screen according to reception of the third signal, third guide data for changing from the current screen state to the pre-variation screen state or for maintaining the current screen state; varying the screen according to selection of the third guide data; and controlling power on or off according to the third signal.

The method may further include receiving a fourth signal requesting variation control according to wheel movement or pointer movement of the input means, and varying the current screen state according to the fourth signal.

The digital device may display an application or a menu screen other than the first application in at least one of the variable edge areas when the screen is changed from a flat screen to a curved screen.

Referring to FIG. 31, the digital device receives a setting signal for automatically varying between a flat screen and a curved screen (S3102), and receives an application execution request signal (S3104).

The digital device collects reference data for the screen variation (S3106), determines whether or not to vary the screen based on the collected reference data, and varies the screen according to the determination result (S3108).

The digital device displays the execution screen of the application requested to be executed on the varied screen (S3110).
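The flow of FIG. 31 (S3102 to S3110) can be summarized in a short sketch; the helper names and the sample decision rule are assumptions, since the disclosure only states that the decision is based on collected reference data.

```python
# Sketch of the automatic variable flow of FIG. 31. The decision rule here
# (curve for a single-viewer movie) is an invented placeholder.

def run_application(app: dict, auto_variable_enabled: bool) -> str:
    """Return the screen state used to display the requested application."""
    if not auto_variable_enabled:                 # S3102: setting not enabled
        return "flat"
    reference = {"genre": app.get("genre"),       # S3106: collect reference data
                 "viewers": app.get("viewers", 1)}
    # S3108: decide whether to vary; assume movies for one viewer curve best
    if reference["genre"] == "movie" and reference["viewers"] == 1:
        return "curved"                           # S3110: display on curved screen
    return "flat"
```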

According to the various embodiments of the present invention described above, the variable digital device can be controlled easily and conveniently by providing an intuitive UI for variable mode operation control, the convenience of using the variable digital device can be maximized by referring to and using various factors, and the satisfaction with the variable digital device can be further improved, inspiring the desire to purchase the product.

The digital device and the method of processing a service or an application in the digital device equipped with the web OS disclosed in this specification are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that the embodiments can be variously modified.

Meanwhile, the operation method of the digital device disclosed in this specification can be implemented as processor-readable code on a recording medium readable by a processor included in the digital device. The processor-readable recording medium includes all kinds of recording devices in which data readable by the processor is stored. Examples of the processor-readable recording medium include ROM (Read Only Memory), RAM (Random Access Memory), CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and the medium may also be implemented in the form of a carrier wave. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Further, such modifications should not be understood separately from the technical idea of the present invention.

201: network interface unit 202: TCP / IP manager
203: service delivery manager 204: SI decoder
205 demultiplexer 206 audio decoder
207: Video decoder 208:
209: Service Control Manager 210: Service Discovery Manager
211: SI & Metadata Database 212: Metadata Manager
213: service manager 214: UI manager

Claims (20)

A method of processing data in a digital device,
Displaying a first application execution screen on a screen;
Receiving a first signal requesting a menu display;
Displaying the requested menu on the first application execution screen in accordance with the first signal;
Receiving, from the displayed menu, a second signal for selecting one of two or more menu items for varying between a flat screen and a curved screen; And
Varying the screen to a curvature corresponding to the menu item selected according to the second signal,
Wherein said varying comprises:
Displaying first guide data identifying that the variation is in progress on the first application execution screen before the variation ends; And
Displaying second guide data identifying that the variation has ended on the first application execution screen after the variation ends.
The method according to claim 1,
Wherein said varying comprises:
Further comprising adjusting at least one of image quality and sound quality set in advance corresponding to the selected menu item.
3. The method of claim 2,
Further comprising the step of collecting reference data,
Wherein any one of the menu items is a menu item corresponding to a screen of an optimal curvature based on the collected reference data.
The method of claim 3,
Wherein the optimal curvature is calculated based on reference data including the screen size of the digital device, the number of viewers, the viewing distance, and inter-viewer interval data.
The method of claim 3,
Wherein the reference data includes at least one of an attribute of an application, a genre of an application, a user setting, a user, ambient illuminance, ambient noise, the screen size of the digital device, the number of viewers, the viewing distance, and inter-viewer interval data.
The method according to claim 1,
Displaying current curvature information; And
Displaying recommended or optimal curvature information.
The method according to claim 1,
Receiving a third signal requesting power on or off after the variation;
Displaying, on the screen according to reception of the third signal, third guide data for changing from the current screen state to the pre-variation screen state or for maintaining the current screen state; And
Varying the screen according to selection of the third guide data, and controlling power on or off according to the third signal.
The method according to claim 1,
Receiving a fourth signal requesting variation control according to wheel movement or pointer movement of the input means; And
Performing variation control from the current screen state according to the fourth signal.
The method according to claim 1,
Further comprising, when the screen is changed from a flat screen to a curved screen,
displaying an application or a menu screen other than the first application in at least one of the variable edge areas.
A method of processing data in a digital device,
Receiving a setting signal for automatically varying between a flat screen and a curved screen;
Receiving an application execution request signal;
Collecting reference data for the screen variation;
Determining whether or not to vary the screen based on the collected reference data, and varying the screen according to the determination result; And
And displaying an execution screen of the application requested to be executed on the varied screen.
In a digital device,
Memory;
A display unit for displaying a first application execution screen on a screen, and for displaying a requested menu on the first application execution screen;
A user interface unit for receiving a first signal requesting a menu display, and for receiving, from the displayed menu, a second signal for selecting one of two or more menu items for varying between a flat screen and a curved screen; And
And a controller for controlling the screen to vary to a curvature corresponding to the menu item selected according to the second signal,
Wherein the controller controls first guide data identifying that the variation is in progress to be displayed on the first application execution screen before the variation ends, and second guide data identifying that the variation has ended to be displayed on the first application execution screen after the variation ends.
12. The digital device of claim 11,
The controller comprising:
And controls at least one of image quality and sound quality set in advance corresponding to the selected menu item to be adjusted together with the change.
13. The digital device of claim 12,
The controller comprising:
And controls to display one of the menu items as a menu item corresponding to a screen of an optimal curvature based on the collected reference data.
14. The digital device of claim 13,
The controller comprising:
And calculates the optimum curvature based on reference data including the screen size, the number of viewers, the viewing distance, and the interval data between viewers of the digital device.
14. The digital device of claim 13,
Wherein the controller uses, as the reference data, at least one of the attribute of the application, the genre of the application, the user setting, the user, the ambient illuminance, the ambient noise, the screen size of the digital device, the number of viewers, the viewing distance, and the inter-viewer interval data.
12. The digital device of claim 11,
Wherein the controller controls to display current curvature information and recommended or optimal curvature information.
12. The digital device of claim 11,
Wherein the user interface unit further receives a third signal requesting power on or off after the variation,
And the controller controls third guide data for changing from the current screen state to the pre-variation screen state or for maintaining the current screen state to be displayed on the screen according to reception of the third signal, and controls power on or off according to the third signal.
12. The digital device of claim 11,
The user interface unit receives a fourth signal requesting variable control according to a wheel movement or a pointer movement of the input means,
Wherein the controller performs variable control in a current screen state in accordance with the fourth signal.
12. The digital device of claim 11,
Wherein the controller controls, when the screen is changed from a flat screen to a curved screen, an application or a menu screen other than the first application to be displayed in at least one of the variable edge areas.
In a digital device,
Memory;
A user interface unit for receiving a setting signal for automatically changing between a flat screen and a curved screen, and an application execution request signal;
A controller for collecting reference data for screen variation, determining whether or not to vary the screen based on the collected reference data, and controlling the screen variation according to the determination result; And
And a display unit for displaying an execution screen of the application requested to be executed on the variable screen.
KR1020150045320A 2014-12-29 2015-03-31 Flexible digital device and method of processing data the same KR20160080041A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462097136P 2014-12-29 2014-12-29
US62/097,136 2014-12-29

Publications (1)

Publication Number Publication Date
KR20160080041A true KR20160080041A (en) 2016-07-07

Family

ID=56499812

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150045320A KR20160080041A (en) 2014-12-29 2015-03-31 Flexible digital device and method of processing data the same

Country Status (1)

Country Link
KR (1) KR20160080041A (en)

Similar Documents

Publication Publication Date Title
KR101567832B1 (en) Digital device and method for controlling the same
KR102288087B1 (en) Multimedia device and method for controlling the same
KR20160066268A (en) Multimedia device and method for controlling the same
KR101632221B1 (en) Digital device and method for processing service thereof
KR20170031370A (en) Mobile terminal and method for controlling the same
KR20160078204A (en) Digital device and method of processing data the same
KR102381141B1 (en) Display device and method for controlling the same
KR102367882B1 (en) Digital device and method of processing application data thereof
KR20170090102A (en) Digital device and method for controlling the same
KR102396035B1 (en) Digital device and method for processing stt thereof
KR20170087307A (en) Display device and method for controlling the same
KR20170126645A (en) Digital device and controlling method thereof
KR102356780B1 (en) Display device and method for controlling the same
KR102384520B1 (en) Display device and controlling method thereof
KR20170138788A (en) Digital device and controlling method thereof
KR20170092408A (en) Digital device and method for controlling the same
KR102439464B1 (en) Digital device and method for controlling the same
KR102418140B1 (en) Digital device and method for controlling the same
KR102294600B1 (en) Digital device and method for controlling the same
KR102439465B1 (en) Connected device system and method for processing application data the same
KR20170010484A (en) Display device and controlling method thereof
KR20150101902A (en) Digital device and method of controlling thereof
KR102268751B1 (en) Digital device and method for controlling the same
KR20160080041A (en) Flexible digital device and method of processing data the same
KR20160026046A (en) Digital device and method for controlling the same