KR20170090102A - Digital device and method for controlling the same - Google Patents
Digital device and method for controlling the same
- Publication number
- KR20170090102A (application KR1020160010551A)
- Authority
- KR
- South Korea
- Prior art keywords
- screen
- video data
- mobile device
- digital device
- controller
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/43615—Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
Abstract
Description
The present invention relates to a digital device, and more particularly, to a digital device that, with a simple operation, displays only a part of the entire screen of a mobile device with which it is communicating.
Mobile devices such as smartphones and tablet PCs have attracted attention alongside standing (fixed) devices such as personal computers (PCs) and televisions (TVs). Fixed devices and mobile devices originally developed in their own domains, but the boundary between the two has blurred with the recent boom in digital convergence.
In addition, as digital devices advance and their usage environments change, users' expectations rise, and demand grows for various kinds of fast services and applications.
Unlike the application environment of a conventional operating system (OS), in a web OS most content resides on the host system of a remote service provider. A digital device running a web OS must therefore access the service provider's remote host system every time an application is used. This is a disadvantage not only for the initial operations that follow power-on of the digital device, such as system booting, but also for application use after those initial operations.
For example, a problem in the network environment or a limitation of system resources can make system booting or application use inconvenient.
Miracast is a transmission technology that shares a smartphone's screen and audio on a TV when the smartphone and the TV are connected over Wi-Fi. Everything shown on the smartphone, including games, the internet, and applications as well as videos and photos, can then be viewed on the TV.
For example, when a smartphone is mirrored to a TV, the entire smartphone screen is mirrored, so the smartphone screen is shared with everyone around the TV.
In the conventional technology, when a user performs several tasks on a smartphone, the entire smartphone screen is transmitted to the TV, so portions of the screen that the user does not want shown to other users are nevertheless displayed on the TV screen.
It is an object of the present invention to provide a digital device, and a control method thereof, that display only a part of a mobile device's screen when a specific signal is received from the mobile device while the mobile device and the digital device are communicatively connected.
Another object of the present invention is to provide a digital device, and a control method thereof, that display the remaining screen area when part of the mobile device's screen is covered by the user's hand, excluding the area the hand obscures.
Another object of the present invention is to provide a digital device, and a control method thereof, that allow a user to reset the screen area to be shared using a screen division indicator of the mobile device, and that display the reset screen area.
The technical problems to be solved by the present invention are not limited to those mentioned above; other technical problems not mentioned here will be clearly understood by those skilled in the art from the following description.
A digital device according to an embodiment of the present invention includes: a wireless communication module that transmits data to and receives data from a mobile device; a controller that receives, via the wireless communication module, video data displayed on a first screen, which is the screen of the mobile device, displays a part of the received video data on a second screen, which is the screen of the digital device, when a first specific signal is received from the mobile device via the wireless communication module, and displays all of the received video data on the second screen when a second specific signal is received from the mobile device via the wireless communication module; and a display module that displays all or part of the video data on the second screen according to a control command from the controller.
According to another aspect of the present invention, there is provided a method of controlling a digital device, comprising: communicating with a mobile device; receiving, from the mobile device, video data displayed on a first screen, which is the screen of the mobile device; displaying a part of the received video data, corresponding to a first specific signal, on a second screen of the digital device when the first specific signal is received from the mobile device; and displaying all of the video data on the second screen, in response to a second specific signal, when the second specific signal is received from the mobile device.
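The control flow just described, in which the digital device receives the first-screen video data and then shows part or all of it on the second screen depending on which specific signal arrives, can be sketched as follows. The class, the signal format, and the `crop` helper are hypothetical illustrations, not the actual claimed implementation:

```python
class DigitalDevice:
    """Minimal sketch of the claimed control flow (all names hypothetical)."""

    def __init__(self):
        self.second_screen = None  # what the digital device currently displays

    def on_video_data(self, video_data, signal):
        # first specific signal: mirror only the designated region
        if signal["type"] == "partial":
            region = signal["region"]  # e.g. (x, y, width, height)
            self.second_screen = crop(video_data, region)
        # second specific signal: mirror the whole first screen
        elif signal["type"] == "full":
            self.second_screen = video_data


def crop(frame, region):
    """Cut a rectangular region out of a frame given as a 2-D list."""
    x, y, w, h = region
    return [row[x:x + w] for row in frame[y:y + h]]
```

A mobile device sending a "partial" signal would thus keep the rest of its first screen private, while a "full" signal restores ordinary whole-screen mirroring.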
According to an embodiment of the present invention, when a mobile device and a digital device are communicatively connected and a specific signal is received from the mobile device, only a part of the mobile device's screen is displayed. A screen area the user does not want to share is therefore not displayed, which improves user convenience.
According to another embodiment of the present invention, when part of the mobile device's screen is covered by the user's hand, the remaining screen, excluding the area hidden by the hand, is displayed. The user can therefore easily set a screen area not to be shared, which improves user convenience.
According to another embodiment of the present invention, the screen area the user wants to share is reset using a screen division indicator of the mobile device, and the reset screen area is displayed. The user can therefore adjust the first screen area to be shared and the second screen area not to be shared, which improves user convenience.
FIG. 1 is a schematic diagram illustrating a service system including a digital device according to an exemplary embodiment of the present invention.
FIG. 2 is a block diagram illustrating a digital device according to an exemplary embodiment of the present invention.
FIG. 3 is a block diagram illustrating a digital device according to another embodiment of the present invention.
FIG. 4 is a block diagram illustrating a digital device according to another embodiment of the present invention.
FIG. 5 is a block diagram illustrating a detailed configuration of the control unit of FIGS. 2 to 4 according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating input means coupled to the digital device of FIGS. 2 to 4 according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating a web OS architecture according to an embodiment of the present invention.
FIG. 8 is a diagram illustrating the architecture of a web OS device according to an exemplary embodiment of the present invention.
FIG. 9 is a diagram illustrating a graphic composition flow in a web OS device according to an embodiment of the present invention.
FIG. 10 is a diagram illustrating a media server according to an embodiment of the present invention.
FIG. 11 is a block diagram illustrating a configuration of a media server according to an embodiment of the present invention.
FIG. 12 is a diagram illustrating a relationship between a media server and a TV service according to an embodiment of the present invention.
FIG. 13 is a configuration diagram of a digital device according to an embodiment of the present invention.
FIG. 14 is a flowchart of a method of controlling a digital device according to an embodiment of the present invention.
FIG. 15 is a diagram illustrating screen mirroring of the remaining area, excluding the obscured area, when the screen is covered with the palm of the hand, according to an embodiment of the present invention.
FIG. 16 is a diagram illustrating setting of a screen mirroring area using a screen division indicator according to an embodiment of the present invention.
FIG. 17 is a diagram illustrating screen size adjustment when a volume button of a smartphone is pressed according to an embodiment of the present invention.
FIG. 18 is a diagram illustrating black-screen processing of the portion outside the shared smartphone screen area in a mirroring state according to an exemplary embodiment of the present invention.
FIG. 19 is a diagram illustrating automatic adjustment and display of video data when the screen of a mobile device according to an exemplary embodiment of the present invention is a specific content screen and a vertical (portrait) screen.
FIG. 20 is a diagram illustrating display of video data based on resolution information of a mobile device according to an embodiment of the present invention.
FIG. 21 is a diagram illustrating a mobile device operating in association with a digital device according to an embodiment of the present invention.
FIG. 22 is a diagram illustrating a control method of a mobile device operating in conjunction with a digital device according to an embodiment of the present invention.
FIG. 23 is a diagram illustrating a mobile device according to an embodiment of the present invention transmitting video data and a black-screen generation signal to a digital device.
FIG. 24 is a diagram illustrating setting of a shared screen area with a specific gesture of a user according to an embodiment of the present invention.
Hereinafter, various embodiments of a digital device according to the present invention, and of a method of processing application data in the digital device, will be described in detail with reference to the drawings.
The suffixes "module", "part", and the like for components used in this specification are given only for ease of description, and the two may be used interchangeably as needed. Also, even when components are described with ordinal numbers such as "first" and "second", they are not limited by such terms or ordinal numbers.
In addition, although the terms used in this specification have been selected from general terms in wide use, in consideration of the functions involved in the technical idea of the present invention, their meanings may vary according to the intentions or customs of those skilled in the art or the emergence of new technology. In certain cases, some terms have been arbitrarily selected by the applicant; these are explained in the relevant parts of the description. Accordingly, the terms used herein should be interpreted based not simply on their names but on their practical meanings and on the contents described throughout this specification.
It is to be noted that the contents of the present specification and/or drawings are not intended to limit the scope of the present invention.
The term "digital device" as used herein refers to any device that performs at least one of transmitting, receiving, processing, and outputting data, content, services, applications, and the like. The digital device can be paired or connected (hereinafter, "paired") with another digital device, an external server, or the like through a wired/wireless network, and predetermined data can be transmitted/received through it. At this time, the data may be appropriately converted before transmission/reception, if necessary. Digital devices include standing (fixed) devices such as a network TV, a Hybrid Broadcast Broadband TV (HBBTV), a smart TV, an IPTV (Internet Protocol TV), and a PC (Personal Computer), and mobile or handheld devices such as a PDA (Personal Digital Assistant), a smartphone, a tablet PC, and a notebook computer. To facilitate understanding and description of the present invention, FIG. 2, described later, illustrates a digital TV, and FIG. 3 illustrates a mobile device, as embodiments of a digital device. In addition, the digital device described in this specification may be a configuration having only a panel, or a set configuration together with, for example, a set-top box (STB), another device, or a system.
The term "wired/wireless network" as used herein collectively refers to communication networks that support the various communication standards or protocols used for pairing and/or data transmission/reception between digital devices, or between a digital device and an external server. Such wired/wireless networks include all communication networks supported by standards now or in the future, and can support one or more communication protocols for them. Examples include standards or protocols for wired connections, such as USB (Universal Serial Bus), CVBS (Composite Video Banking Sync), Component, S-Video (analog), DVI (Digital Visual Interface), HDMI (High Definition Multimedia Interface), RGB, and D-SUB, and standards or protocols for wireless connections, such as Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, DLNA (Digital Living Network Alliance), WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), LTE/LTE-Advanced (Long Term Evolution/LTE-Advanced), and Wi-Fi Direct.
In addition, when this specification refers simply to a digital device, the term may mean a fixed device or a mobile device depending on the context, and may mean both unless specifically stated otherwise.
Meanwhile, a digital device is an intelligent device that supports, for example, a broadcast receiving function, a computer function, and at least one external input, and it can support e-mail, web browsing, banking, games, applications, and so on through a wired/wireless network. In addition, the digital device may include an interface for at least one input or control means (hereinafter, "input means"), such as a handwriting input device or a touch screen.
In addition, the digital device can use a standardized general-purpose OS (Operating System); the digital device described in this specification, in particular, uses a web OS as an embodiment. The digital device can therefore add, delete, amend, and update various services or applications on a general-purpose OS kernel or a Linux kernel, through which a more user-friendly environment can be constructed and provided.
Meanwhile, the above-described digital device can receive and process an external input. Here, the external input refers to an external input device, that is, an input means or digital device connected to the digital device through a wired/wireless network, through which data can be transmitted/received and processed. Examples of the external input include an HDMI (High Definition Multimedia Interface) device, game devices such as a PlayStation or an X-Box, a smartphone, a tablet PC, pocket photo devices such as a digital camera, a printing device, a smart TV, and a Blu-ray device.
In addition, the term "server" as used herein refers to a digital device or system that supplies data to, or receives data from, a client digital device, and may also be referred to as a processor. Examples of servers include a web server or portal server providing web pages, web content, or web services, an advertising server providing advertising data, a content server providing content, an SNS server providing a social network service (SNS), a service server provided by a manufacturer, a VoD (Video on Demand) server, and a service server of an MVPD (Multichannel Video Programming Distributor) providing streaming services, pay services, and the like.
In addition, in the following description, for convenience of explanation, only applications are mentioned; depending on the context, the meaning may include not only applications but also services. In addition, an application may refer to a web application according to the web OS platform.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a schematic diagram illustrating a service system including a digital device according to an exemplary embodiment of the present invention.
1, a service system includes a
The
The
The
The above-described
The
The
Meanwhile, the
In addition, the
In FIG. 1, the
FIG. 2 is a block diagram illustrating a digital device according to an exemplary embodiment of the present invention.
The digital device described herein corresponds to the
The
The
The TCP /
The
The
The
The audio /
The application manager may include, for example, the
The
The
The
The
The
The
The SI &
The SI &
Meanwhile, the
FIG. 3 is a block diagram illustrating a digital device according to another embodiment of the present invention.
While FIG. 2 described above takes a fixed device as an example of a digital device, FIG. 3 shows a mobile device as another embodiment of a digital device.
3, the
Hereinafter, each component will be described in detail.
The
The
The broadcast-related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the
The broadcast-related information may exist in various forms, for example, in the form of an EPG (Electronic Program Guide) or an ESG (Electronic Service Guide).
The
The broadcast signal and / or broadcast related information received through the
The
The
The short-
The
The A /
The image frame processed by the
The
The
The
The
The
The
Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the
There may be two or
The
The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the
If there is a touch input to the touch sensor, the corresponding signal(s) is sent to the touch controller. The touch controller processes the signal(s) and transmits corresponding data to the
A
Examples of the proximity sensor include a transmissive photoelectric sensor, a direct-reflection photoelectric sensor, a mirror-reflection photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is electrostatic, it is configured to detect the proximity of the pointer from the change in the electric field as the pointer approaches. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
Hereinafter, for convenience of explanation, the act of positioning the pointer over the touch screen without contact, so that it is recognized as located on the touch screen, is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch on the touch screen is the position at which the pointer vertically corresponds to the touch screen when the pointer makes the proximity touch.
The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
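As a rough illustration of the touch-controller step described above, in which raw sensor changes are converted into a touch position the controller can consume, consider the following sketch. The 2-D grid of sensor deltas and the threshold value are assumptions for illustration, not the actual hardware interface:

```python
def locate_touch(cap_grid, threshold=50):
    """Return the (row, col) of the strongest capacitance change above
    the threshold, or None when nothing is touching.
    cap_grid holds per-cell sensor deltas (a hypothetical format)."""
    best, pos = threshold, None
    for r, row in enumerate(cap_grid):
        for c, value in enumerate(row):
            if value > best:
                best, pos = value, (r, c)
    return pos
```

A real touch controller would additionally debounce, interpolate between cells for sub-cell accuracy, and report pressure where the panel supports it.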
The
The
The
The
The
The
The identification module is a chip for storing various information for authenticating the usage right of the
The
The
The
The various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the embodiments described herein may be implemented by the
According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code may be implemented in a software application written in a suitable programming language. Here, the software code is stored in the
FIG. 4 is a block diagram illustrating a digital device according to another embodiment of the present invention.
Another example of the
The
For example, if the received RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF); if it is an analog broadcast signal, it is converted into an analog baseband video or audio signal (CVBS/SIF). That is, the
In addition, the
The
The
The stream signal output from the
The external
The external
The A/V input/output unit may include a USB terminal, a CVBS (Composite Video Banking Sync) terminal, a component terminal, an S-Video (analog) terminal, a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like.
The wireless communication unit can perform short-range wireless communication with another digital device. The
Also, the external
Meanwhile, the external
The
The
Meanwhile, the
In addition, the
The
The
The
In addition, the
The
FIG. 4 illustrates an embodiment in which the
The user
For example, the user
In addition, the user
The user
The
The video signal processed by the
The audio signal processed by the
Although not shown in FIG. 4, the
The
The
For example, the
The
On the other hand, the
In addition, the
On the other hand, when entering the application view item, the
The
Although not shown in the drawing, a channel browsing processing unit for generating a channel signal or a thumbnail image corresponding to an external input signal may be further provided.
The channel browsing processing unit receives a stream signal TS output from the
The
The
Meanwhile, the
The
In order to detect the gesture of the user, a sensing unit (not shown) having at least one of a touch sensor, a voice sensor, a position sensor, and an operation sensor may be further included in the
On the other hand, a photographing unit (not shown) for photographing a user may be further provided. The image information photographed by the photographing unit (not shown) may be input to the
The
The
Particularly, it is possible to supply power to a
To this end, the
The
Also, the
The
In addition, the digital device according to the present invention may further include a configuration that omits some of the configuration shown in FIG. On the other hand, unlike the above, the digital device does not have a tuner and a demodulator, and can receive and reproduce the content through the network interface unit or the external device interface unit.
FIG. 5 is a block diagram illustrating a detailed configuration of the control unit of FIGS. 2 to 4 according to an embodiment of the present invention.
An example of the control unit includes a
The
The
The video decoder 425 decodes the demultiplexed video signal, and the
The
On the other hand, the video signal decoded by the
The
The
A frame rate conversion unit (FRC) 555 converts a frame rate of an input image. For example, the frame
The
Meanwhile, an audio processing unit (not shown) in the control unit can process the demultiplexed audio signal. Such an audio processing unit may support processing of various audio formats. For example, even when an audio signal is encoded in a format such as MPEG-2, MPEG-4, AAC, HE-AAC, AC-3, or BSAC, a corresponding decoder can be provided.
In addition, the audio processing unit (not shown) in the control unit can process bass, treble, volume control, and the like.
A data processing unit (not shown) in the control unit can process the demultiplexed data signal. For example, the data processing unit can decode the demultiplexed data signal even when it is encoded. Here, the encoded data signal may be EPG information including broadcast information such as the start time and end time of the broadcast programs aired on each channel.
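The frame rate conversion unit (FRC) mentioned above changes the frame rate of the input image. In its simplest form, this can be done by frame repetition, as in the sketch below; real FRC hardware typically also interpolates intermediate frames. The function and its signature are illustrative assumptions, not the device's actual implementation:

```python
def convert_frame_rate(frames, src_hz, dst_hz):
    """Naive frame-repetition FRC: repeat each input frame so that the
    output sequence approximates dst_hz from a src_hz input."""
    out = []
    for i, frame in enumerate(frames):
        # number of output slots this input frame should occupy
        start = i * dst_hz // src_hz
        end = (i + 1) * dst_hz // src_hz
        out.extend([frame] * (end - start))
    return out
```

For a 24 Hz input converted to 60 Hz, this repetition naturally produces the familiar 3:2 cadence (frames alternately held for three and two output slots).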
On the other hand, the above-described digital device is an example according to the present invention, and each component can be integrated, added, or omitted according to specifications of a digital device actually implemented. That is, if necessary, two or more components may be combined into one component, or one component may be divided into two or more components. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and devices thereof do not limit the scope of rights of the present invention.
Meanwhile, the digital device may be a video signal processing device that processes a signal of an image stored in the device or of an input image. Other examples of the video signal processing device include a set-top box (STB), a DVD player, a Blu-ray player, a game device, a computer, and the like.
FIG. 6 is a diagram illustrating input means coupled to the digital device of FIGS. 2 through 4 according to one embodiment of the present invention.
A front panel (not shown) or a control means (input means) provided on the
The control means includes a
The input means may employ at least one of communication protocols such as Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, and DLNA (Digital Living Network Alliance), as needed, to communicate with the digital device.
The
The
Since the
On the other hand, the control means such as the
The digital device described in this specification uses the web OS as its OS and/or platform. Hereinafter, web OS-based processing, such as configurations or algorithms, can be performed by the control unit of the above-described digital device. Here, the control unit is used as a broad concept that includes the control units of FIGS. 2 to 5 described above. Accordingly, in the following, the hardware and components, including related software and firmware, that process web OS-related services, applications, and content in the digital device are named and explained as a controller.
Such a web OS-based platform is intended to enhance development independence and functional extensibility by integrating services, applications, and the like based on, for example, a Luna-service bus, and can increase application development productivity. Multitasking can also be supported by efficiently utilizing system resources through web OS process and resource management.
Meanwhile, the web OS platform described in this specification can be used not only in fixed devices such as a PC, a TV, and a set-top box (STB), but also in mobile devices such as mobile phones, smartphones, tablet PCs, notebook computers, and wearable devices.
The software structure of digital devices has evolved from a monolithic structure: a single process and closed products based on multi-threading, with problem solving and deployment handled in conventional, market-dependent ways, and with difficulty supporting external applications. Since then, development has pursued new platform-based approaches, cost innovation through chipset replacement, and efficiency in UI application and external application development, moving to a layered structure through layering and componentization, together with an add-on structure for add-ons, single-source products, and open applications. More recently, the software architecture has provided a modular architecture in functional units, a web open API (Application Programming Interface) for its ecosystem, and a native open API for, for example, a game engine, and a multi-process structure based on a service structure is thus being created.
FIG. 7 is a diagram illustrating a web OS architecture according to an embodiment of the present invention.
Referring to FIG. 7, the architecture of the Web OS platform will be described as follows.
The platform can be largely divided into a kernel, a system library based Web OS core platform, an application, and a service.
The architecture of the Web OS platform is a layered structure, with the OS at the lowest layer, the system library(s) in the next layer, and applications at the top.
First, the lowest layer includes a Linux kernel as an OS layer, and can include Linux as an OS of the digital device.
Above the OS layer are a BSP (Board Support Package)/HAL (Hardware Abstraction Layer) layer, a Web OS core modules layer, a service layer, a Luna-Service Bus layer, and an Enyo framework/NDK (Native Developer's Kit)/QT layer, with the application layer at the uppermost layer.
Meanwhile, some layers of the above-described web OS layer structure may be omitted, and a plurality of layers may be one layer, or one layer may be a plurality of layer structures.
The Web OS core module layer may include an LSM (Luna Surface Manager) for managing surface windows and the like, a SAM (System & Application Manager) for managing the execution and execution status of applications, and a WAM (Web Application Manager), based on WebKit, for managing web applications and the like.
The LSM manages the application windows displayed on the screen. The LSM manages the display hardware (Display HW), provides a buffer for rendering the contents necessary for applications, and composes and outputs the rendering results of a plurality of applications to the screen.
The SAM manages various conditional execution policies of the system and the application.
The WAM, on the other hand, is based on the Enyo framework, which can be regarded as the basic framework for web applications.
An application uses a service via the luna-service bus: a new service can be registered on the bus, and an application can find and use the service it needs.
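The register-then-find pattern described above can be sketched with a toy in-memory bus; the bus class, service URI, and payload fields below are illustrative stand-ins, not the actual luna-service bus API:

```python
class LunaServiceBus:
    """Minimal in-memory stand-in for the luna-service bus (illustrative only)."""
    def __init__(self):
        self._services = {}

    def register(self, uri, handler):
        # A new service registers itself on the bus under a URI-like name.
        self._services[uri] = handler

    def call(self, uri, payload):
        # An application finds the service it needs by name and calls it.
        if uri not in self._services:
            return {"returnValue": False, "errorText": "service not found"}
        return self._services[uri](payload)

bus = LunaServiceBus()

# A hypothetical audio service registers one method on the bus.
bus.register("luna://com.example.audio/setVolume",
             lambda p: {"returnValue": True, "volume": p["volume"]})

reply = bus.call("luna://com.example.audio/setVolume", {"volume": 7})
```

An application never links against the service directly; it only knows the bus and the service name, which is what gives the platform its development independence.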
The service layer may include various service-level services such as the TV service and Web OS services. Meanwhile, the Web OS services may include a media server, Node.js, and the like. In particular, the Node.js service supports, for example, JavaScript.
Web OS services communicate via the bus with a Linux process that implements their function logic, and can be divided into four parts: a TV process and services migrated from the existing TV to the Web OS, services differentiated for each manufacturer, manufacturer-common services, and JavaScript-based services used through Node.js.
The application layer may include all applications that can be supported in a digital device, such as a TV application, a showcase application, a native application, a web application, and the like.
An application on the Web OS can be divided into a Web application, a PDK (Palm Development Kit) application, a QT (Qt Meta Language or Qt Modeling Language) application and the like depending on an implementation method.
The web application is based on the WebKit engine and is executed on the WAM runtime. Such web applications may be based on the Enyo framework, or may be developed and implemented based on plain HTML5, CSS (Cascading Style Sheets), and JavaScript.
The PDK application includes a native application developed in C/C++ based on a PDK provided for third-party or external developers. The PDK refers to a development library and tool set provided so that a third party, such as a game developer, can develop a native application in C/C++. For example, PDK applications can be used to develop applications where performance is critical.
The QML application is a Qt-based native application and includes basic applications provided with the Web OS platform, such as a card view, a home dashboard, and a virtual keyboard. Here, QML is a script-style mark-up language rather than C++.
In the meantime, a native application is an application developed and compiled in C/C++ and executed in binary form, and thus has a high execution speed.
FIG. 8 is a diagram illustrating an architecture of a Web OS device according to an exemplary embodiment of the present invention.
FIG. 8 is a block diagram based on the runtime of the Web OS device, which can be understood with reference to the layered structure of FIG. 7.
The following description will be made with reference to FIGS. 7 and 8.
Referring to FIG. 8, services and applications and WebOS core modules are included on the system OS (Linux) and system libraries, and communication between them can be done via the Luna-Service bus.
Services such as e-mail, contact, and calendar (Node.js services based on HTML5, CSS, and JavaScript), logging, backup, file notify, database (DB), activity manager, system policy, audio daemon (AudioD), update, and media server; TV services such as EPG (Electronic Program Guide), PVR (Personal Video Recorder), data broadcasting, voice recognition, Now on, Notification, and search; CP services such as ACR (Auto Content Recognition), CBOX (Contents List Browser), wfdd, DMR, Remote Application, download, and SPDIF (Sony Philips Digital Interface Format); PDK applications; browser and QML applications; and Enyo-framework-based TV UI-related applications and web applications are processed via the luna-service bus through the aforementioned Web OS core modules such as the SAM, WAM, and LSM. Meanwhile, in the above, the TV applications and web applications are not necessarily Enyo-framework-based or UI-related.
CBOX can manage the list and metadata of external device contents such as USB, DLNA, cloud etc. connected to TV. Meanwhile, the CBOX can output a content listing of various content containers such as a USB, a DMS, a DVR, a cloud, etc. to an integrated view. In addition, CBOX can display various types of content listings such as pictures, music, video, and manage the metadata. In addition, the CBOX can output the contents of the attached storage in real-time. For example, when a storage device such as a USB is plugged in, the CBOX must be able to immediately output the content list of the storage device. At this time, a standardized method for processing the content listing may be defined. In addition, CBOX can accommodate various connection protocols.
The SAM is intended to reduce module complexity and improve scalability. For example, the existing system manager handled various functions such as the system UI, window management, web application runtime, and UX constraint processing in a single process; the SAM separates these main functions to resolve the large implementation complexity, and clarifies the interfaces between them.
The LSM supports independent development and integration of system UX implementations such as the card view and launcher, and supports easy modification to product requirements. When synthesizing a plurality of application screens, the LSM can make multi-tasking possible by utilizing hardware resources (HW resources), and can provide a window management mechanism for multiple windows.
The LSM supports the implementation of a system UI based on QML and improves development productivity. Based on MVC, QML UX can easily construct views of layouts and UI components, and code for handling user input can be developed easily. Meanwhile, the interface between QML and the Web OS components is made via a QML extension plug-in, and the graphic operations of an application may be based on the Wayland protocol, luna-service calls, and the like.
LSM is an abbreviation of Luna Surface Manager, as described above, which functions as an Application Window Compositor.
The LSM synthesizes independently generated applications, UI components, and the like on the screen and outputs them. In this regard, when components such as a recents application, showcase application, or launcher application render their own contents, the LSM, as a compositor, defines the output area, interworking method, and so on. In other words, the compositor LSM handles graphics synthesis, focus management, input events, and the like. At this time, the LSM receives events, focus, and the like from an input manager. Such input managers may include HIDs such as a remote controller, a mouse and keyboard, a joystick, a game pad, an application remote, and a pen touch.
As such, the LSM supports a multiple-window model, so that the system UI functions can be performed simultaneously in all applications. In this regard, the LSM can support various functions such as the launcher, recents, setting, notification, system keyboard, volume UI, search, finger gesture, voice recognition (STT (Sound to Text), TTS (Text to Speech), NLP, etc.), pattern gesture (camera, MRCU (Mobile Radio Control Unit)), live menu, ACR (Auto Content Recognition), and the like.
FIG. 9 is a diagram illustrating a graphic composition flow in a Web OS device according to an embodiment of the present invention.
Referring to FIG. 9, graphic composition processing may be performed via a web application manager functioning as a UI process, a WebKit functioning as a web process, the LSM, and a graphics manager (GM).

When web application-based graphic data (or an application) is generated as a UI process in the web application manager, the generated graphic data is delivered to the LSM if it is not a full-screen application.

The LSM transmits the received UI application to a Wayland compositor via a Wayland surface, and the Wayland compositor processes it appropriately and delivers it to the graphics manager. The graphic data delivered from the LSM is passed, for example, to the graphics manager compositor via the LSM GM surface of the graphics manager.

On the other hand, a full-screen application is passed directly to the graphics manager without going through the LSM, and is processed by the graphics manager compositor via the WAM GM surface.
The graphics manager processes all graphic data in the Web OS device: it receives not only the data passed through the LSM GM surface and the data passed through the WAM GM surface described above, but also graphic data passed through GM surfaces for, for example, a data broadcasting application or a caption application, and processes all of the received graphic data appropriately for on-screen output. Here, the function of the GM compositor is the same as or similar to that of the compositor described above.
FIG. 10 is a view for explaining a media server according to an embodiment of the present invention, FIG. 11 is a configuration block diagram of a media server according to an embodiment of the present invention, and FIG. 12 is a diagram illustrating a relationship between a media server and a TV service according to an embodiment of the present invention.
The media server supports the execution of various multimedia in the digital device and manages the necessary resources. The media server can efficiently use the hardware resources required for media playback. For example, the media server requires audio/video hardware resources for multimedia execution and can manage the resource usage status to utilize the resources efficiently. In general, a fixed device having a larger screen than a mobile device needs more hardware resources to execute multimedia, and a large amount of data must be encoded/decoded and transmitted at high speed. Meanwhile, in addition to streaming and file-based playback, the media server performs tasks such as broadcasting, recording, and tuning, recording simultaneously with viewing, and simultaneously displaying the sender and recipient screens during a video call. However, hardware resources such as the encoder, decoder, tuner, and display engine are limited, so it is difficult for the media server to execute a plurality of tasks at the same time.
The media server can be robust in terms of system stability because, for example, a pipeline in which an error occurs during media playback can be removed and restarted on a per-pipeline basis, so that the error does not affect other media playback. Such a pipeline is a chain that links unit functions such as decoding, analysis, and output when media playback is requested, and the necessary unit functions may vary according to the media type and the like.
The media server may have extensibility, for example, adding a new type of pipeline without affecting existing implementations. As an example, the media server may accommodate a camera pipeline, a video conference (Skype) pipeline, a third-party pipeline, and the like.
The media server can process general media playback and TV task execution as separate services because the interface of the TV service differs from that of media playback. The media server supports operations such as 'setchannel', 'channelup', 'channeldown', 'channeltuning', and 'recordstart' in relation to the TV service, and operations such as 'play', 'pause', and 'stop' in relation to general media playback; it can thus support different operations for the two and process them as separate services.
The media server can control or integrally manage the resource management function. The allocation and recall of hardware resources in the device are performed integrally by the media server; in particular, the TV service process transfers its running tasks and resource allocation status to the media server. Each time media is executed, the media server secures resources and executes the pipeline, permitting execution according to a priority (e.g., policy) of each media execution request based on the resource status occupied by each pipeline, and performing resource recall from other pipelines as needed. Here, predefined execution priorities and the resource information necessary for a specific request are managed by the policy manager, and the resource manager can communicate with the policy manager to process resource allocation, recall, and the like.
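The allocate-or-recall behavior described here can be sketched as follows; the resource type (decoders), counts, and priority values are assumptions for illustration, not the actual resource/policy manager interface:

```python
class ResourceManager:
    """Illustrative sketch: allocate a limited resource, recalling
    lower-priority pipelines when a higher-priority request arrives."""
    def __init__(self, total_decoders):
        self.total = total_decoders
        self.allocations = {}  # pipeline_id -> (count, priority)

    def used(self):
        return sum(c for c, _ in self.allocations.values())

    def request(self, pipeline_id, count, priority):
        # Recall (release) lower-priority pipelines until the request fits.
        while self.used() + count > self.total:
            victims = [(p, pr) for p, (c, pr) in self.allocations.items()
                       if pr < priority]
            if not victims:
                return False  # policy denies the request
            victim = min(victims, key=lambda v: v[1])[0]
            del self.allocations[victim]  # resource recall
        self.allocations[pipeline_id] = (count, priority)
        return True

rm = ResourceManager(total_decoders=2)
rm.request("watch", count=1, priority=5)
rm.request("record", count=1, priority=3)
granted = rm.request("videocall", count=1, priority=9)  # recalls "record"
```

The policy comparison here is the single `pr < priority` test; a real policy manager would also consult per-task resource requirements.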
The media server may have an identifier (ID) for every playback-related operation. For example, the media server may issue a command indicating a particular pipeline based on its identifier. For the playback of two or more media, the media server may separate them into two pipelines.
The media server may be responsible for playback of the
In addition, the media server may handle the TV pipeline as a separate service process in accordance with the TV restructuring scope. The media server can be designed regardless of the TV restructuring scope; if the TV is not a separate service process, the entire TV may need to be re-executed when a problem occurs in a specific task.
The media server is also referred to as uMS, i.e., a micro media server. Here, the media player corresponds to a media client, which may mean, for example, a WebKit for an HTML5 video tag, a camera, a TV, Skype, a second screen, and the like.
In the media server, micro resource management through the resource manager, policy manager, and the like is a core function. In this regard, the media server also controls the playback control role for web standard media content, and may manage pipeline controller resources.
Such a media server supports, for example, extensibility, reliability, efficient resource usage, and the like.
In other words, the uMS or media server comprehensively manages and controls the use of resources within the Web OS device, such as for cloud games, MVPD (pay service, etc.), camera preview, second screen, and Skype, so as to enable efficient use and appropriate processing. Meanwhile, each resource is used through, for example, a pipeline, and the media server can integrally manage and control the generation, deletion, and use of pipelines for resource management.
Here, a pipeline is generated, for example, when media related to a task starts a series of operations such as request parsing, decoding of a stream, and video output. For example, in relation to a TV service or application, watching, recording, channel tuning, and the like are each handled individually, with resource usage controlled through a pipeline generated according to the corresponding request.
The processing structure and the like of the media server will be described in more detail with reference to FIG. 10.
In FIG. 10, an application or service is connected to the media server 1020 via the luna-service bus 1010, and the media server 1020 is connected, again via the luna-service bus, to the pipelines it generates, and manages them.
The application or service can have various clients depending on its characteristics and can exchange data with the media server 1020 or the pipeline through it.
The client includes, for example, a uMedia client (web kit) and a RM (resource manager) client (C / C ++) for connection with the media server 1020.
The application including the uMedia client is connected to the media server 1020, as described above. More specifically, the uMedia client corresponds to, for example, a video object to be described later, and the client uses the media server 1020 for the operation of video by a request or the like.
Here, video operations relate to video states, and may include state data for loading, unloading, play (or playback/reproduce), pause, stop, and the like. Each such video operation or state can be processed through individual pipeline generation. Accordingly, the uMedia client sends the state data related to the video operation to the pipeline manager 1022 in the media server.
The pipeline manager 1022 obtains information on the resources of the current device through data communication with the resource manager 1024 and requests resource allocation corresponding to the state data of the uMedia client. In this connection, the pipeline manager 1022 or the resource manager 1024 controls resource allocation through data communication with the policy manager 1026 when necessary. For example, when there is no resource to allocate in response to a request of the pipeline manager 1022, the resource manager 1024 may perform appropriate resource allocation or the like according to a priority comparison by the policy manager 1026.
Meanwhile, the pipeline manager 1022 requests the media pipeline controller 1028 to generate a pipeline for the operation requested by the uMedia client, with respect to the resources allocated by the resource manager 1024.
The media pipeline controller 1028 generates the necessary pipelines under the control of the pipeline manager 1022. As shown, the generated pipelines may include not only media pipelines and camera pipelines but also pipelines related to playback, pause, suspension, and the like. The pipelines may include pipelines for HTML5, web CP, Smart Share playback, thumbnail extraction, NDK, cinema, MHEG (Multimedia and Hypermedia Information Coding Experts Group), and the like.
In addition, the pipeline may include, for example, a service-based pipeline (its own pipeline) and a URI-based pipeline (media pipeline).
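A minimal sketch of the controller creating type-specific pipelines is shown below; the pipeline types and their unit-function chains are simplified placeholders for the pipelines listed above:

```python
# Illustrative: each pipeline chains unit functions (e.g. parse -> decode ->
# render), and the chain depends on the media type, as described above.
PIPELINE_STAGES = {
    "media":     ["parse", "decode", "render"],
    "camera":    ["capture", "encode", "preview"],
    "thumbnail": ["parse", "decode", "scale"],
}

class MediaPipelineController:
    """Creates one pipeline per request and tracks it by a generated ID."""
    def __init__(self):
        self._next_id = 0
        self.pipelines = {}

    def create(self, kind):
        stages = PIPELINE_STAGES[kind]  # unit functions vary per media type
        self._next_id += 1
        pid = f"{kind}-{self._next_id}"
        self.pipelines[pid] = stages
        return pid

ctl = MediaPipelineController()
pid = ctl.create("media")
```

The per-pipeline ID mirrors the identifier-based command addressing mentioned earlier: a command can name exactly one pipeline.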
Referring to FIG. 10, an application or service including an RM client may not be directly connected to the media server 1020, because the application or service may process the media directly. In other words, if the application or service processes the media directly, it may bypass the media server. In this case, however, the uMS connector is needed to manage resources for pipeline creation and use. Upon receiving a resource management request for direct media processing by the application or service, the uMS connector communicates with the media server 1020, including the resource manager 1024. To this end, the media server 1020 also needs to have a uMS connector.
Accordingly, the application or service can respond to the request of the RM client by receiving resource management from the resource manager 1024 through the uMS connector. Such RM clients can handle services such as native CP, TV service, second screen, Flash player, YouTube MSE (Media Source Extensions), cloud gaming, and Skype. In this case, as described above, the resource manager 1024 can manage resources appropriately through data communication with the policy manager 1026 when necessary.
On the other hand, a URI-based pipeline is performed through the media server 1020 instead of processing the media directly as in the RM client case described above. Such URI-based pipelines may include a player factory, GStreamer, a streaming plug-in, a DRM (Digital Rights Management) plug-in pipeline, and the like.
On the other hand, the interface method between application and media services may be as follows.
First, there is a method of interfacing with a service in a web application: using a Luna call through the Palm Service Bridge (PSB), or using Cordova, which extends the display with video tags. In addition, there may be a method of using the HTML5 standard for video tags or media elements.

Second, there is a method of interfacing with a service using the PDK.

Third, there is a method of using the service from an existing CP, which can extend existing platform plug-ins based on Luna for backward compatibility.

Finally, there is a method of interfacing in a non-Web OS case. In this case, the Luna bus can be called directly.
Seamless change is handled by a separate module (e.g., TVWIN), which is a process for displaying TV on the screen first, without the Web OS, before or during Web OS boot. Because the boot time of the Web OS is long, this module is used to provide the basic functions of the TV service first, for quick response to the user's power-on request. The module is part of the TV service process and supports quick boot and seamless change, which provide basic TV functions, as well as factory mode. The module may also switch from the non-Web OS mode to the Web OS mode.
Referring to FIG. 11, a processing structure of a media server is shown.
In FIG. 11, a solid-line box represents a process, and a dashed box represents an internal processing module within a process. A solid-line arrow represents an inter-process call, that is, a luna-service call, and a dashed arrow represents a notification, such as register/notify, or a data flow.
A service or a web application or a PDK application (hereinafter 'application') is connected to various service processing components via a luna-service bus, through which an application is operated or controlled.
The data processing path differs depending on the type of application. For example, when the application handles image data from the camera sensor, the image data is transmitted to a camera processing unit and processed there.
Alternatively, when the application includes audio data, the audio can be processed through the audio processing unit (AudioD) 1140 and the audio module (PulseAudio) 1150. For example, the audio processing unit 1140 processes the audio data received from the application and transmits it to the audio module 1150.
Alternatively, when the application includes or processes (includes) DRM-attached content, the DRM service processing unit 1170 transmits the content data to the DRM
Hereinafter, processing in the case where the application is media data or TV service data (e.g., broadcast data) will be described.
FIG. 12 shows only the media server processing unit and the TV service processing unit in FIG. 11 described above in more detail.
Therefore, the following description will be made with reference to FIGS. 11 and 12.
First, when the application includes TV service data, it is processed by the TV service processing unit.
The TV
In this specification, the TV service processing unit may have a configuration as shown in Fig. 11 or 12, or may be implemented by a combination of both, in which some configurations are omitted or some configurations not shown may be added.
The TV
Alternatively, a JSON (JavaScript Object Notation) file or a file written in C is processed by the TV broadcast handler and transmitted to the TV pipeline manager through the TV broadcast interface to generate and process a TV pipeline. In this case, the TV broadcast interface unit may transmit the data or file that has passed through the TV broadcast handler to the TV pipeline manager based on the TV service policy, for reference when creating the pipeline.
On the other hand, the TV pipeline manager can be controlled by the TV resource manager in generating one or more pipelines according to a TV pipeline creation request from a processing module in a TV service, a manager, or the like. Meanwhile, the TV resource manager can be controlled by the TV policy manager to request the status and allocation of resources allocated for the TV service according to the TV pipeline creation request of the TV pipeline manager, and the media
Meanwhile, the BSP supports, for example, backward compatibility with existing digital devices.
The TV pipelines thus generated can be appropriately operated according to the control of the path manager in the process. The path manager can determine and control the processing path or process of the pipelines by considering not only the TV pipeline but also the operation of the pipeline generated by the media
Next, when the application includes media data, rather than TV service data, it is processed by the media
In this specification, the resource manager in the media
FIG. 13 is a configuration diagram of a digital device according to an embodiment of the present invention.
Referring to FIG. 13, the configuration of the digital device 1300 will be described.
The
The
The IR
The
The
When the
When the
The
The
The
FIG. 14 is a flowchart of a method of controlling a digital device according to an embodiment of the present invention. The method is performed by the controller of the digital device.
First, a communication connection is established with the mobile device (S1410).
The video data displayed on the first screen, i.e., the screen of the mobile device, is received from the mobile device (S1420).
Upon receiving the first specific signal from the mobile device (S1430), a part of the received video data corresponding to the first specific signal is displayed on the second screen, which is the screen of the digital device (S1440).
Upon receipt of the second specific signal from the mobile device (S1450), all of the video data corresponding to the second specific signal is displayed on the second screen.
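The flow of steps S1410 onward can be sketched as follows, with a nested list standing in for a video frame; the signal names and region format are assumptions for illustration:

```python
def handle_mirroring(signal, video_frame, region):
    """Illustrative sketch of the FIG. 14 flow: display part of the received
    frame for the first specific signal, and the whole frame for the second."""
    if signal == "FIRST_SPECIFIC":      # e.g. part of the mobile screen is obscured
        x, y, w, h = region             # region of the frame to keep
        return [row[x:x + w] for row in video_frame[y:y + h]]
    if signal == "SECOND_SPECIFIC":     # e.g. the whole screen is shared again
        return video_frame
    return None                         # unrecognized signal: display nothing new

frame = [[c for c in range(8)] for _ in range(4)]   # toy 8x4 "frame"
part = handle_mirroring("FIRST_SPECIFIC", frame, (0, 0, 4, 2))
whole = handle_mirroring("SECOND_SPECIFIC", frame, None)
```

The communication setup (S1410) and frame reception (S1420) are assumed to have happened before this handler runs.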
FIG. 15 is a view illustrating screen mirroring of the remaining area, excluding the obscured area, when the screen is partially covered with the palm of the hand, according to an embodiment of the present invention.
Here, the first specific signal corresponds to a signal that the
Referring to an embodiment 1510, a first screen, which is a screen of the
The
For example, when the
Referring to embodiment 1520, corresponding to the input received from the user, the
Upon receiving the first
According to the present invention, when a user hides a part of the screen of the mobile device with a hand, the other parts, excluding the hidden area, can be screen-mirrored and displayed on the digital device. An area not desired to be shared can thus be set simply, which improves user convenience.
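One plausible way the mobile device could derive the remaining area to mirror, given a rectangle reported as covered by the hand, is to pick the largest uncovered rectangle around it; this geometric rule is an assumption for this sketch, not the patented method itself:

```python
def remaining_region(screen_w, screen_h, covered):
    """Return the largest rectangle of the screen not overlapping the covered
    rectangle (x, y, w, h). Illustrative simplification: only the four
    axis-aligned rectangles around the covered area are considered."""
    cx, cy, cw, ch = covered
    candidates = [
        (0, 0, screen_w, cy),                        # above the covered area
        (0, cy + ch, screen_w, screen_h - cy - ch),  # below
        (0, 0, cx, screen_h),                        # left of it
        (cx + cw, 0, screen_w - cx - cw, screen_h),  # right of it
    ]
    return max(candidates, key=lambda r: r[2] * r[3])

# A hand covering a band at the bottom of a 1080x1920 portrait screen:
shared = remaining_region(1080, 1920, (0, 1400, 1080, 520))
```

Only the `shared` rectangle would then be encoded and sent to the digital device.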
FIG. 16 is a diagram illustrating setting of a screen mirroring area with a screen division indicator according to an embodiment of the present invention.
When the
Referring to an
Here, the
The
Specifically, when receiving an input from the user for setting the
Specifically, when the
Referring to
Upon receiving the third
The video data 10-1 corresponding to the
Referring to
Upon receiving the fourth
According to the present invention, if a first area desired to be shared on the screen and a second area not desired to be shared are set with the screen division indicator, the first area can be screen-mirrored and displayed on the digital device, and the user can easily set the region desired to be shared and the region not desired to be shared.
Further, since the first area and the second area can be switched and displayed on the digital device, user convenience is improved.
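A minimal sketch of the screen-division indicator behavior, assuming a horizontal indicator and a switch flag standing in for the signal that swaps the two areas; both are assumptions for illustration:

```python
def split_regions(screen_h, indicator_y):
    """The screen-division indicator at indicator_y splits the first screen
    into a shared first area (above) and an unshared second area (below)."""
    first = (0, indicator_y)          # (top, bottom) of the shared area
    second = (indicator_y, screen_h)  # (top, bottom) of the unshared area
    return first, second

def mirrored_region(screen_h, indicator_y, switched=False):
    # A further signal can switch which side of the indicator is mirrored.
    first, second = split_regions(screen_h, indicator_y)
    return second if switched else first

normal = mirrored_region(1920, 800)
swapped = mirrored_region(1920, 800, switched=True)
```

Only the returned band of each frame would be sent to the digital device.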
FIG. 17 is a diagram illustrating screen size adjustment when a volume button of a smartphone is pressed, according to an embodiment of the present invention.
When the controller receives the fourth specific signal from the mobile device through the wireless communication module in the mirroring state, the controller enlarges the size of the screen in response to the fourth specific signal.
When the
Referring to
Referring to
For example, upon receiving an input from the user touching the
Referring to
For example, upon receiving an input from the user touching the
According to the present invention, a first area desired to be shared on the screen is set, the size of the first area is adjusted using the volume button of the mobile device, and the adjusted first area can be displayed on the digital device. Since the size of the shared area can be adjusted, user convenience is improved.
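The volume-button resizing could be sketched as a simple scale factor applied per button press; the 10% step is an assumption for illustration:

```python
def adjust_mirrored_size(width, height, button, step=0.1):
    """Illustrative: a volume-up press enlarges the mirrored area shown on the
    digital device; a volume-down press reduces it."""
    factor = 1 + step if button == "volume_up" else 1 - step
    return round(width * factor), round(height * factor)

bigger = adjust_mirrored_size(1000, 500, "volume_up")
smaller = adjust_mirrored_size(1000, 500, "volume_down")
```

In practice the digital device would clamp the result to its own screen bounds.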
FIG. 18 is a diagram illustrating black-screen processing of the portion of the screen area not occupied by the mirrored smartphone screen in a mirroring state, according to an exemplary embodiment of the present invention.
In the mirroring state, the controller displays a part or the whole of the received video data on the second screen, and displays a black screen in the area of the second screen not related to the video data.
Referring to FIG. 18, first, the
The
The
The controller displays a black screen in the
According to the present invention, a black screen is displayed in an area irrelevant to video data received from the mobile device, so that the user can concentrate on the content, thereby improving user convenience.
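Centering the mirrored video and filling the rest with black can be sketched as a small geometry computation; the coordinates and pixel units here are illustrative:

```python
def compose_with_black(tv_w, tv_h, video_w, video_h):
    """Illustrative: center the mirrored video on the TV screen and report how
    much of the screen remains to be filled with a black screen."""
    x = (tv_w - video_w) // 2
    y = (tv_h - video_h) // 2
    video_rect = (x, y, video_w, video_h)          # where the video is drawn
    black_area = tv_w * tv_h - video_w * video_h   # pixels painted black
    return video_rect, black_area

# Portrait phone video (606x1080) centered on a 1920x1080 TV:
rect, black = compose_with_black(1920, 1080, 606, 1080)
```

The black fill covers exactly the pixels unrelated to the received video data, matching the behavior described above.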
FIG. 19 is a diagram illustrating automatic adjustment and display of video data when a screen of a mobile device according to an exemplary embodiment of the present invention is a specific content screen and a vertical screen.
In the mirroring state, the controller automatically adjusts the video data and displays the adjusted video data on the second screen when the video data is a specific content and a vertical screen.
Referring to FIG. 19, first, the
The
The
The controller automatically adjusts the video data 10-1 and displays the adjusted video data 10-2 on the screen in a full screen when the video data 10-1 is a specific content and is a vertical screen.
Here, the specific content may be a movie, a YouTube video, a drama, sports, a documentary, and the like.
According to the present invention, whereas general mirroring simply displays the screen of the mobile device on the digital device as-is, when the video data is the specific content it can be adjusted and displayed in full screen. Since the screen can be enlarged and viewed on the entire screen, user convenience is improved.
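One plausible reading of the automatic adjustment (scaling specific vertical content to fill the full screen width while cropping the overflow, and keeping a fit-inside scale otherwise) can be sketched as follows; the fill-by-width rule is an assumption, since the specification does not fix the exact scaling policy:

```python
def auto_adjust(video_w, video_h, tv_w, tv_h, is_specific_content):
    """Illustrative sketch: specific content on a vertical screen is scaled to
    fill the full TV width (overflow cropped); other content fits inside."""
    vertical = video_h > video_w
    if is_specific_content and vertical:
        scale = tv_w / video_w                       # fill the screen width
    else:
        scale = min(tv_w / video_w, tv_h / video_h)  # plain fit-inside mirroring
    return round(video_w * scale), round(video_h * scale)

full = auto_adjust(1080, 1920, 1920, 1080, is_specific_content=True)
fit = auto_adjust(1080, 1920, 1920, 1080, is_specific_content=False)
```

The `full` result exceeds the TV height, i.e. the device would crop vertically while showing the content across the entire screen width.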
FIG. 20 is a diagram illustrating display of video data based on resolution information of a mobile device according to an embodiment of the present invention.
In the mirroring state, the
Referring to FIG. 20, the
Here, the supported
The
Here, the
The
That is, when the
For example, first, when the data transmission rate between the
Secondly, if the data transmission rate between the
The
According to the present invention, when the resolution information between the mobile device and the digital device is different, the mobile device determines the resolution information and transmits the resolution information and the video data information to the digital device. In addition, since the mobile device can determine the resolution information in consideration of the communication state with the digital device and provide a better viewing environment to the user, user convenience is improved.
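The rate-dependent resolution decision can be sketched as a simple ladder; the 20 Mbps and 10 Mbps thresholds are assumptions for illustration, not values from the specification:

```python
def pick_resolution(rate_mbps, supported):
    """Illustrative: the mobile device picks the transmit resolution based on
    the measured data transmission rate to the digital device and on the
    resolutions the digital device reports as supported."""
    ladder = [(20, "FHD"), (10, "HD"), (0, "SD")]  # assumed thresholds
    for threshold, res in ladder:
        if rate_mbps >= threshold and res in supported:
            return res
    return "SD"  # conservative fallback

choice = pick_resolution(25, {"SD", "HD", "FHD"})
```

The mobile device would then transmit this resolution information along with the video data, as described above.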
When the resolution information of the video data is different from the resolution information of the digital device in the mirroring state, the
Referring to FIG. 20, a case where the supported resolution of the digital device is FHD (1920 x 1080) and the resolution of the video data of the mobile device is SD (720 x 480) will be described as an example.
In the related art, mirroring could be performed only when the resolution of the digital device was equal to the resolution of the video data; when the two resolutions differed, mirroring could not be performed.
The
The
The
Next, a description will be given of a case where the supported resolution of the digital device is SD (720 x 480) and the resolution of the video data is FHD (1920 x 1080). That is, the supported resolution of the digital device is lower than the resolution of the video data.
The
The
According to the present invention, even if the resolution of the video data is lower than the resolution supported by the digital device, the video data can be scaled and displayed, so that mirroring is possible even when the resolution of the video data differs from the resolution supported by the digital device, thereby improving user convenience.
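The up- and down-scaling between the SD and FHD cases described above can be sketched as an aspect-preserving fit; the resolution table covers only the two cases discussed:

```python
RESOLUTIONS = {"SD": (720, 480), "FHD": (1920, 1080)}

def scale_to_device(video_res, device_res):
    """Illustrative: scale the received video up or down to the digital
    device's supported resolution, preserving the aspect ratio."""
    vw, vh = RESOLUTIONS[video_res]
    dw, dh = RESOLUTIONS[device_res]
    s = min(dw / vw, dh / vh)  # largest scale that still fits the device
    return round(vw * s), round(vh * s)

upscaled = scale_to_device("SD", "FHD")    # SD video on an FHD device
downscaled = scale_to_device("FHD", "SD")  # FHD video on an SD device
```

Either direction yields a frame the device can display, which is exactly what the related art could not do when the resolutions differed.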
FIG. 21 is a diagram illustrating a mobile device in association with a digital device according to an embodiment of the present invention.
The
The
The
The
The
The
The
FIG. 22 is a diagram illustrating a control method of a mobile device in conjunction with a digital device according to an embodiment of the present invention. The method is performed by the controller 180.
First, the controller 180 communicates with the digital device (S2210).
The video data displayed on the first screen, which is the screen of the mobile device, is transmitted to the digital device (S2220).
A first specific signal for displaying a part of the video data on the second screen, which is the screen of the digital device, is transmitted to the digital device (S2230).
A second specific signal for displaying all of the video data on the second screen is transmitted to the digital device 1300 (S2240).
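The four steps S2210 through S2240 can be summarized as a message sequence on the mobile side. The `Signal` names and the in-memory transport below are hypothetical stand-ins; a real implementation would ride on a wireless link such as Wi-Fi Direct.

```python
from enum import Enum, auto

class Signal(Enum):
    CONNECT = auto()      # S2210: establish the session with the digital device
    VIDEO_DATA = auto()   # S2220: video data of the first (mobile) screen
    SHOW_PART = auto()    # S2230: "first specific signal" - display a part
    SHOW_ALL = auto()     # S2240: "second specific signal" - display all

class MirrorSession:
    """Hypothetical mobile-side controller; the transport is abstracted
    as a list of sent (signal, payload) messages."""
    def __init__(self):
        self.sent = []

    def send(self, sig, payload=None):
        self.sent.append((sig, payload))

session = MirrorSession()
session.send(Signal.CONNECT)
session.send(Signal.VIDEO_DATA, b"frame-bytes")
session.send(Signal.SHOW_PART, {"x": 0, "y": 0, "w": 360, "h": 640})
session.send(Signal.SHOW_ALL)
```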
FIG. 23 is a diagram illustrating a mobile device according to an embodiment of the present invention transmitting video data and a black screen generation signal to a digital device.
When transmitting the video data to the digital device, the controller 180 simultaneously transmits a black screen generation signal through the wireless communication unit.
According to the present invention, a black screen is displayed in an area unrelated to the video data received from the mobile device, so that the user can concentrate on the content, thereby improving user convenience.
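Assuming the scaled video occupies a centered rectangle on the second screen, the black region is simply the set of rectangles outside it. A sketch of that geometry (illustrative, not taken from the patent):

```python
def black_bars(dst_w, dst_h, x, y, w, h):
    """Return the rectangles (x, y, w, h) outside the video area at
    (x, y, w, h) on a dst_w x dst_h screen, to be filled black so that
    content unrelated to the video data is never shown."""
    bars = []
    if y > 0:
        bars.append((0, 0, dst_w, y))                    # top bar
    if y + h < dst_h:
        bars.append((0, y + h, dst_w, dst_h - (y + h)))  # bottom bar
    if x > 0:
        bars.append((0, y, x, h))                        # left bar
    if x + w < dst_w:
        bars.append((x + w, y, dst_w - (x + w), h))      # right bar
    return bars
```

For SD video letterboxed to 1620 x 1080 at offset (150, 0) on an FHD screen, this yields two 150-pixel side bars; together they cover exactly the screen area not occupied by the video.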
FIG. 24 is a diagram illustrating setting a shared screen area with a specific gesture of a user according to an embodiment of the present invention.
The controller receives the specific gesture input from the user through the input unit, and transmits the video data corresponding to the specific gesture input to the digital device through the wireless communication unit.
According to one embodiment of the present invention, when a pinch-out or pinch-in gesture input is received at the shared screen area, the area size of the video data is enlarged or reduced accordingly.
According to the present invention, a first area desired to be shared on the screen and a second area not desired to be shared can be set with a simple touch input, and the first area can be screen-mirrored and displayed on the display device. In addition, since the area size of the video data can be easily adjusted with pinch-out, pinch-in, and other simple gestures, user convenience is improved.
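A pinch gesture can be mapped to a scale factor applied to the shared area about its center, clamped to the screen bounds. The function below is a hypothetical sketch of that adjustment, not the patent's specified behavior.

```python
def resize_share_area(rect, factor, screen_w, screen_h):
    """Scale rect = (x, y, w, h) by `factor` (>1 for pinch-out, <1 for
    pinch-in) about its center, keeping it on the screen."""
    x, y, w, h = rect
    cx, cy = x + w / 2, y + h / 2
    new_w = min(screen_w, max(1, round(w * factor)))
    new_h = min(screen_h, max(1, round(h * factor)))
    # Re-center, then clamp so the area stays fully on screen.
    new_x = min(max(0, round(cx - new_w / 2)), screen_w - new_w)
    new_y = min(max(0, round(cy - new_h / 2)), screen_h - new_h)
    return new_x, new_y, new_w, new_h
```

For example, a pinch-out with factor 1.5 grows a 200 x 100 area centered at (200, 150) to 300 x 150, while a pinch-in with factor 0.5 shrinks it to 100 x 50.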
According to an embodiment of the present invention, when a mobile device and a digital device are communicatively connected and the digital device receives a specific signal from the mobile device, only a part of the mobile device screen is displayed. Since a screen area that the user does not want to share is not displayed, user convenience is improved.
According to another embodiment of the present invention, when a portion of the mobile device screen is covered with the user's hand, the remaining screen, excluding the area hidden by the hand, is displayed. The user can therefore easily set a screen area that he or she does not want to share, improving user convenience.
According to another embodiment of the present invention, a screen area desired to be shared by the user is reset using a screen division indicator of the mobile device, and the reset screen area is displayed. Accordingly, the user can adjust a first screen area to be shared and a second screen area not to be shared, thereby improving user convenience.
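One way to model the screen division indicator is as a horizontal divider: the shared area is the sub-rectangle on one side of it. This is an assumed, minimal model of the indicator, not the patent's definition.

```python
def split_by_indicator(screen_w, screen_h, indicator_y, share_top=True):
    """Return the shared sub-rectangle (x, y, w, h) when a horizontal
    divider is placed at row `indicator_y` of a screen_w x screen_h screen."""
    if share_top:
        return (0, 0, screen_w, indicator_y)
    return (0, indicator_y, screen_w, screen_h - indicator_y)
```

Dragging the indicator up or down simply changes `indicator_y`, which resets the shared area without re-selecting it from scratch.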
The digital device and the method of processing a service or an application in the digital device equipped with the webOS disclosed in this specification are not limited to the configurations and methods of the embodiments described above; the embodiments can be variously modified, and all or some of the embodiments may be selectively combined.
Meanwhile, the operation method of the digital device disclosed in this specification can be implemented as processor-readable code on a recording medium readable by a processor included in the digital device. The processor-readable recording medium includes all kinds of recording devices in which data that can be read by the processor is stored. Examples include ROM (read-only memory), RAM (random access memory), CD-ROM, magnetic tape, floppy disks, and optical data storage devices; it may also be implemented in the form of a carrier wave. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims, and such modifications are not to be understood separately from the technical idea of the present invention.
1300: Digital device
1301: Remote controller
1310: Tuner
1320: Interface module
1330: IR signal receiving module
1340: Controller
1350: Display module
1360: Speaker
1370: Memory
1380: Wireless communication module
Claims (16)
A digital device comprising:
A wireless communication module for transmitting and receiving data to and from a mobile device;
A controller configured to receive, from the mobile device via the wireless communication module, video data displayed on a first screen which is a screen of the mobile device, to display a part of the received video data on a second screen which is a screen of the digital device when a first specific signal is received from the mobile device via the wireless communication module, and to display all of the received video data on the second screen when a second specific signal is received from the mobile device via the wireless communication module; and
A display module for displaying all or a part of the video data on the second screen in accordance with a control command from the controller.
Wherein the first specific signal corresponds to a signal transmitted from the mobile device to the digital device when the mobile device receives, from the user, an input for manually masking a part of the first screen.
Wherein the controller adjusts the size of the area of the video data corresponding to a third specific signal when the third specific signal is received from the mobile device via the wireless communication module.
Wherein the controller enlarges the size of the area of the video data corresponding to a fourth specific signal when the fourth specific signal is received from the mobile device via the wireless communication module in a mirroring state.
Wherein the controller reduces the size of the area of the video data corresponding to a fifth specific signal when the fifth specific signal is received from the mobile device via the wireless communication module in the mirroring state.
Wherein the controller displays a part or all of the received video data in the mirroring state on the second screen and displays a black screen in an area of the second screen not related to the video data.
Wherein, in the mirroring state, when the video data is a specific content and a vertical screen, the controller automatically adjusts the video data and displays the adjusted video data on the second screen.
Wherein the controller transmits resolution information of the digital device to the mobile device in the mirroring state, receives resolution information from the mobile device, and displays the video data on the screen based on the received resolution information.
Wherein, when the resolution information of the video data is different from the resolution information of the digital device in the mirroring state, the controller scales the resolution of the video data based on the supported resolution information and displays the scaled video data on the second screen.
A method of controlling a digital device, the method comprising:
Communicating with a mobile device;
Receiving, from the mobile device, video data to be displayed on a first screen which is a screen of the mobile device;
Displaying a part of the received video data, corresponding to a first specific signal, on a second screen of the digital device when the first specific signal is received from the mobile device; and
Displaying all of the video data on the second screen, corresponding to a second specific signal, when the second specific signal is received from the mobile device.
A mobile device comprising:
A wireless communication unit for transmitting and receiving data to and from a digital device;
An input unit for receiving a touch input from a user;
A controller configured to transmit video data displayed on a first screen, which is a screen of the mobile device, to the digital device through the wireless communication unit, to transmit to the digital device through the wireless communication unit a first specific signal for displaying a part of the video data on a second screen, which is a screen of the digital device, and to transmit to the digital device through the wireless communication unit a second specific signal for displaying the entirety of the video data on the second screen; and
A display unit for displaying the video data on the first screen in accordance with a control command from the controller.
Wherein the controller transmits the first specific signal to the digital device when receiving, from the user, an input to manually cover a portion of the first screen.
Wherein the controller sets the area of the video data to be transmitted using a screen division indicator.
Wherein the controller, when transmitting the video data to the digital device, simultaneously transmits a black screen generation signal through the wireless communication unit.
Wherein the controller transmits the video data corresponding to a specific gesture input to the digital device through the wireless communication unit when the specific gesture input is received from the user through the input unit.
A method of controlling a mobile device, the method comprising:
Communicating with the digital device;
Transmitting video data to be displayed on a first screen, which is a screen of the mobile device, to the digital device;
Transmitting, to the digital device, a first specific signal for displaying a part of the video data on a second screen which is a screen of the digital device; and
Transmitting, to the digital device, a second specific signal for displaying all of the video data on the second screen.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160010551A KR20170090102A (en) | 2016-01-28 | 2016-01-28 | Digital device and method for controlling the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170090102A true KR20170090102A (en) | 2017-08-07 |
Family
ID=59653925
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160010551A KR20170090102A (en) | 2016-01-28 | 2016-01-28 | Digital device and method for controlling the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170090102A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107515717A (en) * | 2017-09-05 | 2017-12-26 | 三星电子(中国)研发中心 | A kind of exchange method of information, system and device |
CN107515717B (en) * | 2017-09-05 | 2021-07-09 | 三星电子(中国)研发中心 | Information interaction method, system and device |
US11086500B2 (en) | 2018-11-29 | 2021-08-10 | Samsung Electronics Co., Ltd. | Foldable electronic device and method for displaying information in foldable electronic device |
WO2020111346A1 (en) * | 2018-11-30 | 2020-06-04 | Samsung Electronics Co., Ltd. | Interaction method, system, and device for information |
US11281334B2 (en) | 2018-11-30 | 2022-03-22 | Samsung Electronics Co., Ltd. | Interaction method, system, and device for information |
US11893303B2 (en) | 2021-11-30 | 2024-02-06 | Samsung Electronics Co., Ltd. | Device and method for performing mirroring |