KR20170011763A - Digital device and method of processing data the same - Google Patents


Info

Publication number
KR20170011763A
Authority
KR
South Korea
Prior art keywords
data
mobile terminal
call application
call
application data
Prior art date
Application number
KR1020150104984A
Other languages
Korean (ko)
Inventor
최우진
장헤라
김영근
김창회
김창훈
Original Assignee
엘지전자 주식회사
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020150104984A priority Critical patent/KR20170011763A/en
Publication of KR20170011763A publication Critical patent/KR20170011763A/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: Transmission of digital information, e.g. telegraphic communication
    • H04L 29/00: Arrangements, apparatus, circuits or systems, not covered by a single one of groups H04L 1/00 - H04L 27/00
    • H04L 29/02: Communication control; communication processing
    • H04L 29/06: Communication control; communication processing characterised by a protocol
    • H04L 29/08: Transmission control procedure, e.g. data link level control procedure
    • H04L 67/00: Network-specific arrangements or communication protocols supporting networked applications
    • H04L 67/10: Arrangements in which an application is distributed across nodes in the network
    • H04L 67/1095: Arrangements for supporting replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes or user terminals or SyncML
    • H04L 67/36: Arrangements involving the display of network or application conditions affecting the network application to the application user
    • H04L 69/00: Application independent communication protocol aspects or techniques in packet data networks
    • H04L 69/28: Timer mechanisms used in protocols
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/14: Systems for two-way working
    • H04N 7/141: Systems for two-way working between two video terminals, e.g. videophone

Abstract

The present invention discloses a digital device and a data processing method for the digital device. A digital device for receiving and processing data of a mobile terminal according to an embodiment of the present invention includes: a receiving unit which receives mirroring data from the mobile terminal according to a first communication protocol, guide data in response to detection of an incoming call at the mobile terminal, first call application data related to a caller according to the first communication protocol, and second call application data related to the caller according to an activated second communication protocol; a control unit which activates the second communication protocol with the mobile terminal for a call service in response to receipt of the guide data, controls each of the first and second call application data to be output after the activation, and controls third call application data related to a called party, received through an input means connected to the digital device, to be transmitted to the mobile terminal according to the second communication protocol; a screen which outputs the mirroring data, the guide data, and the first call application data; and a speaker which outputs the second call application data.

Description

TECHNICAL FIELD [0001] The present invention relates to a digital device and a method for processing data in the digital device.

BACKGROUND OF THE INVENTION The present invention relates to a digital device, and more particularly, to a digital device for processing and outputting data received from an external device.

Mobile devices such as smart phones, tablet PCs and wearable devices, in addition to standing devices such as personal computers (PCs) and televisions (TVs), are attracting attention. Fixed devices and mobile devices have conventionally been developed in their own separate areas. Recently, however, the boundary between those areas has become blurred with the boom of digital convergence, and various services are being provided through data communication between the devices.

For example, a user of a mobile device employing a relatively small display can enjoy that device's content through mirroring on a fixed device employing a relatively large display.

Conventionally, however, there are restrictions or inconveniences in using a specific application or function of the mobile device during data communication using the Miracast method; Miracast merely displays the content through a relatively large display, and is therefore inconvenient in the use process from the user's point of view.

In order to solve the above inconvenience, the present invention discloses a digital device and a method for processing data in the digital device.

An object of the present invention is to process and output an application of an external device in a digital device through data communication.

Another object of the present invention is to provide a processing method for the case where a call application is executed while content of an external device is being used through a data sharing method in a digital device.

Still another object of the present invention is to provide, in a digital device, guide data and execution data regarding a call application of an external device, and to provide the execution data at the same or similar quality as on the external device.

The technical problems to be solved by the present invention are not limited to the above-described technical problems, and other technical problems not mentioned can be clearly understood by those skilled in the art from the following description.

This document discloses various embodiment(s) of a digital device and of data processing methods in the digital device.

A method of processing data of a mobile terminal in a digital device according to an embodiment of the present invention includes: pairing with the mobile terminal; receiving mirroring data from the mobile terminal according to a first communication protocol; outputting the received mirroring data on a screen; receiving guide data according to detection of an incoming call at the mobile terminal; activating a second communication protocol for a call service; receiving first call application data related to a caller from the mobile terminal according to the first communication protocol, and receiving and outputting second call application data related to the caller according to the activated second communication protocol; and transmitting third call application data related to a called party, received through an input means connected to the digital device, to the mobile terminal according to the second communication protocol.
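The claimed sequence above can be sketched as a small state model. This is an illustrative sketch only, not code from the patent; the class name and the protocol labels ("first" for the mirroring link, e.g. Wi-Fi Direct/Miracast, and "second" for the call link, e.g. Bluetooth) are assumptions made for the example.

```python
class DigitalDeviceSession:
    """Models the claimed sequence: pair, mirror, then handle an incoming call."""

    def __init__(self):
        self.paired = False
        self.active_protocols = set()
        self.outputs = []  # (sink, data) pairs: sink is "screen" or "speaker"

    def pair(self):
        # Pairing establishes the first communication protocol (mirroring link).
        self.paired = True
        self.active_protocols.add("first")

    def receive_mirroring_data(self, frame):
        # Mirroring data arrives over the first protocol and goes to the screen.
        if "first" in self.active_protocols:
            self.outputs.append(("screen", frame))

    def on_incoming_call(self, guide_data):
        # Guide data is shown, and the second protocol is activated for the call.
        self.outputs.append(("screen", guide_data))
        self.active_protocols.add("second")

    def receive_call_data(self, first_data, second_data):
        # First call application data (e.g. caller image) -> screen,
        # second call application data (e.g. caller voice) -> speaker.
        self.outputs.append(("screen", first_data))
        self.outputs.append(("speaker", second_data))

    def send_callee_data(self, third_data):
        # Third call application data captured at the digital device is
        # returned to the mobile terminal over the second protocol.
        if "second" not in self.active_protocols:
            raise RuntimeError("call protocol not activated")
        return ("second", third_data)
```

In this sketch the control flow, not the transport, is the point: the second protocol only becomes usable once guide data has announced the incoming call.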

A digital device for receiving and processing data of a mobile terminal according to an embodiment of the present invention may include: a receiver configured to receive mirroring data according to a first communication protocol, guide data in response to detection of an incoming call at the mobile terminal, first call application data related to a caller according to the first communication protocol, and second call application data related to the caller according to an activated second communication protocol; a controller configured to activate the second communication protocol with the mobile terminal for a call service according to reception of the guide data, to control the first and second call application data related to the caller to be output after the activation, and to control third call application data related to a called party, received through an input means connected to the digital device, to be transmitted to the mobile terminal according to the second communication protocol; a screen for outputting the mirroring data, the guide data, and the first call application data related to the caller; and a speaker for outputting the second call application data related to the caller.

The technical solutions obtained by the present invention are not limited to the above-mentioned solutions, and other solutions not mentioned can be clearly understood by those skilled in the art from the following description.

The effects of the present invention are as follows.

According to one of the various embodiments of the present invention, an application of an external device can be processed and output in a digital device through data communication.

According to another of the various embodiments of the present invention, a processing method can be provided for the case where a call application is executed while content of an external device is being used through a data sharing method or the like in a digital device.

According to still another of the various embodiments of the present invention, guide data, execution data and the like regarding a call application of an external device can be provided in a digital device, and the execution data can be provided at the same or similar quality as on the external device.

The effects obtained by the present invention are not limited to the above-mentioned effects, and other effects not mentioned can be clearly understood by those skilled in the art from the following description.

FIG. 1 schematically illustrates a service system according to an embodiment of the present invention;
FIG. 2 is a block diagram illustrating a digital device according to an embodiment of the present invention;
FIG. 3 is a block diagram of another configuration or a detailed configuration of FIG. 2;
FIG. 4 is a block diagram illustrating an external device according to an embodiment of the present invention;
FIG. 5 is a block diagram illustrating a digital device or an external device according to another embodiment of the present invention;
FIG. 6 illustrates control means for digital device control according to an embodiment of the present invention;
FIG. 7 illustrates a digital TV and a mobile terminal that transmits mirroring data to the digital TV according to the Miracast method;
FIGS. 8 to 10 are diagrams illustrating a data processing process when there is an incoming call to the mobile terminal while the digital TV is processing mirroring data of the mobile terminal, according to an embodiment of the present invention;
FIG. 11 is a diagram illustrating a method of processing call application data in a digital TV connected to a mobile terminal according to an embodiment of the present invention;
FIG. 12 is a diagram for explaining a method of processing mirroring data and call application data according to an embodiment of the present invention;
FIG. 13 is a flowchart illustrating a method of processing call application data of a mobile terminal using a digital TV and a remote controller according to an embodiment of the present invention;
FIG. 14 is a diagram for explaining a solution to a lip-sync problem occurring in the process of processing call application data through a digital TV according to an embodiment of the present invention;
FIG. 15 is a flowchart of a lip-sync processing method in a digital TV according to an embodiment of the present invention;
FIG. 16 is a diagram for explaining the use of an audio engine for improving call quality in providing a call service through a digital TV according to an embodiment of the present invention;
FIG. 17 illustrates an embodiment of guide data for a noise reduction method in processing call application data through a digital TV according to the present invention; and
FIG. 18 is a diagram for explaining a process of acquiring image/video data of a called party according to the present invention.

Hereinafter, various embodiment(s) of a digital device and of a data processing method in the digital device according to the present invention will be described with reference to the drawings.

The suffixes "module" and "part" for components used in this specification are given only for ease of description, and the two may be used interchangeably as needed. Also, even when components are described with ordinal numbers such as "first" and "second", they are not limited by such terms or ordinal numbers. The terms used in this specification have been selected, as far as possible, from general terms widely used at present in consideration of the functions according to the technical idea of the present invention, but they may vary according to the intentions or customs of those skilled in the art or the emergence of new technology. In certain cases, some terms have been arbitrarily selected by the applicant, and in such cases their meanings are described in the relevant part of the description. Accordingly, a term should be interpreted not simply by its name but on the basis of its actual meaning and the contents described throughout this specification. It is to be noted that the contents of this specification and/or the drawings are not intended to limit the scope of the present invention.

In this specification, according to the present invention, an application of an external device is processed and output in a digital device through data communication. In particular, this specification describes in detail the processing performed when a call application is executed while content of an external device is being used through a data sharing method or the like in the digital device according to the present invention. Here, the call application includes, for example, an application for a video call as well as an application for a voice call. In addition, this specification describes in detail how guide data and execution data regarding the call application of the external device are provided in the digital device according to the present invention, and the quality of that execution data.

A "digital device" as used herein includes, for example, any device that performs at least one of transmitting/receiving, processing, and outputting data with an external device. Here, the data includes, for example, content, services and applications. The digital device may be connected to an external device or the like through a wired/wireless network to transmit/receive the data, and the data may be converted appropriately as needed in the transmission/reception process. Examples of the digital device include standing devices such as a network TV, a Hybrid Broadcast Broadband TV (HbbTV), a smart TV, an IPTV (Internet Protocol TV) and a PC (Personal Computer); mobile or handheld devices such as a PDA (Personal Digital Assistant), a smart phone, a tablet PC, a notebook, a digital broadcasting terminal, a PMP (Portable Multimedia Player), a navigation device, a slate PC and an ultrabook; and wearable devices such as a smart watch, smart glasses and a head mounted display (HMD). Hereinafter, to facilitate understanding of the present invention and for convenience of explanation, the digital device will be described with reference to a digital TV as shown in FIGS. Meanwhile, the digital device may be, for example, a wearable device (e.g., a smart watch) as shown in FIG. Meanwhile, the digital device may also be a signage device having only a display panel, or a set including a set-top box (STB).

The "external device" described herein may be any of the mobile devices described above. Hereinafter, to facilitate understanding of the present invention and for convenience of explanation, the external device will be described with reference to a smart phone among mobile devices, as shown in FIG. 4, and will be referred to as a mobile terminal. Meanwhile, the wearable device (e.g., smart watch) of FIG. 5 may be a mobile terminal that performs the function or role of the external device instead of the smart phone.

Meanwhile, the wired/wireless network described in this specification includes any communication network supporting connection and data communication between a server and a client, that is, between a digital device and an external device. It refers to communication networks that are currently supported or will be supported in the future, and may support one or more communication protocols for such connections. Such a wired/wireless network includes, for the wired connection, standards such as USB (Universal Serial Bus), CVBS (Composite Video Banking Sync), component, S-video (analog), DVI (Digital Visual Interface), HDMI (High Definition Multimedia Interface), RGB and D-SUB, and the communication standards or protocols therefor; and, for the wireless connection, Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, DLNA (Digital Living Network Alliance), WLAN (Wireless LAN, Wi-Fi), Wi-Fi Direct, WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), LTE/LTE-A (Long Term Evolution/LTE-Advanced) and the like, and the communication standards or protocols therefor.

Meanwhile, a digital device is an intelligent device that supports, for example, a broadcast receiving function, a computer function, and at least one external input, and can support e-mail, web browsing, banking, games, applications, and the like. In addition, the digital device may include an interface for at least one input or control means (hereinafter, "control means") such as a handwriting input device, a touch screen, or a spatial remote control.

In addition, a digital device can use a standardized general-purpose operating system (OS) or a web OS, and various services or applications can be added, deleted, amended or updated on a general-purpose OS kernel or a Linux kernel, thereby providing a more user-friendly environment.

In this specification, data sharing means any technology related to sharing a screen and data between a plurality of devices, for example, between a digital device and an external device. Data sharing technologies include Miracast, mirroring, DLNA, Screen Share, and the like. Hereinafter, to facilitate understanding of the present invention and for convenience of explanation, the description is given using Miracast among the data sharing technologies. Here, Miracast refers to a wireless video transmission standard standardized by the Wi-Fi Alliance, a mirroring technology in which devices are connected directly to each other based on Wi-Fi Direct.

Also, even where only an application is mentioned herein for convenience, the meaning may include content, services and the like, such as a broadcast program. Meanwhile, the application also includes a web application according to the web OS platform.

FIG. 1 is a schematic diagram illustrating a service system according to an embodiment of the present invention.

Referring to FIG. 1, a service system may be implemented basically including a digital TV 110 and a mobile terminal 120. In the above, the digital TV 110 or the mobile terminal 120 may be replaced with the smart watch 130 shown. In addition, the service system may further include a server 105 for data communication between the plurality of devices. The operation of the digital TV 110 can be controlled by a control means such as a dedicated remote controller 115.

In brief, the mobile terminal 120 transmits the screen (data) being output on the mobile terminal 120 through a data sharing technique, that is, the Miracast method, and the digital TV 110 receives and outputs the screen (data), so that the same screen as that of the mobile terminal 120 can be provided on the digital TV 110.


FIG. 2 is a block diagram illustrating a digital device according to an embodiment of the present invention. The digital device of FIG. 2 corresponds to the client 100 of FIG. 1 described above.

The digital device 200 includes a network interface unit 201, a TCP/IP manager 202, a service delivery manager 203, an SI decoder 204, a demultiplexer (demux) 205, an audio decoder 206, a video decoder 207, a display unit (A/V and OSD module) 208, a service control manager 209, a service discovery manager 210, an SI & metadata database 211, a metadata manager 212, a service manager 213, a UI manager 214, and the like.

The network interface unit 201 transmits/receives IP packets or IP datagrams to/from an external network. For example, the network interface unit 201 can receive services, applications, content and the like from the service provider 20 of FIG. 1 through the network.

The TCP/IP manager 202 takes part in packet delivery between a source and a destination for the IP packets received by the digital device 200 and the IP packets transmitted by the digital device 200. The TCP/IP manager 202 classifies the received packet(s) and delivers them to the appropriate component, such as the service delivery manager 203, the service discovery manager 210, the service control manager 209, or the metadata manager 212.

The service delivery manager 203 is responsible for controlling the received service data. For example, the service delivery manager 203 may use RTP/RTCP when controlling real-time streaming data. When the real-time streaming data is transmitted using RTP, the service delivery manager 203 parses the received data packets according to RTP and transmits them to the demultiplexer 205, or stores them in the SI & metadata database 211 under the control of the service manager 213. Then, the service delivery manager 203 feeds back network reception information to the server providing the service using RTCP.
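Since the service delivery manager parses real-time streaming data according to RTP, the fixed RTP header layout defined in RFC 3550 is the relevant structure. The following is a hedged illustrative sketch of such a parser, not code from the patent:

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the fixed 12-byte RTP header defined in RFC 3550."""
    if len(packet) < 12:
        raise ValueError("packet shorter than the fixed RTP header")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    csrc_count = b0 & 0x0F
    return {
        "version": b0 >> 6,          # always 2 for RTP
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,   # e.g. 33 for an MPEG-2 transport stream
        "sequence": seq,             # used to detect loss and reordering
        "timestamp": ts,             # media clock value
        "ssrc": ssrc,                # synchronization source identifier
        "payload": packet[12 + 4 * csrc_count:],  # skip optional CSRC list
    }
```

The sequence numbers and timestamps recovered here are the kind of reception statistics that can be fed back to the streaming server via RTCP, as described above.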

The demultiplexer 205 demultiplexes the received packets into audio data, video data and SI (System Information) data, and transmits them to the audio decoder 206, the video decoder 207 and the SI decoder 204, respectively.
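The routing the demultiplexer performs can be illustrated on MPEG-2 transport stream packets, where a 13-bit PID sits in the second and third bytes of each 188-byte packet. This is a simplified illustrative sketch; the PID values in the test below are assumptions, not values from the text:

```python
def demultiplex(packets, audio_pid, video_pid, si_pids):
    """Route 188-byte MPEG-2 TS packets by PID, as the demultiplexer
    routes audio, video and SI data to their respective decoders."""
    streams = {"audio": [], "video": [], "si": []}
    for pkt in packets:
        if len(pkt) != 188 or pkt[0] != 0x47:  # sync byte must be 0x47
            continue                            # drop malformed packets
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]   # 13-bit packet identifier
        if pid == audio_pid:
            streams["audio"].append(pkt)
        elif pid == video_pid:
            streams["video"].append(pkt)
        elif pid in si_pids:
            streams["si"].append(pkt)
    return streams
```

In a real receiver the audio/video PIDs come from the decoded service information (set by the service manager, as described below), rather than being passed in directly.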

The SI decoder 204 decodes the demultiplexed SI data, that is, service information such as PSI (Program Specific Information), PSIP (Program and System Information Protocol), DVB-SI (Digital Video Broadcasting Service Information) and DTMB/CMMB (Digital Television Terrestrial Multimedia Broadcasting / Coding Mobile Multimedia Broadcasting) information. Also, the SI decoder 204 may store the decoded service information in the SI & metadata database 211. The stored service information can be read out and used by the corresponding component, for example, at a user's request.

The audio/video decoders 206/207 decode the demultiplexed audio data and video data, respectively. The decoded audio and video data are provided to the user through the display unit 208.

The application manager may include, for example, the UI manager 214 and the service manager 213, and may perform the functions of a controller of the digital device 200. In other words, the application manager can manage the overall state of the digital device 200, provide a user interface (UI), and manage the other managers.

The UI manager 214 provides a GUI (Graphic User Interface) / UI for a user using an OSD (On Screen Display) or the like, and receives a key input from a user to perform a device operation according to the input. For example, the UI manager 214 receives the key input regarding the channel selection from the user, and transmits the key input signal to the service manager 213.

The service manager 213 controls the manager associated with the service such as the service delivery manager 203, the service discovery manager 210, the service control manager 209, and the metadata manager 212.

In addition, the service manager 213 generates a channel map and controls the selection of a channel using the generated channel map according to a key input received from the UI manager 214. The service manager 213 receives the service information from the SI decoder 204 and sets the audio/video PID (Packet Identifier) of the selected channel in the demultiplexer 205. The PID thus set can be used in the demultiplexing process described above. Accordingly, the demultiplexer 205 performs PID or section filtering on the audio data, video data, and SI data using the PID.

The service discovery manager 210 provides information necessary for selecting a service provider that provides the service. Upon receiving a signal regarding channel selection from the service manager 213, the service discovery manager 210 searches for the service using the information.

The service control manager 209 is responsible for selection and control of services. For example, the service control manager 209 uses IGMP or RTSP when a user selects a live broadcasting service of the conventional broadcasting type, and uses RTSP to select and control a service such as VOD (Video on Demand). The RTSP protocol may provide a trick mode for real-time streaming. Also, the service control manager 209 can initialize and manage a session through the IMS gateway 250 using IMS (IP Multimedia Subsystem) and SIP (Session Initiation Protocol). These protocols are one embodiment, and other protocols may be used depending on the implementation.
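As an illustration of the trick mode mentioned above, RTSP (RFC 2326) requests non-normal playback speed through a Scale header on a PLAY request. This is a hedged sketch of building such a request; the URL and session identifier are made up for the example:

```python
def rtsp_play_request(url: str, session_id: str, cseq: int, scale: float = 1.0) -> str:
    """Build an RTSP PLAY request; a Scale header other than 1.0 requests
    trick-mode playback (e.g. 2.0 = double speed) per RFC 2326."""
    lines = [
        f"PLAY {url} RTSP/1.0",
        f"CSeq: {cseq}",
        f"Session: {session_id}",
    ]
    if scale != 1.0:
        lines.append(f"Scale: {scale}")
    lines.append("")  # blank line terminates the header block
    return "\r\n".join(lines) + "\r\n"
```

A VOD client would send such a request over the RTSP control connection after SETUP, which matches the role the service control manager plays here.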

The metadata manager 212 manages the metadata associated with the service and stores the metadata in the SI & metadata database 211.

The SI & metadata database 211 stores the service information decoded by the SI decoder 204, the metadata managed by the metadata manager 212, and the information necessary for selecting a service provider provided by the service discovery manager 210. In addition, the SI & metadata database 211 may store set-up data for the system and the like.

The SI & meta data database 211 may be implemented using a non-volatile RAM (NVRAM) or a flash memory.

Meanwhile, the IMS gateway 250 is a gateway incorporating the functions necessary for accessing IMS-based IPTV services.

FIG. 3 is a block diagram of another configuration or a detailed configuration of FIG. 2.

Referring to FIG. 3a, the digital device includes a broadcast receiving unit 305, an external device interface unit 316, a storage unit 318, a user input interface unit 320, a control unit 325, a display unit 330, an audio output unit 335, a power supply unit 340, and a photographing unit (not shown). The broadcast receiving unit 305 may include at least one tuner 310, a demodulating unit 312, and a network interface unit 314. In some cases, however, the broadcast receiving unit 305 may include the tuner 310 and the demodulating unit 312 but not the network interface unit 314, or vice versa. Although not shown, the broadcast receiving unit 305 may include a multiplexer to multiplex a signal tuned by the tuner 310 and demodulated by the demodulating unit 312 with a signal received through the network interface unit 314. In addition, although not shown, the broadcast receiving unit 305 may further include a demultiplexer to demultiplex the multiplexed signal, the demodulated signal, or the signal passed through the network interface unit 314.

The tuner 310 tunes to a channel selected by the user, or to all pre-stored channels, among the RF (Radio Frequency) broadcast signals received through an antenna, and receives the RF broadcast signal. The tuner 310 also converts the received RF broadcast signal into an intermediate frequency (IF) signal or a baseband signal. For example, if the received RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF); if it is an analog broadcast signal, it is converted into an analog baseband video or audio signal (CVBS/SIF). That is, the tuner 310 can process both digital and analog broadcast signals. The analog baseband video or audio signal (CVBS/SIF) output from the tuner 310 may be input directly to the control unit 325. In addition, the tuner 310 can receive single-carrier or multi-carrier RF broadcast signals. Meanwhile, the tuner 310 may sequentially tune to and receive the RF broadcast signals of all broadcast channels stored through a channel storage function among the RF broadcast signals received through the antenna, and convert them into intermediate frequency signals or baseband signals (DIF: Digital Intermediate Frequency or baseband signals).

The demodulating unit 312 may receive the digital IF signal (DIF) converted by the tuner 310, demodulate it, and perform channel decoding. For this, the demodulating unit 312 may include a trellis decoder, a de-interleaver, a Reed-Solomon decoder, a convolution decoder, and the like. The demodulating unit 312 can perform demodulation and channel decoding and then output a stream signal (TS). Here, the stream signal may be a signal in which a video signal, an audio signal or a data signal is multiplexed. For example, the stream signal may be an MPEG-2 TS (Transport Stream) in which a video signal of the MPEG-2 standard, an audio signal of the Dolby AC-3 standard and the like are multiplexed. The stream signal output from the demodulating unit 312 may be input to the control unit 325. The control unit 325 controls demultiplexing, video/audio signal processing and the like, and controls the output of video through the display unit 330 and the output of audio through the audio output unit 335.

The external device interface unit 316 provides an interface environment between the digital device and various external devices. To this end, the external device interface unit 316 may include an A/V input/output unit (not shown) or a wireless communication unit (not shown). The external device interface unit 316 may be connected by wire or wirelessly to an external device such as a DVD (Digital Versatile Disk) player, a Blu-ray player, a game device, a camera, a camcorder, a computer (notebook), a tablet PC, a smart phone, a Bluetooth device, a cloud server, or the like. The external device interface unit 316 transfers a signal including data such as an image, a video, and audio input through the connected external device to the control unit 325 of the digital device. The control unit 325 may control the processed image, video, audio, and the like to be output to the connected external device.

The A/V input/output unit may include a USB terminal, a CVBS (Composite Video Banking Sync) terminal, a component terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like.

The wireless communication unit can perform short-range wireless communication with another digital device. The digital device may be networked with other digital devices according to a communication protocol such as Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, or DLNA (Digital Living Network Alliance).

Also, the external device interface unit 316 may be connected to the set-top box (STB) through at least one of the various terminals described above to perform input / output operations with the set-top box (STB). Meanwhile, the external device interface unit 316 may receive an application or an application list in an adjacent external device, and may transmit the received application or application list to the control unit 325 or the storage unit 318.

The network interface unit 314 provides an interface for connecting the digital device to a wired/wireless network including the Internet. The network interface unit 314 may include an Ethernet terminal or the like for connection to a wired network, and may use communication standards such as WLAN (Wireless LAN, Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access) for connection to a wireless network. The network interface unit 314 can transmit or receive data to or from another user or another digital device via the connected network or another network linked to the connected network. In particular, some of the content data stored in the digital device can be transmitted to a selected user or a selected digital device among other users or other digital devices pre-registered with the digital device. Meanwhile, the network interface unit 314 can access a predetermined web page through the connected network or another network linked to the connected network. That is, it is possible to access a predetermined web page through a network and transmit or receive data to or from a corresponding server. In addition, content or data provided by a content provider or a network operator may be received. That is, it can receive content such as movies, advertisements, games, VOD, and broadcast signals, as well as related information, provided by a content provider or a network provider through a network. In addition, it can receive firmware update information and an update file provided by the network operator. It may also transmit data to the Internet, a content provider, or a network operator.

In addition, the network interface unit 314 can select and receive a desired application among applications open to the public through the network.

The storage unit 318 may store a program for signal processing and control in the control unit 325, or may store a signal-processed video, audio, or data signal. The storage unit 318 may also temporarily store a video, audio, or data signal input from the external device interface unit 316 or the network interface unit 314. The storage unit 318 can store information on a predetermined broadcast channel through the channel memory function. The storage unit 318 may store an application or an application list input from the external device interface unit 316 or the network interface unit 314. In addition, the storage unit 318 may store various platforms described later. The storage unit 318 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM (EEPROM, etc.). The digital device can reproduce a content file (a moving image file, a still image file, a music file, a document file, an application file, etc.) stored in the storage unit 318 and provide it to the user. FIG. 3A illustrates an embodiment in which the storage unit 318 is provided separately from the control unit 325, but the present invention is not limited thereto. In other words, the storage unit 318 may be included in the control unit 325.

The user input interface unit 320 transfers a signal input by the user to the control unit 325, or transfers a signal from the control unit 325 to the user. For example, the user input interface unit 320 may receive and process control signals for power on/off, channel selection, screen setting, and the like from the remote control device 345 according to various communication methods such as an RF communication method and an infrared (IR) communication method, or may process a control signal of the control unit 325 to be transmitted to the remote control device 345. The user input interface unit 320 may transmit a control signal input from a local key (not shown), such as a power key, a channel key, a volume key, or a setting key, to the control unit 325. The user input interface unit 320 may transmit a control signal input from a sensing unit (not shown) that senses a gesture of the user to the control unit 325, or may transmit a signal of the control unit 325 to the sensing unit (not shown). Here, the sensing unit (not shown) may include a touch sensor, a voice sensor, a position sensor, a motion sensor, and the like.

The control unit 325 demultiplexes a stream input through the tuner 310, the demodulator 312, or the external device interface unit 316, or processes the demultiplexed signals, to generate and output signals for video or audio output. The video signal processed by the control unit 325 may be input to the display unit 330 and displayed as an image corresponding to the video signal. The video signal processed by the control unit 325 may also be input to an external output device through the external device interface unit 316. The audio signal processed by the control unit 325 may be output as sound through the audio output unit 335. The audio signal processed by the control unit 325 may also be input to an external output device through the external device interface unit 316. Although not shown in FIG. 3A, the control unit 325 may include a demultiplexing unit, an image processing unit, and the like. The control unit 325 can control the overall operation of the digital device. For example, the control unit 325 may control the tuner 310 to tune to an RF broadcast corresponding to a channel selected by the user or a previously stored channel. The control unit 325 can control the digital device according to a user command input through the user input interface unit 320 or according to an internal program. In particular, it is possible to connect to a network so that the user can download a desired application or application list into the digital device. For example, the control unit 325 controls the tuner 310 so that the signal of a channel selected according to a predetermined channel selection command received through the user input interface unit 320 is input, and processes the video, audio, or data signal of the selected channel. The control unit 325 allows the channel information selected by the user to be output through the display unit 330 or the audio output unit 335 together with the processed video or audio signal.
According to an external device video playback command received through the user input interface unit 320, the control unit 325 may allow a video signal or an audio signal from an external device input through the external device interface unit 316, for example, a camera or a camcorder, to be output through the display unit 330 or the audio output unit 335. Meanwhile, the control unit 325 can control the display unit 330 to display an image. For example, the control unit 325 may control the display unit 330 to display a broadcast image input through the tuner 310, an external input image input through the external device interface unit 316, an image input through the network interface unit 314, or an image stored in the storage unit 318. At this time, the image displayed on the display unit 330 may be a still image or a moving image, and may be a 2D image or a 3D image. Also, the control unit 325 can control content to be reproduced. The content at this time may be content stored in the digital device, received broadcast content, or external input content input from the outside. The content may be at least one of a broadcast image, an external input image, an audio file, a still image, a connected web screen, and a document file. Meanwhile, when entering an application view item, the control unit 325 can control display of a list of applications that can be downloaded from the digital device or from an external network. The control unit 325, in addition to providing various user interfaces, can control the installation and execution of an application downloaded from the external network. In addition, the control unit 325 can control the display unit 330 to display an image related to an application executed by the user's selection.

Although not shown in the drawing, the digital device may further include a channel browsing processing unit for generating a thumbnail image corresponding to a channel signal or an external input signal. The channel browsing processing unit receives the stream signal (TS) output from the demodulator 312 or the stream signal output from the external device interface unit 316, extracts an image from the input stream signal, and generates a thumbnail image. The generated thumbnail image can be input to the control unit 325 as it is, or can be encoded in a stream format and then input to the control unit 325. The control unit 325 can display a thumbnail list including a plurality of thumbnail images on the display unit 330 using the input thumbnail images. The thumbnail images in this thumbnail list can be updated sequentially or simultaneously. Accordingly, the user can easily grasp the contents of a plurality of broadcast channels.
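The thumbnail generation described above can be illustrated with a minimal sketch. This is not the claimed implementation; it merely assumes a decoded frame is available as a 2D grid of pixels and shows an integer-factor nearest-neighbor downscale of the kind a channel browsing processing unit might perform:

```python
def thumbnail(frame, factor):
    """Nearest-neighbor downscale of a 2D pixel grid by an integer factor.
    Hypothetical helper, not part of the patent's embodiment."""
    return [row[::factor] for row in frame[::factor]]

# A 4x4 "frame" of (row, col) pixel placeholders, reduced to 2x2.
full = [[(r, c) for c in range(4)] for r in range(4)]
thumb = thumbnail(full, 2)
```

A real channel browser would decode an I-frame from each stream before scaling; the scaling step itself is all that is sketched here.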

The display unit 330 converts an image signal, a data signal, or an OSD signal processed by the control unit 325, or a video signal and a data signal received from the external device interface unit 316, into R, G, and B signals to generate a driving signal. The display unit 330 may be a PDP, an LCD, an OLED, a flexible display, a 3D display, or the like. The display unit 330 may also be configured as a touch screen and used as an input device in addition to an output device. The audio output unit 335 receives a signal processed by the control unit 325, for example, a stereo signal, a 3.1-channel signal, or a 5.1-channel signal, and outputs it as sound. The audio output unit 335 may be implemented by various types of speakers.

In order to detect a gesture of the user, as described above, a sensing unit (not shown) having at least one of a touch sensor, a voice sensor, a position sensor, and a motion sensor may be further included in the digital device. A signal sensed by the sensing unit (not shown) may be transmitted to the control unit 325 through the user input interface unit 320. Meanwhile, a photographing unit (not shown) for photographing the user may be further provided. Image information photographed by the photographing unit (not shown) may be input to the control unit 325. The control unit 325 may sense the gesture of the user by using the image photographed by the photographing unit (not shown) or the signal sensed by the sensing unit (not shown), individually or in combination.

The power supply unit 340 supplies power to the components of the digital device. In particular, it can supply power to the control unit 325, which can be implemented in the form of a system on chip (SoC), the display unit 330 for displaying an image, and the audio output unit 335 for audio output. To this end, the power supply unit 340 may include a converter (not shown) for converting AC power into DC power. For example, when the display unit 330 is implemented as a liquid crystal panel having a plurality of backlight lamps, the power supply unit 340 may further include an inverter (not shown) capable of PWM (Pulse Width Modulation) operation for variable luminance or dimming driving.

The remote control device 345 transmits the user input to the user input interface unit 320. To this end, the remote control device 345 can use Bluetooth, RF (radio frequency) communication, infrared (IR) communication, UWB (Ultra Wideband), ZigBee, or the like. The remote control device 345 may also receive the video, audio, or data signal output from the user input interface unit 320 and display it on the remote control device 345, or output sound or vibration.

The digital device may be a digital broadcast receiver capable of processing fixed or mobile ATSC or DVB digital broadcast signals. In addition, the digital device according to the present invention may omit some of the components shown, or may further include components not shown, as necessary. Meanwhile, unlike the above, the digital device may not have a tuner and a demodulator, and may receive and reproduce content through the network interface unit or the external device interface unit.

Referring to FIG. 3B, an example of the control unit includes a demultiplexer 350, an image processing unit, an OSD generating unit 366, a mixer 370, a frame rate converter (FRC) 380, and a formatter 390. The control unit may further include a voice processing unit (not shown) and a data processing unit (not shown).

The demultiplexer 350 demultiplexes an input stream. For example, the demultiplexer 350 can demultiplex a received MPEG-2 TS into video, audio, and data signals. Here, the stream signal input to the demultiplexer 350 may be a stream signal output from a tuner, a demodulator, or an external device interface unit.
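The demultiplexing above can be sketched in simplified form. An MPEG-2 TS consists of 188-byte packets beginning with a sync byte (0x47), with a 13-bit PID identifying the stream each packet belongs to. The sketch below is illustrative only: the PID values are hypothetical, and a real demultiplexer would also parse PAT/PMT tables, adaptation fields, and PES headers:

```python
SYNC_BYTE = 0x47
PACKET_SIZE = 188

def demux(ts_bytes, pid_map):
    """Route the payload of each 188-byte TS packet to a stream by PID."""
    streams = {name: bytearray() for name in pid_map.values()}
    for off in range(0, len(ts_bytes), PACKET_SIZE):
        pkt = ts_bytes[off:off + PACKET_SIZE]
        if len(pkt) < PACKET_SIZE or pkt[0] != SYNC_BYTE:
            continue  # skip packets that have lost sync
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]  # 13-bit PID from header bytes 1-2
        name = pid_map.get(pid)
        if name is not None:
            streams[name].extend(pkt[4:])  # 4-byte header, no adaptation field assumed
    return streams

def make_packet(pid, fill):
    """Build one minimal TS packet carrying a constant payload byte."""
    header = bytes([SYNC_BYTE, (pid >> 8) & 0x1F, pid & 0xFF, 0x10])
    return header + bytes([fill]) * (PACKET_SIZE - 4)

# Two packets: one "video" (PID 0x100), one "audio" (PID 0x101).
ts = make_packet(0x100, 0xAA) + make_packet(0x101, 0xBB)
out = demux(ts, {0x100: "video", 0x101: "audio"})
```

Each elementary stream accumulates only the payload bytes of packets bearing its PID, which is the essence of what the demultiplexer 350 performs.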

The image processing unit performs image processing on the demultiplexed video signal. To this end, the image processing unit may include a video decoder 362 and a scaler 364. The video decoder 362 decodes the demultiplexed video signal, and the scaler 364 scales the resolution of the decoded video signal so that it can be output on the display unit. The video decoder 362 may support various standards. For example, the video decoder 362 performs the function of an MPEG-2 decoder when the video signal is encoded in the MPEG-2 standard, and performs the function of an H.264 decoder when the video signal is encoded in the DMB (Digital Multimedia Broadcasting) or DVB-H format. The video signal decoded by the image processing unit is input to the mixer 370.

The OSD generating unit 366 generates OSD data according to a user input or by itself. For example, the OSD generating unit 366 generates data for displaying various items in the form of graphics or text on the screen of the display unit based on a control signal of the user input interface unit. The generated OSD data includes various data such as a user interface screen of the digital device, various menu screens, widgets, icons, and viewing rating information. The OSD generating unit 366 may also generate data for displaying captions of a broadcast image or broadcast information based on an EPG.

The mixer 370 mixes the OSD data generated by the OSD generating unit 366 and the image signal processed by the image processing unit and provides the mixed signal to the formatter 390. Since the decoded video signal and the OSD data are mixed, the OSD is overlaid on the broadcast image or the external input image.
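The overlay performed by the mixer 370 can be illustrated as per-pixel alpha blending. This is a simplified sketch under assumed conventions (a single global alpha, RGB tuples in lists), not the hardware mixer itself:

```python
def blend_pixel(video_px, osd_px, alpha):
    """Blend one OSD pixel over one video pixel: out = a*osd + (1-a)*video."""
    return tuple(round(alpha * o + (1 - alpha) * v)
                 for v, o in zip(video_px, osd_px))

def overlay(video, osd, alpha=0.5):
    """Overlay an OSD frame onto a video frame of the same dimensions."""
    return [[blend_pixel(v, o, alpha) for v, o in zip(vrow, orow)]
            for vrow, orow in zip(video, osd)]

# A light-gray video pixel half-covered by a black OSD pixel.
frame = overlay([[(200, 200, 200)]], [[(0, 0, 0)]], alpha=0.5)
```

With alpha = 1 the OSD fully replaces the broadcast image; with alpha = 0 the OSD is invisible, matching the "OSD overlaid on the broadcast image" behavior described above.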

A frame rate converter (FRC) 380 converts the frame rate of an input image. For example, the frame rate converter 380 may convert a 60 Hz input image to a frame rate of 120 Hz or 240 Hz in accordance with the output frequency of the display unit. There are various methods for converting the frame rate. For example, when converting the frame rate from 60 Hz to 120 Hz, the frame rate converter 380 may insert a copy of the first frame between the first frame and the second frame, or may insert a third frame predicted from the first frame and the second frame. As another example, when converting the frame rate from 60 Hz to 240 Hz, the frame rate converter 380 may insert three identical frames or three predicted frames between existing frames. When no frame conversion is performed, the frame rate converter 380 may be bypassed.
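The frame-repetition method described above can be sketched as follows. This illustrates only the simple duplication case (60 Hz to 120 Hz repeats each frame once; 60 Hz to 240 Hz repeats it three more times); motion-compensated prediction of intermediate frames is not modeled:

```python
def convert_frame_rate(frames, in_hz, out_hz):
    """Repeat each input frame so the output matches the panel frequency.
    Sketch of the duplication method only; a real FRC may instead insert
    frames predicted from neighboring frames."""
    if out_hz % in_hz != 0:
        raise ValueError("sketch handles only integer rate multiples")
    repeat = out_hz // in_hz
    out = []
    for f in frames:
        out.extend([f] * repeat)  # e.g. 60 -> 120 Hz emits each frame twice
    return out

doubled = convert_frame_rate(["f0", "f1"], 60, 120)
quadrupled = convert_frame_rate(["f0"], 60, 240)
```

When in_hz equals out_hz the function returns the input unchanged, corresponding to the bypass case.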

The formatter 390 changes the output of the frame rate converter 380 to match the output format of the display unit. For example, the formatter 390 may output R, G, and B data signals, and these R, G, and B data signals may be output as low voltage differential signals (LVDS) or mini-LVDS. When the output of the frame rate converter 380 is a 3D video signal, the formatter 390 may configure and output a 3D format according to the output format of the display unit, thereby supporting a 3D service through the display unit.

Meanwhile, the voice processing unit (not shown) in the control unit can perform voice processing on the demultiplexed voice signal. The voice processing unit (not shown) may support processing of various audio formats. For example, even when a voice signal is encoded in a format such as MPEG-2, MPEG-4, AAC, HE-AAC, AC-3, or BSAC, a corresponding decoder can be provided. In addition, the voice processing unit (not shown) in the control unit can process bass, treble, volume control, and the like. The data processing unit (not shown) in the control unit can perform data processing on the demultiplexed data signal. For example, the data processing unit can decode the demultiplexed data signal even when it is encoded. Here, the encoded data signal may be EPG information including broadcast information such as the start time and end time of a broadcast program broadcast on each channel.

Meanwhile, the above-described digital device is an example according to the present invention, and each component can be integrated, added, or omitted according to the specifications of the digital device actually implemented. That is, if necessary, two or more components may be combined into one component, or one component may be divided into two or more components. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and devices thereof do not limit the scope of the present invention. Meanwhile, the digital device may be a video signal processing device that performs signal processing on an image stored in the device or an input image. Other examples of the video signal processing device include a set-top box (STB) excluding the display unit 330 and the audio output unit 335 shown in FIG. 3A, a DVD player, a Blu-ray player, and the like.

FIGS. 2 and 3 described above relate to an embodiment of a digital device, and FIG. 4, described later, relates to an external device, of which a mobile terminal is an embodiment.

FIG. 4 is a block diagram illustrating an external device according to an embodiment of the present invention.

Referring to FIG. 4, the mobile terminal 400 includes a wireless communication unit 410, an A/V input unit 420, a user input unit 430, a sensing unit 440, an output unit 450, a memory 460, an interface unit 470, a control unit 480, a power supply unit 490, and the like.

The wireless communication unit 410 may include one or more modules that enable wireless communication between the mobile terminal 400 and a wireless communication system, or between the mobile terminal and a network in which the mobile terminal is located. For example, the wireless communication unit 410 may include a broadcast receiving module 411, a mobile communication module 412, a wireless Internet module 413, a short-range communication module 414, and a location information module 415.

The broadcast receiving module 411 receives broadcast signals and/or broadcast-related information from an external broadcast management server through a broadcast channel. Here, the broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal. The broadcast-related information may include information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network; in this case, it may be received by the mobile communication module 412. The broadcast-related information may include, for example, data for generating and outputting an EPG (Electronic Program Guide) or an ESG (Electronic Service Guide) in the mobile terminal 400. The broadcast receiving module 411 may receive digital broadcast signals using digital broadcasting systems such as ATSC, DVB-T (Digital Video Broadcasting-Terrestrial), DVB-S (Satellite), MediaFLO (Media Forward Link Only), DVB-H (Handheld), and ISDB-T (Integrated Services Digital Broadcast-Terrestrial). Of course, the broadcast receiving module 411 may also be configured to be suitable for digital broadcasting systems other than those described above. The broadcast signal and/or broadcast-related information received through the broadcast receiving module 411 may be stored in the memory 460.

The mobile communication module 412 transmits and receives radio signals to at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signal may include various types of data depending on a voice signal, a video call signal, or a text / multimedia message transmission / reception.

The wireless Internet module 413 includes a module for wireless Internet access, and may be embedded in or external to the mobile terminal 400. WLAN (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like can be used as wireless Internet technologies.

The short-range communication module 414 is a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, RS-232, RS-485, and the like can be used as short-range communication technologies.

The position information module 415 is a module for acquiring position information of the mobile terminal 400, and may be a GPS (Global Position System) module.

The A/V input unit 420 is for inputting audio and/or video signals, and may include a camera 421, a microphone 422, and the like. The camera 421 processes image frames such as still images or moving images obtained by an image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 451.

The image frames processed by the camera 421 can be stored in the memory 460 or transmitted to the outside through the wireless communication unit 410. At least two cameras 421 may be provided depending on the use environment.

The microphone 422 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 412 and output. Various noise reduction algorithms for eliminating noise generated in the process of receiving an external sound signal may be implemented in the microphone 422.

The user input unit 430 generates input data for the user to control the operation of the terminal. The user input unit 430 may include a key pad, a dome switch, a touch pad (static pressure / capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 440 senses the current state of the mobile terminal 400, such as the open/closed state of the mobile terminal 400, the location of the mobile terminal 400, the presence of user contact, the orientation of the mobile terminal, and the acceleration/deceleration of the mobile terminal, and generates a sensing signal for controlling the operation of the mobile terminal 400. For example, when the mobile terminal 400 is moved or tilted, the sensing unit can sense the position, inclination, and the like of the mobile terminal. It can also sense whether power is supplied by the power supply unit 490, whether an external device is coupled to the interface unit 470, and the like. Meanwhile, the sensing unit 440 may include a proximity sensor 441 including NFC (Near Field Communication).

The output unit 450 is for generating output related to the visual, auditory, or tactile senses, and may include a display unit 451, an audio output module 452, an alarm unit 453, and a haptic module 454.

The display unit 451 displays (outputs) information processed by the mobile terminal 400. For example, when the mobile terminal is in a call mode, a UI (User Interface) or GUI (Graphical User Interface) associated with the call is displayed. When the mobile terminal 400 is in a video call mode or a photographing mode, the photographed and/or received video, or a UI and GUI, are displayed.

The display unit 451 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED), a flexible display, and a three-dimensional (3D) display.

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 451 may also be of a light transmission type. With this structure, the user can see the object located behind the terminal body through the area occupied by the display unit 451 of the terminal body.

There may be two or more display units 451 according to the implementation of the mobile terminal 400. For example, in the mobile terminal 400, a plurality of display units may be disposed on one surface, spaced apart from one another or integrated with each other, or may be disposed on different surfaces.

When the display unit 451 and a sensor for sensing a touch operation (hereinafter referred to as a 'touch sensor') form a mutual layer structure (hereinafter referred to as a 'touch screen'), the display unit 451 can be used as an input device in addition to an output device. The touch sensor may have the form of, for example, a touch film, a touch sheet, a touch pad, or the like.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 451, or a change in capacitance generated at a specific portion of the display unit 451, into an electrical input signal. The touch sensor can be configured to detect not only the touched position and area, but also the pressure at the time of the touch.

If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and transmits the corresponding data to the controller 480. Thus, the control unit 480 can know which area of the display unit 451 is touched or the like.
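The touch controller's role described above can be illustrated with a toy model. Assuming (purely for illustration) that the raw sensor readout is a grid of capacitance changes, the controller reports the coordinates of the strongest change that exceeds a noise threshold:

```python
def locate_touch(cap_delta, threshold):
    """Return (row, col) of the strongest capacitance change above the
    threshold, or None when no touch is present. Hypothetical model of
    the touch controller's position detection."""
    best, pos = threshold, None
    for r, row in enumerate(cap_delta):
        for c, value in enumerate(row):
            if value > best:
                best, pos = value, (r, c)
    return pos

# A 3x3 readout with a clear touch near the center.
grid = [[0, 1, 0],
        [0, 9, 2],
        [0, 0, 0]]
touch = locate_touch(grid, threshold=3)
```

The resulting coordinate is what would be passed on to the control unit 480 so it can determine which area of the display unit 451 was touched.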

A proximity sensor 441 may be disposed in an inner area of the mobile terminal surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface, or an object existing nearby, using the force of an electromagnetic field or infrared rays without mechanical contact. The proximity sensor has a longer life span than a contact sensor, and its utility is also high.

Examples of the proximity sensor include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is electrostatic, it is configured to detect the proximity of the pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned over the touch screen while the pointer is not in contact with the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position at which the pointer is proximity-touched on the touch screen means the position at which the pointer vertically corresponds to the touch screen when the pointer is proximity-touched.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.
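The distinction between proximity touch and contact touch can be sketched as a simple classifier over successive distance samples. The distance thresholds below are assumptions chosen for illustration, not values from the specification:

```python
def classify_samples(distances_mm, contact_mm=0, proximity_mm=30):
    """Classify each pointer-distance sample as 'contact', 'proximity',
    or 'none'. Hypothetical thresholds: contact at 0 mm, proximity within
    30 mm of the screen."""
    states = []
    for d in distances_mm:
        if d <= contact_mm:
            states.append("contact")
        elif d <= proximity_mm:
            states.append("proximity")
        else:
            states.append("none")
    return states

# A pointer approaching the screen and finally touching it.
states = classify_samples([50, 25, 10, 0])
```

A sequence of such states over time is one way to derive the proximity touch patterns mentioned above (approach direction, speed, duration), since consecutive samples encode how quickly the pointer closes in.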

The audio output module 452 can output audio data received from the wireless communication unit 410 or stored in the memory 460 in a call signal reception mode, a call mode or recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 452 also outputs sound signals related to functions performed on the mobile terminal 400 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 452 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 453 outputs a signal for notifying the occurrence of an event of the mobile terminal 400. Examples of events that occur in the mobile terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 453 may output a signal for notifying the occurrence of an event in a form other than a video signal or an audio signal, for example, by vibration. Since the video signal or the audio signal may also be output through the display unit 451 or the audio output module 452, these may be classified as a part of the alarm unit 453.

The haptic module 454 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 454 is vibration. The intensity and pattern of the vibration generated by the haptic module 454 are controllable. For example, different vibrations may be synthesized and output, or may be output sequentially. In addition to vibration, the haptic module 454 can generate various effects such as a pin arrangement moving vertically with respect to the contacted skin surface, the spraying or suction force of air through an injection port or a suction port, grazing of the skin surface, contact of an electrode, electrostatic force, and the reproduction of a cool or warm feeling using a heat-absorbing or heat-emitting element. The haptic module 454 can be implemented not only to transmit a tactile effect through direct contact but also to allow the user to feel a tactile effect through a muscular sense such as a finger or an arm. Two or more haptic modules 454 may be provided according to the configuration of the mobile terminal 400.

The memory 460 may store a program for the operation of the controller 480 and may temporarily store input/output data (e.g., a phone book, messages, still images, moving pictures, etc.). The memory 460 may also store data on vibrations and sounds of various patterns output when a touch is input on the touch screen.

The memory 460 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), a RAM (Random Access Memory), an SRAM (Static Random Access Memory), a ROM (Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), a PROM (Programmable Read-Only Memory), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 400 may also operate in association with a web storage that performs the storage function of the memory 460 on the Internet.

The interface unit 470 serves as a pathway to all external devices connected to the mobile terminal 400. The interface unit 470 receives data or power from an external device and transmits it to the respective components in the mobile terminal 400, or allows data in the mobile terminal 400 to be transmitted to an external device. For example, the interface unit 470 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio I/O port, a video I/O port, an earphone port, and the like.

The identification module is a chip storing various pieces of information for authenticating the usage right of the mobile terminal 400, and may include a User Identification Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter referred to as an 'identification device') can be manufactured in a smart card format. Accordingly, the identification device can be connected to the mobile terminal 400 through the port.

When the mobile terminal 400 is connected to an external cradle, the interface unit 470 may serve as a path through which power from the cradle is supplied to the mobile terminal 400, or as a path through which various command signals input by the user from the cradle are transmitted to the mobile terminal. The various command signals or the power input from the cradle may operate as a signal for recognizing that the mobile terminal is correctly mounted on the cradle.

The control unit 480 typically controls the overall operation of the mobile terminal 400. The control unit 480 performs related control and processing for, for example, voice call, data communication, video call, and the like. The control unit 480 may include a multimedia module 481 for multimedia playback. The multimedia module 481 may be implemented in the control unit 480 or separately from the control unit 480. The control unit 480 may perform a pattern recognition process for recognizing handwriting input or drawing input on the touch-screen as characters and images, respectively.

The power supply unit 490 receives external power and internal power under the control of the controller 480 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, such embodiments may be implemented by the control unit 480 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code may be implemented as a software application written in a suitable programming language. The software code is stored in the memory 460 and can be executed by the control unit 480.

Meanwhile, a wearable device that can be worn on the body, beyond a device that the user mainly holds and uses in the hand, can operate or function as the digital device or the external device in this specification. Such wearable devices include a smart watch, smart glasses, and a head mounted display (HMD).

As shown in Fig. 1, the wearable device can exchange (or interwork) data with another device. The short range communication module 414 can detect (or recognize) a wearable device capable of communicating with the mobile terminal 400. If the detected wearable device is a device authenticated to communicate with the mobile terminal 400, the control unit 480 can transmit at least a part of the data processed in the mobile terminal 400 to the wearable device via the short range communication module 414. Accordingly, the user can use the data processed in the mobile terminal 400 through the wearable device. For example, when a telephone call is received at the mobile terminal 400, the user can conduct the telephone conversation through the wearable device, or when a message is received at the mobile terminal 400, the user can check the received message through the wearable device.

FIG. 5 is a block diagram illustrating a digital device or an external device in accordance with another embodiment of the present invention.

Referring to FIG. 5, a watch-type mobile terminal, that is, a smart watch 500, includes a main body 501 having a display unit 551 and a band 502 connected to the main body 501 and configured to be worn on the wrist. In general, the smart watch 500 may include the features of the mobile terminal 400 of FIG. 4 or similar features.

The main body 501 includes a case forming its appearance. As shown, the case may include a first case 501a and a second case 501b that provide an internal space for accommodating various electronic components. However, the present invention is not limited to this; a single case may be configured to provide the internal space, so that a unibody mobile terminal 500 may be implemented.

The smart watch 500 is configured to enable wireless communication, and an antenna for the wireless communication may be provided in the main body 501. Meanwhile, the antenna can extend its performance using the case. For example, a case including a conductive material may be configured to be electrically connected to the antenna so as to extend the ground or radiating area.

A display unit 551 is disposed on the front surface of the body 501 to output information, and a touch sensor is provided on the display unit 551 to implement a touch screen. The window 551a of the display unit 551 may be mounted on the first case 501a to form a front surface of the terminal body together with the first case 501a.

The main body 501 may include an acoustic output unit 552, a camera 521, a microphone 522, a user input unit 523, and the like. When the display unit 551 is implemented as a touch screen, the display unit 551 may function as a user input unit 523, so that the main body 501 may not have a separate key.

The band 502 is worn on the wrist so as to enclose the wrist, and may be formed of a flexible material for easy wearing. As an example, the band 502 may be formed of leather, rubber, silicone, synthetic resin, or the like. The band 502 may be detachably attached to the main body 501 and may be configured to be replaceable by various types of bands according to the user's preference.

On the other hand, the band 502 can be used to extend the performance of the antenna. For example, the band may include a ground extension (not shown) that is electrically connected to the antenna and extends the ground region.

The band 502 may be provided with a fastener 502a. The fastener 502a may be embodied as a buckle, a snap-fit hook structure, or Velcro (trademark), and may include a stretchable section or material. In this figure, an example in which the fastener 502a is embodied as a buckle is shown.

FIG. 6 is a diagram illustrating control means for controlling a digital device according to an embodiment of the present invention.

A front panel (not shown) or a control means (input means) provided on the digital device 600 is used to control the digital device 600.

The control means includes, as user interface devices (UIDs), a remote controller 610, a keyboard 630, a pointing device 620, a touch pad, and the like, which are implemented mainly for the purpose of controlling the digital device 600, and may also include control means dedicated to an external input device connected to the digital device 600. In addition, the control means may include a mobile terminal such as a smart phone or a tablet PC that controls the digital device 600 through mode switching or the like, even though its main purpose is not the control of the digital device 600. In the following description, a pointing device will be described as an example, but the present invention is not limited thereto.

The input means may employ at least one of communication protocols such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Digital Living Network Alliance (DLNA), as needed, to communicate with the digital device.

The remote controller 610 is a conventional input means having various key buttons necessary for controlling the digital device 600.

The pointing device 620 may include a gyro sensor or the like, and implements a pointer corresponding to the user's motion on the screen of the digital device 600 and transmits a corresponding control command. Such a pointing device 620 may be called by various names such as a magic remote controller or a magic controller.

Since the digital device 600 is an intelligent integrated digital device providing various services such as a web browser, applications, and a social network service (SNS), beyond the conventional digital device providing only broadcasting, it is not easy to control with a conventional remote controller alone; thus, the keyboard 630 is implemented similarly to a PC keyboard to complement such input and realize input convenience for text and the like.

The control means such as the remote controller 610, the pointing device 620, and the keyboard 630 may be provided with a touch pad as required to serve more convenient and various control purposes such as text input, pointer movement, and the like.

Hereinafter, various embodiments of a digital device for processing data received from an external device and a method of processing data in the digital device according to the present invention will be described in detail with reference to the accompanying drawings.

As described above, the digital device will hereinafter be described as a digital TV and the external device as a mobile terminal as an embodiment.

In particular, the present invention relates to a process when a call connection request (incoming call) to a mobile terminal is received while a user is using the digital TV. The present invention is applicable not only to the incoming call but also to the outgoing call.

The user's use of the digital TV includes not only the case of using a TV service including a broadcasting service (e.g., viewing or the like) but also the case of sharing the screen of the mobile terminal on the digital TV.

The incoming and outgoing calls to the mobile terminal can be made through a call application. Such a call application may include both a voice call only application and a video call application.

In the present specification, the processing will be described in detail for the case in which an incoming call arrives at the mobile terminal, or an outgoing call is made through the mobile terminal, while the user is using the TV service or the screen sharing service of the mobile terminal through the digital TV. In other words, the present invention relates to using the call service through the digital TV, rather than through the mobile terminal itself, in the case of an incoming/outgoing call. Hereinafter, for ease of understanding and convenience of explanation, an incoming call will be described as an embodiment. However, as described above, the present invention is not limited to an incoming call and is also applicable to an outgoing call.

FIG. 7 is a diagram illustrating a digital TV and a mobile terminal transmitting mirroring data to the digital TV according to a Miracast scheme.

The digital TV 710 and the mobile terminal 720 belonging to the same network can be paired and can perform data communication according to various communication protocols after the pairing.

The digital TV 710 can output a TV service through a screen and a speaker. Meanwhile, the digital TV 710 may receive and output mirroring data of the mobile terminal 720 for screen sharing. Here, the mirroring data is transmitted and received via, for example, Wi-Fi Direct, and may be variously called Miracast data, screen share data, or the like. In the present specification, however, the term mirroring data will be used for convenience.

Hereinafter, a description will be given of an incoming call (call) to the mobile terminal while the digital TV is processing the mirroring data of the mobile terminal.

FIGS. 8 to 10 are diagrams illustrating a data processing process in the case where an incoming call arrives at the mobile terminal during mirroring data processing of the mobile terminal in the digital TV according to an embodiment of the present invention.

FIG. 8 is a flow chart of the data processing process, FIG. 9 is a block diagram of the digital TV processing the mirroring data of the mobile terminal, and FIG. 10 illustrates a UI/UX for providing, via the display unit of the digital TV, guide data informing the user of the arrival of an incoming call.

A method of processing data of a mobile terminal in a digital device according to an embodiment of the present invention includes the steps of: pairing with the mobile terminal; receiving mirroring data from the mobile terminal according to a first communication protocol; outputting the mirroring data on a screen; receiving guide data according to an incoming call detection from the mobile terminal; activating a second communication protocol for a call service; receiving, from the mobile terminal, first call application data according to the first communication protocol; receiving and outputting second call application data related to the caller according to the activated second communication protocol; and transmitting, to the mobile terminal according to the second communication protocol, third call application data input via input means connected to the digital device.

The method may further include the steps of: demultiplexing the received first call application data; and performing video decoding on the caller's video data among the demultiplexed first call application data while not performing audio decoding on the voice data.

The method may further include the steps of: obtaining video data of the called party via a camera sensor; outputting the video data of the called party together with the received first call application data; and transmitting the video data of the called party to the mobile terminal. The method may further include the steps of: determining whether video data of the called party obtained through the camera sensor exists; and, if no video data of the called party exists as a result of the determination, outputting preset substitute image data together with the first call application data and transmitting the substitute image data to the mobile terminal according to the first communication protocol.

The method may further include the steps of: determining whether a time stamp exists in the first call application data and the second call application data related to the caller; and, if a time stamp exists, synchronizing the first call application data and the second call application data based on the time stamp and outputting the synchronized data. If, as a result of the determination, no time stamp exists, the decoded first call application data and second call application data can be output immediately without delay.
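The timestamp decision above can be sketched as follows. This is a minimal illustrative sketch, not the claimed implementation: the `CallAppData` structure, the millisecond timestamps, and the return strings are all assumptions made for demonstration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CallAppData:
    payload: bytes
    timestamp: Optional[int] = None  # presentation time in ms, if the stream carries one

def output_call_data(video: CallAppData, audio: CallAppData) -> str:
    """Synchronize caller video (first call application data) and caller voice
    (second call application data) only when both streams carry timestamps;
    otherwise output immediately without delay, as described above."""
    if video.timestamp is not None and audio.timestamp is not None:
        # Correct the skew between presentation times (lip sync).
        skew_ms = video.timestamp - audio.timestamp
        return f"synchronized output (skew {skew_ms} ms corrected)"
    # No timestamps: decode and present immediately, accepting possible skew.
    return "immediate output"
```

For example, `output_call_data(CallAppData(b"v", 1000), CallAppData(b"a", 960))` would take the synchronized path, while streams without timestamps take the immediate path.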

Meanwhile, at least one of noise reduction, auto gain control, acoustic echo cancellation, and volume-reduction processing may be performed on at least one of the second call application data and the third call application data transmitted or received according to the second communication protocol.
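The voice-processing chain just listed could be arranged as below. Every stage here is a trivial stand-in written only to show the ordering of the chain; the thresholds and gains are illustrative assumptions, not real DSP algorithms from the specification.

```python
def process_call_audio(samples: list[float]) -> list[float]:
    """Illustrative chain for call voice data: noise reduction, auto gain
    control, acoustic echo cancellation, and volume reduction."""
    def noise_reduce(xs):            # stand-in: zero out very low-level samples
        return [x if abs(x) > 0.01 else 0.0 for x in xs]

    def auto_gain(xs, target=0.8):   # stand-in: scale peak toward a target level
        peak = max((abs(x) for x in xs), default=0.0)
        return [x * (target / peak) for x in xs] if peak > 0 else xs

    def echo_cancel(xs):             # placeholder: real AEC subtracts a filtered
        return xs                    # reference of the speaker output

    def reduce_volume(xs, gain=0.5): # simple attenuation
        return [x * gain for x in xs]

    return reduce_volume(echo_cancel(auto_gain(noise_reduce(samples))))
```

A real implementation would apply these stages per audio frame inside the decoder output path rather than over a whole sample list.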

In addition, the digital device may be one of a digital TV and a wearable device, and the input means may be any one of a remote controller connected to the digital TV and a wearable device.

The method may further include outputting the guide data received from the mobile terminal to a predetermined area on the screen, and receiving a selection of the user from the output guide data.

The received first call application data may be output through any one of the entire screen, a PIP (Picture in Picture) window, and a screen-division window.

The digital TV 710 is paired with the mobile terminal 720 (S802). Here, the pairing may be performed according to a known technique, and a detailed description thereof is omitted herein.

After the pairing in step S802, the mobile terminal 720 transmits the mirroring data to the digital TV 710 (S804). In step S804, a Miracast request-and-response process between the two devices for sharing the screen of the mobile terminal 720 may be performed prior to the transmission of the mirroring data.

The digital TV 710 processes and outputs the mirroring data received from the mobile terminal 720 (S806). Accordingly, the user can view the same screen as the screen of the mobile terminal 720 on the large screen of the digital TV.

Here, the mirroring data in steps S804 and S806 may be transmitted and received through, for example, the Wi-Fi Direct communication protocol. Referring to FIG. 9, the mirroring data is transmitted from the mobile terminal 720 to the digital TV 710 according to the Wi-Fi Direct communication protocol in the form of a transport stream (hereinafter referred to as a mirroring TS) in which audio data and video data are multiplexed. The digital TV 710 includes a receiver or reception interface 910 for receiving the mirroring TS, and a demultiplexer 920 for demultiplexing the mirroring TS received through the receiver 910 into audio data and video data. The demultiplexed video data is transmitted to a video decoder 930 and decoded, and the demultiplexed audio data is transmitted to an audio decoder 940 and decoded. The decoded video data and audio data are output through the screen and the speaker in synchronization with each other. In FIG. 9, only the configuration essential for processing the mirroring TS has been shown and described for convenience, but other configurations may be further included or used, referring to FIG. 2 or 3, to process the mirroring TS. The video decoder 930 and the audio decoder 940 can decode the video data and the audio data in the mirroring TS, which may be in a format different from TV service data, using a corresponding decoding scheme.
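The receiver-demultiplexer-decoder path of FIG. 9 can be sketched as a small pipeline. The packet dictionaries and the `dec(...)` marker are illustrative assumptions standing in for real TS packets and codec output.

```python
def process_mirroring_ts(mirroring_ts: list[dict]) -> dict:
    """Sketch of the FIG. 9 path: the reception interface (910) hands the
    mirroring TS to the demultiplexer (920), which splits it into video and
    audio elementary streams; each stream then goes to its own decoder
    (930/940) and the decoded outputs are presented in sync."""
    video_es, audio_es = [], []
    for pkt in mirroring_ts:  # demultiplexer 920
        (video_es if pkt["type"] == "video" else audio_es).append(pkt["data"])
    decoded_video = [f"dec({d})" for d in video_es]  # video decoder 930
    decoded_audio = [f"dec({d})" for d in audio_es]  # audio decoder 940
    # Screen and speaker outputs, presented synchronously.
    return {"screen": decoded_video, "speaker": decoded_audio}
```

A real demultiplexer would route by PID and feed codec-specific decoders; the point here is only the split into two synchronized decode paths.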

As described above, if an incoming call (call connection request) to the mobile terminal 720 is sensed during the provision of the mirroring service by the digital TV 710 in step S806 (S808), the mobile terminal transmits first call application data including guide data on the detected incoming call, that is, call reception guide data, to the digital TV 710 (S810). Here, the first call application data may be transmitted according to the Wi-Fi Direct communication protocol described above, but is not limited thereto. When the first call application data is received from the mobile terminal 720, the digital TV 710 reads out the reception guide data 1010 from the received first call application data, processes it, and outputs a message such as "A call has come. Do you want to connect? (Y/N)" on the screen as shown in FIG. 10 (S812).

When the user accepts the incoming call on the digital TV 710 through the guide data, the digital TV 710 requests the call application data and provides the incoming call service through the digital TV according to the response of the mobile terminal 720. At this time, when the video/voice call application of the mobile terminal 720 is used on the digital TV 710, a problem may arise in processing the voice and/or audio data. For example, when the call application is executed on the mobile terminal 720, the speaker and the microphone of the mobile terminal 720, which were turned off for the mirroring service, are automatically turned on; the voice data of the caller is output through the speaker, and the voice data of the called party is input through the microphone. However, if the user is not holding the mobile terminal 720, it is difficult to hear the caller's voice data through the speaker of the mobile terminal 720, and it is difficult to input and transmit the called party's voice data through its microphone. That is, in such a case, although the call is connected, it is difficult to use the call service properly. In the conventional mobile terminal 720, when a call application is executed in response to an incoming call (call connection) during the mirroring service, the audio is output through the speaker of the mobile terminal 720 itself, not the speaker of the digital TV 710; moreover, the digital TV 710 cannot determine whether a microphone device is present in the mobile terminal 720, and may not support the Bluetooth function either.

In order to solve this problem, in the present invention, when an incoming call to the mobile terminal 720 is detected, the Bluetooth communication protocol is activated to automatically connect the digital TV 710 and the mobile terminal 720. When the digital TV 710 and the mobile terminal 720 are automatically connected over Bluetooth, the mobile terminal 720 recognizes the digital TV 710 as a Bluetooth headset and turns off its own speaker and microphone, while the digital TV 710 recognizes the mobile terminal 720 as an audio input/output device. At this time, the voice data of the caller is output through the speaker of the digital TV 710, and the voice data of the called party is transmitted to the mobile terminal through a microphone such as a TV microphone or a remote control microphone, so that the call service can be used. Meanwhile, as described above, the call service through the Bluetooth connection can be used between the digital TV 710 and the mobile terminal 720 irrespective of the mirroring service.
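The re-routing described above can be summarized as a small state machine. The class, attribute names, and device labels are hypothetical; they model only the on/off and routing flips the paragraph describes, not any real Bluetooth stack API.

```python
class CallRouter:
    """Sketch of the audio re-routing on an incoming call during mirroring:
    the Bluetooth link is auto-activated, the mobile terminal treats the
    digital TV as a Bluetooth headset and mutes its own speaker/microphone,
    and the TV takes over audio input and output."""

    def __init__(self):
        self.bt_connected = False
        self.phone_speaker_on = True
        self.phone_mic_on = True
        self.audio_sink = "phone"      # where the caller's voice is played
        self.audio_source = "phone"    # where the called party's voice is captured

    def on_incoming_call(self) -> str:
        self.bt_connected = True              # auto-activate the Bluetooth link
        self.phone_speaker_on = False         # phone mutes its own speaker
        self.phone_mic_on = False             # ...and its own microphone
        self.audio_sink = "tv_speaker"        # caller voice -> TV speaker
        self.audio_source = "tv_or_remote_mic"  # callee voice <- TV/remote mic
        return self.audio_sink
```

Wiring this into a real system would sit on top of the HSP/HFP profile connection rather than plain flags, but the routing decision is the same.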

In relation to the Bluetooth function, the present invention assumes that various profiles related to the HSP (Headset Profile) and HFP (Hands-Free Profile) functions of the Bluetooth standard are supported in the digital TV 710. For a detailed description of the Bluetooth standard and these profiles, reference is made to the standard, which is incorporated herein.

Referring to FIG. 10b, if the user permits the call reception through the digital TV 710 via the remote controller or the like in response to the reception guide data output in step S812, the digital TV 710 requests second call application data from the mobile terminal 720 (S814). Here, the second call application data includes, for example, the voice data of the caller in the case of a voice call, and the image/video and voice data of the caller in the case of a video call.

Upon receiving the second call application data request in step S814, the mobile terminal 720 activates the Bluetooth connection with the digital TV 710 in order to transmit the requested second call application data (S816).

If the Bluetooth connection between the digital TV 710 and the mobile terminal 720 is activated in step S816, the mobile terminal 720 transmits the requested second call application data to the digital TV 710 (S818).

The digital TV 710 receives the second call application data transmitted from the mobile terminal 720 in step S818, processes it, and outputs it through the screen and/or the speaker (S820). Accordingly, the called user can use the voice/video call service of the mobile terminal 720 even during the mirroring service through the digital TV 710.

Thereafter, the digital TV 710 transmits a control command to the mobile terminal 720 according to the input of the called party (S822), and the mobile terminal 720 responds to the control command (S824).

The control command may include contents related to all controls that the called party can perform in the course of using the call service of the mobile terminal 720 on the digital TV 710. Meanwhile, the digital TV 710 may generate and provide a UI/UX related to the control command on the screen in order to receive the control command. Accordingly, the called user can have the same experience as using the call service on the mobile terminal while using the call service of the mobile terminal through the digital TV. For example, in the case of a video call service, the UI/UX may allow controlling the image resolution, image size, and image output position of the caller and the called party, and adjusting the audio volume of the caller and the called party. It is also possible to switch between a voice call and a video call during the call, and to terminate the call service through a call-end UI. In this case, the call application on the mobile terminal is also terminated, and the Bluetooth connection established for the call service can be released or continuously maintained. Even if the Bluetooth connection is maintained, as in the latter case, the connection may be released if the call service is not used for a predetermined time or if the battery level of the mobile terminal falls below a predetermined range.
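The release policy for the maintained Bluetooth connection could be expressed as a single predicate. The specific thresholds (a 300-second idle limit, a 15% battery floor) are illustrative assumptions; the specification says only "a predetermined time" and "a predetermined range".

```python
def should_release_bluetooth(idle_seconds: int, battery_pct: int,
                             idle_limit: int = 300, battery_floor: int = 15) -> bool:
    """After the call ends, keep the Bluetooth link unless the call service
    stays unused for a set time or the mobile terminal's battery falls below
    a threshold, as described above. Thresholds are illustrative."""
    return idle_seconds >= idle_limit or battery_pct < battery_floor
```

A device would evaluate this periodically (or on battery-level events) and tear down the HFP/HSP link when it returns true.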

FIG. 11 is a diagram illustrating a method of processing call application data in a digital TV connected to a mobile terminal according to an exemplary embodiment of the present invention.

In FIG. 11B, the digital TV can provide the same screen as the call application screen of the mobile terminal of FIG. 11A. In this case, as shown in FIG. 11B, the call application data on the mobile terminal may be provided through the entire screen 1110 of the digital TV.

Referring to FIG. 11c, the mirroring data or TV service data is output on the screen 1120, and the call application data is displayed in a PIP (Picture in Picture) manner or the like through the window 1130 provided in area A and the window 1135 provided in area B. If the image is output in only one of the area-A window 1130 and the area-B window 1135 of the digital TV, the image of the called party is not provided and only the image of the caller may be output. However, if both the area-A window 1130 and the area-B window 1135 are provided for the call application, the image of the caller may be provided in one window and the image of the called party in the other window. Meanwhile, various key-button UIs for the use, control, and the like of the call application may be provided in the corresponding area. The positions and sizes of the area-A window 1130 and the area-B window 1135 can be arbitrarily adjusted. For example, the window in which the image of the caller is output may be the same size as, or larger than, the window in which the image of the called party is output.

Referring to FIG. 11d, in the digital TV, the screen is divided into two: the mirroring data or the TV service data continues to be provided in the first partition window, while the call application data received from the mobile terminal is provided in the second partition window. The mirroring data or TV service data provided in the first partition window may be reconfigured according to the window size, for example, through screen reconstruction.

As shown in FIG. 11b, when the call application data is provided through the entire screen of the digital TV, the mirroring data or the TV service data is switched to the background, and a playback-stop, bookmark, or time-machine function can be performed automatically.

As shown in FIG. 11c or 11d, if the call application data is provided through a window on a part of the screen rather than the entire screen of the digital TV, the mirroring data or TV service data being provided may be paused, or its video may continue to be provided while its audio is muted. In the latter case, the muted audio data may be subjected to a corresponding caption or subtitle process. If playback is stopped, the bookmark and time-machine functions may be performed automatically.
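The per-layout handling of the background content in FIGS. 11b-11d can be condensed into one decision function. The window labels, the `keep_video` policy switch, and the returned action dictionary are all illustrative assumptions.

```python
def background_media_action(call_window: str, keep_video: bool = True) -> dict:
    """Sketch of the background-content handling: a full-screen call pushes
    the mirroring/TV content to the background with automatic pause and
    bookmark/time-machine; a PIP or split-screen call either keeps the video
    playing with audio muted (optionally captioned) or pauses with
    bookmarking, per the 'keep_video' policy."""
    if call_window == "full":
        return {"video": "paused", "audio": "off", "bookmark": True}
    if keep_video:  # PIP / split window: video continues, audio muted
        return {"video": "playing", "audio": "muted", "captions": True, "bookmark": False}
    return {"video": "paused", "audio": "off", "bookmark": True}
```

The same function would be consulted whenever the user resizes or repositions the call window, since that can change which branch applies.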

Providing the call application screen of the mobile terminal on the digital TV as in FIGS. 11b to 11d can be done not only at the initial call connection but also in the middle of using the call application through the mobile terminal, as described above.

FIG. 12 is a view for explaining a method of processing mirroring data and call application data according to an embodiment of the present invention.

The mirroring data is transferred from the mobile terminal 720 to the digital TV 710 according to the Wi-Fi Direct communication protocol in the same manner as shown in FIG. 9. At this time, if the Bluetooth connection is activated between the mobile terminal 720 and the digital TV 710 for a call connection during the processing of the mirroring data, the voice data of the caller is transmitted through the Bluetooth communication protocol. Also, when a video call application is executed on the mobile terminal upon a call connection according to an incoming call, the mirroring TS being transmitted is stopped, and image/video data for the video call is transmitted in TS format according to the Wi-Fi Direct communication protocol. The video call TS is received through a receiving unit 1210 of the digital TV and demultiplexed by a demultiplexing unit 1220. The demultiplexed image/video data is transmitted to a video decoder 1230 and decoded, while the audio data in the video call TS is discarded without being transmitted to the audio decoder 1240. In other words, the audio data in the video call TS is not decoded by the audio decoder 1240. Instead, the voice data of the caller received from the mobile terminal 720 according to the Bluetooth communication protocol is transmitted to the audio decoder 1240, through the receiving unit 1210 or through the receiving unit 1210 and the demultiplexing unit 1220 of the digital TV 710, and decoded. The video data and audio data decoded by the video decoder 1230 and the audio decoder 1240 are output through the screen and the speaker of the digital TV 710, respectively.
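The selective-decode path of FIG. 12, where the TS audio is discarded in favor of the Bluetooth voice stream, can be sketched as follows. The packet and payload structures and the `dec(...)` marker are illustrative assumptions.

```python
def process_video_call(ts_packets: list[dict], bt_voice: list[bytes]) -> dict:
    """Sketch of the FIG. 12 path: the video-call TS (Wi-Fi Direct) is
    demultiplexed (1220) and its video decoded (1230), while any audio
    multiplexed in the TS is dropped without decoding; the caller's voice
    arriving over Bluetooth is decoded (1240) instead."""
    video_es = [p["data"] for p in ts_packets if p["type"] == "video"]
    # Audio packets inside the TS are discarded, never sent to the audio decoder.
    decoded_video = [f"dec({d})" for d in video_es]          # video decoder 1230
    decoded_voice = [f"dec({v.decode()})" for v in bt_voice]  # audio decoder 1240
    return {"screen": decoded_video, "speaker": decoded_voice}
```

This is also the point where the lip-sync concern noted below arises: the two decoded streams come from different transports and may need timestamp-based alignment before presentation.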

As described above, the image/video data and the voice data of the caller differ in the path or communication protocol through which they are transmitted from the mobile terminal 720 to the digital TV 710, and their data formats may also differ. In addition, the transmitted data may differ in its processing or path within the digital TV 710. Furthermore, problems such as noise and lip-sync errors may occur in processing the call application data on the digital TV 710. This will be described in more detail with reference to the following drawings.

FIG. 13 is a flowchart illustrating a method of processing call application data of a mobile terminal using a digital TV and a remote controller according to an embodiment of the present invention.

Here, FIG. 13 may be viewed as a detailed flowchart of the foregoing data processing process. Meanwhile, the digital TV 1320 or the remote controller 1310 may be replaced with a wearable device (e.g., a smart watch). Likewise, the mobile terminal 1330 may be replaced with a wearable device.

Referring to FIG. 13, in order to provide the call service on the digital TV 1320 according to the present invention, the remote controller 1310 and the mobile terminal 1330 are each paired with the digital TV 1320 (S1302/S1304). The order of the pairings does not matter. Meanwhile, the remote controller 1310 may be provided as a set with the digital TV 1320 and, as a dedicated control means, may not need a separate pairing process.

In FIG. 13, the process of transmitting / receiving the mirroring data described above is omitted for the sake of convenience.

When an incoming call is detected after the pairing (S1306), the mobile terminal 1330 transmits incoming-call guide data to the digital TV 1320 (S1308), and the digital TV outputs the guide data on the screen (S1310).

The digital TV 1320 receives, through the remote controller 1310, a first control command related to the output incoming-call guide data (S1312) and transmits the first control command to the mobile terminal 1330 to request first call application data (S1314).

At this time, the digital TV 1320 and the mobile terminal 1330 establish a Bluetooth connection for the call service (S1316). When the Bluetooth connection is activated, the mobile terminal 1330 transmits the requested first call application data to the digital TV 1320 (S1318). As described above, the caller's image/video data in the first call application data is delivered via Wi-Fi Direct, and the caller's voice data is delivered via Bluetooth.

The digital TV 1320 processes the received first call application data and outputs it through the screen and the speaker (S1320).

Then, the remote controller 1310 transmits second call application data related to the called party's video/audio data to the digital TV 1320, and the digital TV 1320 transmits the second call application data received from the remote controller 1310 to the mobile terminal 1330. The called party's video/audio data is then transmitted to the caller through the mobile terminal in the same manner as the caller's video/audio data described above (S1324).

The remote controller 1310 transmits a second control command related to call service termination to the digital TV 1320 (S1326), and the digital TV 1320 transmits the second control command to the mobile terminal 1330 (S1328). The mobile terminal 1330 transmits a response acknowledging receipt of the second control command to the digital TV 1320. That is, the call service is terminated.

Meanwhile, steps S1318 to S1324 may be performed repeatedly until the end of the call service, that is, until the response to the second control command is received from the mobile terminal 1330.

FIG. 14 is a view for explaining a solution to the lip sync problem occurring in the process of processing call application data through a digital TV according to an embodiment of the present invention, and FIG. 15 is a flowchart of a lip sync processing method in a digital TV according to an embodiment of the present invention.

As described above, the call application data of the mobile terminal, that is, the caller's image/video data and the caller's voice data, are provided to the digital TV over different communication protocols and routes. It is therefore important for the digital TV to maintain lip sync between the caller's image/video data and voice data in order to provide the call service. If lip sync is not maintained, the called party may find the call service uncomfortable to use. Therefore, the digital TV should solve the lip sync problem.

In the present specification, the following embodiments are disclosed as solutions to the lip sync problem caused by providing a call service on a digital TV.

An embodiment according to the present invention corrects lip sync using a time stamp. Here, the time stamp is issued by the mobile terminal. In other words, the mobile terminal generates a time stamp and transmits it together with the call application data when transmitting that data to the digital TV. The time stamp may be included in both the caller's image/video data and the caller's voice data, or only in the caller's voice data. Meanwhile, the time stamp may be carried in the header structure of the data.

For example, the caller may not speak continuously while using the call service. Therefore, in transmitting the caller's voice data, handling the sections in which no actual voice is present can be a problem. In such cases, the mobile terminal either does not transmit voice data for the sections in which the caller's voice is absent, or inserts null data into the voice data. Meanwhile, the time stamp may be included in all of the transmitted data frames or in only some of them. In the latter case, in particular, only the voice data in which the caller's actual voice is present may include the time stamp.
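The framing options just described — a time stamp carried in each frame's header, with silent sections carrying null data and optionally omitting the stamp — can be sketched as follows. The field names (`timestamp_ms`, the frame layout) are illustrative assumptions, not structures defined by the patent.

```python
# Minimal sketch of the voice-frame framing described above, assuming the
# time stamp travels in the frame header. Silent sections carry null data
# and omit the time stamp, so only frames with actual voice are stamped.
def make_voice_frame(payload: bytes, ts_ms: int, has_voice: bool) -> dict:
    return {
        "header": {"timestamp_ms": ts_ms if has_voice else None},
        "payload": payload if has_voice else b"",   # null data for silence
    }
```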

According to another embodiment of the present invention, lip sync is corrected by reproducing the call application data, that is, the image/video data and audio data received from the mobile terminal, immediately on the digital TV without delay (no delay). Digital devices generally output decoded video and audio data after temporarily storing them in a buffer. Buffering is one way to keep the streams in sync, but without the time stamp mentioned above it may be more difficult to achieve synchronization. To compensate for this, the present invention outputs the decoded video and audio data immediately after decoding, without delay. By outputting in this no-delay manner, the lip sync problem can be resolved.

Further, according to the present invention, the time stamp and no-delay methods may be combined for lip sync correction. For example, the caller's initial image/video frames and voice frames are corrected for lip sync using the time stamp, and the no-delay scheme is then applied to subsequent frames.

For lip sync correction, the digital TV may operate as shown in FIG. 15. Referring to FIG. 15, the digital TV receives the call application data (S1502) and decodes it (S1504). Here, the call application data refers collectively to the audio and video data, and the detailed process is as described above.

From the decoded call application data, the digital TV determines whether a time stamp is included in the call application data (S1506).

If, as a result of the determination in step S1506, a time stamp is included in the call application data, the data is processed according to the time stamp so that the caller's image/video data is synchronized with the caller's voice data, and is output through the screen and the speaker (S1512). At this time, as described above, the time stamp method may be used together with the no-delay method (S1510).

If it is determined in step S1506 that no time stamp is included in the call application data, the digital TV processes the caller's image/video data and voice data so that the voice data is output immediately without delay (S1510), and outputs them (S1512).
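The branch just described — synchronize by time stamp when one is present, otherwise fall back to no-delay output — can be sketched as follows. The frame dictionaries and the nearest-stamp pairing are illustrative assumptions; the patent only specifies the decision, not a concrete pairing algorithm.

```python
# Hedged sketch of the decision in steps S1506-S1512: if decoded audio
# frames carry time stamps, pair video and audio by nearest stamp;
# otherwise output immediately in arrival order (no delay).
def present_call_data(video_frames, audio_frames):
    stamped = audio_frames and all(
        f.get("timestamp_ms") is not None for f in audio_frames
    )
    if stamped:
        # time stamp path: match each video frame with the closest audio frame
        paired = [
            (v, min(audio_frames,
                    key=lambda a: abs(a["timestamp_ms"] - v["timestamp_ms"])))
            for v in video_frames
        ]
        return "timestamp", paired
    # no-delay path: output decoded frames immediately, in arrival order
    return "no_delay", list(zip(video_frames, audio_frames))
```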

FIG. 16 is a diagram for explaining the use of an audio engine for improving call quality when providing a call service through a digital TV according to an embodiment of the present invention.

Referring to FIG. 16, the called party can transmit voice data to the mobile terminal using the microphone of the remote controller. At this time, the picked-up voice data may include not only the called party's voice but also the content being output through the speaker of the digital TV, including the caller's voice.

In other words, according to the present invention, when a video/voice call is made through a digital TV instead of through the mobile terminal itself, the calling environment can differ in several ways. For example, echo leakage may occur due to the difference in output between the speaker of the mobile terminal and the speaker of the digital TV. The distance between the speaker and the called party is relatively long for the digital TV, whereas for the mobile terminal the speaker is close to, or almost in contact with, the called party; the output level may therefore need to be controlled accordingly. Furthermore, while the likelihood of noise being mixed in between the caller and the called party is significantly lower when the mobile terminal's own speaker and microphone are used, when the microphone of the remote controller connected to the digital TV is used, the caller's voice output through the call service, the audio of other services such as broadcasts, and the like may be picked up as noise. In addition, degraded audio quality due to increased audio delay may be a problem. Therefore, in order to improve call quality when the digital TV processes the call application data of the mobile terminal, separate processing of the audio data in particular may be required.

In the present invention, as shown in FIG. 16, an audio engine is configured in the digital TV or the remote controller to improve call quality. The audio engine may include Noise Reduction (NR) for eliminating fan noise and the like, Auto Gain Control (AGC) for adjusting the output level according to the distance between the speaker and the called party or the output difference between the mobile terminal and the digital TV speaker, and Acoustic Echo Cancellation (AEC) for removing the echo that occurs as the called party's own voice is picked up again. In addition, since the digital TV already knows the service data, such as the broadcast audio, being output through its speaker, it can remove that data and the caller's voice from the called party's voice data input through the microphone of the remote controller. Meanwhile, in the present invention, such an audio engine may be implemented in hardware or in software.
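The audio-engine chain named above can be sketched as follows. Only the ordering of the stages and the idea of subtracting the known TV output as an echo reference come from the text; the DSP internals (a simple noise gate, peak-based gain) are placeholder assumptions, not the patent's algorithms.

```python
# Minimal sketch of the audio engine described above: AEC -> NR -> AGC.
# Samples are floats; real implementations would use adaptive filters.
def audio_engine(mic, tv_out, target_peak=0.5):
    # AEC: subtract the known TV speaker output (echo reference)
    aec = [m - e for m, e in zip(mic, tv_out)]
    # NR: crude noise gate that zeroes very small residual samples
    nr = [s if abs(s) > 0.01 else 0.0 for s in aec]
    # AGC: scale toward a target peak level
    peak = max((abs(s) for s in nr), default=0.0)
    gain = target_peak / peak if peak > 0 else 1.0
    return [s * gain for s in nr]
```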

Meanwhile, using the audio engine, the digital TV can perform processing for audio quality improvement, such as output level adjustment, not only on the called party's voice data received through the remote controller and to be transmitted to the mobile terminal, but also on the caller's voice data received from the mobile terminal.

FIG. 17 is a diagram showing one embodiment of guide data for a noise reduction method in processing call application data through a digital TV according to the present invention.

FIG. 17 shows a screen that provides guide data for controlling the call quality improvement process using the audio engine of FIG. 16, accessed through the setup menu or while using the call application on the digital TV.

Referring to FIG. 17, the digital TV provides the user with guide data for choosing between using the audio engine and using the volume reduction method to reduce or remove noise, and operates according to the user's selection.

For example, since using the audio engine may consume power in chip-sets such as the CPU, the guide asks whether to instead use a method that simply reduces the volume to minimize noise. Such guide data may be particularly useful or efficient for devices that are sensitive to power consumption, remaining battery level, and the like.
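The guided choice above can be sketched as a small policy function. The battery threshold and parameter names are illustrative assumptions; the text only says that power-sensitive devices may prefer simple volume reduction and that the user's selection is followed.

```python
# Sketch of the choice described above: the full audio engine costs
# chip-set power, so power-sensitive devices may fall back to simple
# volume reduction. Thresholds are illustrative, not from the patent.
def choose_noise_method(battery_pct, power_sensitive, user_choice=None):
    if user_choice in ("audio_engine", "volume_reduction"):
        return user_choice                      # the user's selection wins
    if power_sensitive or battery_pct < 20:
        return "volume_reduction"
    return "audio_engine"
```

The same function could drive the automatic mode mentioned next, where the device decides without showing the guide.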

On the other hand, in the present invention, the digital device may automatically perform audio processing by applying either of the above two methods in consideration of necessity, device state, and the like, without providing the guide shown in FIG. 17.

The above description has mainly addressed the case of a video call during data sharing (a mirroring service) between a mobile terminal and a digital TV. If only a voice call is desired instead of a video call, the above description applies as-is, excluding the image/video processing. Meanwhile, when only a voice call is in progress, the digital TV may continue displaying the mirroring data or the TV data without interruption, muting only the audio of the mirroring data or the TV data. In this case, if caption or subtitle data exists, caption data corresponding to the automatically muted audio may be provided.

On the other hand, even in a state in which data sharing between the digital TV and the mobile terminal is not activated and/or the mirror cast is not activated, the user can still use the call service when a call connection request arrives at the mobile terminal while the digital TV is being viewed, as described below.

In the above, when a video call is received, the mobile terminal may provide on its screen a menu for selecting the device to use for the video call service. Briefly, menu items may be provided so that a mobile terminal, a digital TV, a wearable device, a tablet PC, a notebook, and the like can be selected. Such a menu and menu items may be provided, for example, in a settings screen or in a specific mode, and may be provided only when an external device pairable with, or paired to, the mobile terminal is present. For example, if no external device capable of providing the call service is detected nearby, it is unnecessary to provide the menu and prompt the user's selection. When the user of the mobile terminal selects, through the menu, to use the call service through the digital TV, the Wi-Fi Direct and Bluetooth connections for the mirroring service are executed automatically, and the call service can then be provided as described above. Meanwhile, in the above, simply selecting the device to use for the call service may cause the mobile terminal and the digital TV to operate accordingly, regardless of whether the incoming video call has been accepted.

In this case, if the call is a voice call rather than a video call, the digital TV and the mobile terminal activate only the Bluetooth connection, output the caller's voice data through the TV speaker, and transmit the called party's voice data, received through the remote controller, to the mobile terminal. Meanwhile, in this case, the digital TV also provides, in one area of the screen, an indication that the call service is being used through the digital TV, so that the user or other viewers watching together can recognize it, together with various menu items for controlling the call service, such as switching between voice and video calls. In addition, when the call service is provided on the digital TV, the audio of the content being viewed is automatically muted, and the case in which a caption or subtitle exists may be processed and provided as described above.

In the above, it has been described that, when the mirroring service is inactive, the menu is provided on the mobile terminal so that the digital TV can be used for the call service. However, the menu need not necessarily be provided on the mobile terminal; it may be provided on a wearable device or the like, and the call service may then be performed through connection in the manner described above.

FIG. 18 is a diagram for explaining a process of acquiring image/video data of a called party according to the present invention.

FIG. 18 particularly relates to the case where a video call is received or placed. In other words, FIG. 18 illustrates a method for acquiring the user's image/video data for an incoming or outgoing video call. For convenience of explanation, the case where a video call is received is described below as an example.

As described above, there are three methods for acquiring the user's image/video data for a video call: using the camera sensor of the mobile terminal, using a camera sensor provided in the digital TV, or using a peripheral camera sensor.

Referring to FIG. 18, when the mobile terminal 1820 is located in the vicinity of the digital TV 1810, the called party's image/video data may be acquired through the camera sensor of the mobile terminal 1820, sent to the digital TV 1810 via Wi-Fi Direct, and provided on the screen of the digital TV 1810 together with the caller's image/video data. Meanwhile, as shown, the mobile terminal 1820 may be mounted on a stand 1830 of the digital TV 1810, or mounted or attached on the bezel 1840, to be used for acquiring the called party's image/video data. To use the camera sensor of the mobile terminal 1820 in this way, a structure the same as or similar to a cradle for the mobile terminal 1820 may be provided on the digital TV 1810.

At this time, the camera of the mobile terminal 1820 may be turned on immediately upon the incoming call, or may be turned on according to the called party's selection, to acquire the called party's image/video data. When the video call application is executed on the mobile terminal 1820, as described above, a first window for outputting the caller's image/video data and a second window for outputting the called party's image/video data are provided on the screen of the digital TV 1810. At first, the image/video data obtained from the camera sensor of the mobile terminal 1820, which is automatically turned on in response to the incoming call, is output. However, unlike in FIG. 18, when the mobile terminal is not near the user or its camera is not pointed in the right direction, it may be difficult to acquire a proper image/video of the called party. In this case, when the user accesses the second window on the digital TV 1810, guide data may be provided for activating/deactivating the camera sensor of the mobile terminal 1820, for using the camera sensor of the digital TV 1810, and so on, and the user's selection may be followed. Meanwhile, if a peripheral camera sensor is detected, a list of such sensors may also be provided. The called party's video/image data acquired in this way is transmitted to the mobile terminal 1820 via the digital TV 1810, or to the digital TV 1810 via the mobile terminal 1820, and may be provided on the screen of the digital TV 1810. Further, when the user wishes to change the source of the called party's image/video data provided in this way, the second window can be accessed and the source changed.
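The selection among the three acquisition sources described above — mobile terminal camera first, then the TV's own camera, then peripheral camera sensors, with the user's choice taking precedence — can be sketched as a small priority function. The names and the priority ordering beyond "mobile camera first" are illustrative assumptions.

```python
# Sketch of the camera-source selection described above. The mobile
# terminal's camera is used first when available; the guide in the second
# window lets the user override with the TV camera or a detected
# peripheral camera. Identifiers are illustrative, not from the patent.
def select_camera_source(mobile_available, tv_camera, peripheral_cameras,
                         user_pick=None):
    candidates = []
    if mobile_available:
        candidates.append("mobile_camera")
    if tv_camera:
        candidates.append("tv_camera")
    candidates.extend(peripheral_cameras)       # detected peripheral sensors
    if user_pick in candidates:
        return user_pick                        # the user's selection wins
    return candidates[0] if candidates else None
```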

As described above, the present invention can be applied not only to incoming video calls but also to outgoing video calls.

Thus, according to the present invention, when data is shared between a mobile terminal and a digital TV, as with a mirror cast, the call sound is output through the speaker of the digital TV, so the call can be shared with other viewers. In this case, the speaker and the microphone of the mobile terminal may be turned off, and when the Bluetooth connection of the mobile terminal is turned off, the speaker and the microphone of the mobile terminal may be turned on again. As described above, according to the present invention, a video/voice call can be made during the data sharing process using any call application of the mobile terminal.

The digital device disclosed in this specification and the data processing method in the digital device are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operating method of the digital device disclosed in this specification can be implemented as processor-readable code on a recording medium readable by a processor included in the digital device. The processor-readable recording medium includes all kinds of recording devices in which data readable by the processor is stored. Examples of the processor-readable recording medium include ROM (Read Only Memory), RAM (Random Access Memory), CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and it may also be implemented in the form of a carrier wave such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that the processor-readable code can be stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Such modifications should not be understood separately from the technical idea of the present invention.

201: network interface unit 202: TCP / IP manager
203: service delivery manager 204: SI decoder
205: demultiplexer 206: audio decoder
207: Video decoder 208:
209: Service Control Manager 210: Service Discovery Manager
211: SI & Metadata Database 212: Metadata Manager
213: service manager 214: UI manager

Claims (20)

  1. A method of processing data of a mobile terminal in a digital device,
    Pairing with the mobile terminal;
    Receiving mirroring data from the mobile terminal in a first communication protocol;
    Outputting the received mirroring data on a screen;
    Receiving guide data according to an incoming call detection from the mobile terminal;
    Activating a second communication protocol for call service; And
    Receiving first call application data related to a caller from the mobile terminal according to the first communication protocol, receiving and outputting second call application data related to the caller according to the activated second communication protocol, and transmitting third call application data related to a called party, received via an input means connected to the digital device, to the mobile terminal according to the second communication protocol.
  2. The method according to claim 1, further comprising:
    demultiplexing the received first call application data; and
    performing video decoding on the caller's video data in the demultiplexed first call application data, without performing audio decoding on the voice data therein.
  3. The method according to claim 1, further comprising:
    obtaining video data of the called party through a camera sensor; and
    outputting the acquired video data of the called party together with the received first call application data, and transmitting the video data to the mobile terminal according to the first communication protocol.
  4. The method of claim 3, further comprising:
    determining whether video data of the called party obtained through the camera sensor exists; and
    outputting preset alternative image data together with the first call application data if the video data of the called party does not exist as a result of the determination, and transmitting the alternative image data to the mobile terminal according to the first communication protocol.
  5. The method according to claim 1, further comprising:
    determining whether a time stamp exists in the first call application data and the second call application data related to the caller; and
    synchronizing the first call application data and the second call application data according to the time stamp if the time stamp exists, and outputting the synchronized data.
  6. The method of claim 5, further comprising:
    outputting the decoded first call application data and second call application data immediately without delay if the time stamp does not exist.
  7. The method according to claim 1,
    wherein at least one of noise reduction, auto gain control, acoustic echo cancellation, and volume reduction processing is performed in the digital device on at least one of the second call application data and the third call application data received according to the second communication protocol.
  8. The method according to claim 1,
    wherein the digital device is one of a digital TV and a wearable device, and
    the input means is any one of a remote controller connected to the digital TV and a wearable device.
  9. The method according to claim 1, further comprising:
    outputting guide data received from the mobile terminal to a predetermined area on the screen; and
    receiving a user's selection from the output guide data.
  10. The method according to claim 1,
    wherein the received first call application data is output through any one of the full screen, a PIP window, and a split-screen window.
  11. A digital device for receiving and processing data of a mobile terminal,
    a receiving unit configured to receive, from the mobile terminal, mirroring data according to a first communication protocol, guide data according to detection of an incoming call, first call application data related to a caller according to the first communication protocol, and second call application data related to the caller according to a second communication protocol;
    a controller configured to activate the second communication protocol with the mobile terminal for a call service in response to reception of the guide data, to control the first call application data and the second call application data related to the caller to be output after the activation, and to control third call application data related to a called party, received through an input means connected to the digital device, to be transmitted to the mobile terminal according to the second communication protocol;
    A screen for outputting the mirroring data, the guide data, and first call application data related to the caller; And
    And a speaker for outputting second call application data related to the caller.
  12. The digital device of claim 11, further comprising:
    a demultiplexer for demultiplexing the received first call application data;
    a video decoder for decoding the video data of the screen sharing data received from the mobile terminal according to the first communication protocol and the video data of the demultiplexed first call application data; and
    an audio decoder for decoding the audio data of the screen sharing data received from the mobile terminal according to the first communication protocol,
    Wherein the audio decoder does not decode the voice data in the demultiplexed first call application data.
  13. The digital device of claim 11, further comprising a camera sensor for acquiring video data of the called party,
    wherein the controller controls the video data of the called party obtained through the camera sensor to be output on the screen together with the received first call application data, and to be transmitted to the mobile terminal according to the first communication protocol.
  14. The digital device of claim 13,
    wherein the controller, if no video data of the called party obtained through the camera sensor exists, controls preset alternative image data to be output on the screen together with the first call application data and to be transmitted to the mobile terminal according to the first communication protocol.
  15. The digital device of claim 11,
    wherein the controller determines whether a time stamp exists in the first call application data and the second call application data related to the caller and, if the time stamp exists, controls the first call application data and the second call application data to be synchronized according to the time stamp and output.
  16. The digital device of claim 15,
    wherein the controller, if the time stamp does not exist, controls the decoded first call application data and second call application data to be output immediately without delay.
  17. The digital device of claim 11,
    wherein the controller controls at least one of noise reduction, auto gain control, acoustic echo cancellation, and volume reduction processing to be performed on at least one of the second call application data and the third call application data received according to the second communication protocol.
  18. The digital device of claim 11,
    wherein the digital device is one of a digital TV and a wearable device, and
    the input means is any one of a remote controller connected to the digital TV and a wearable device.
  19. The digital device of claim 11,
    wherein the controller controls guide data received from the mobile terminal to be output to a predetermined area on the screen, receives a user's selection from the output guide data, and controls the first call application data and the second call application data to be received from the mobile terminal according to the selection.
  20. The digital device of claim 11,
    wherein the controller controls the received first call application data to be output through any one of the full screen, a PIP window, and a split-screen window.
KR1020150104984A 2015-07-24 2015-07-24 Digital device and method of processing data the same KR20170011763A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150104984A KR20170011763A (en) 2015-07-24 2015-07-24 Digital device and method of processing data the same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020150104984A KR20170011763A (en) 2015-07-24 2015-07-24 Digital device and method of processing data the same
PCT/KR2016/008033 WO2017018737A1 (en) 2015-07-24 2016-07-22 Digital device and method for processing data in digital device

Publications (1)

Publication Number Publication Date
KR20170011763A true KR20170011763A (en) 2017-02-02

Family

ID=57885167

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150104984A KR20170011763A (en) 2015-07-24 2015-07-24 Digital device and method of processing data the same

Country Status (2)

Country Link
KR (1) KR20170011763A (en)
WO (1) WO2017018737A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019009453A1 (en) * 2017-07-06 2019-01-10 엘지전자 주식회사 Display device
WO2019039862A1 (en) * 2017-08-22 2019-02-28 에스케이텔레콤 주식회사 Short range wireless communication device and method
KR20190125963A (en) * 2019-10-30 2019-11-07 에스케이텔레콤 주식회사 Apparatus and method for providing short-range wireless communication
US10560654B2 (en) 2017-09-20 2020-02-11 Lg Electronics Inc. Display device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101479087B1 (en) * 2007-11-05 2015-01-13 삼성전자주식회사 A method for providing video telephony using a video apparatus
US9008177B2 (en) * 2011-12-12 2015-04-14 Qualcomm Incorporated Selective mirroring of media output
KR20130089067A (en) * 2012-02-01 2013-08-09 목포대학교산학협력단 Smart television capable of providing videophone service
JP5988900B2 (en) * 2013-03-08 2016-09-07 アルパイン株式会社 In-vehicle electronic device, in-vehicle system, hands-free calling program, and hands-free calling method
KR20150051776A (en) * 2013-11-05 2015-05-13 삼성전자주식회사 Display apparatus and method for controlling of display apparatus

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019009453A1 (en) * 2017-07-06 2019-01-10 엘지전자 주식회사 Display device
WO2019039862A1 (en) * 2017-08-22 2019-02-28 에스케이텔레콤 주식회사 Short range wireless communication device and method
KR20190021121A (en) * 2017-08-22 2019-03-05 에스케이텔레콤 주식회사 Apparatus and method for providing short-range wireless communication
US10560654B2 (en) 2017-09-20 2020-02-11 Lg Electronics Inc. Display device
KR20190125963A (en) * 2019-10-30 2019-11-07 에스케이텔레콤 주식회사 Apparatus and method for providing short-range wireless communication

Also Published As

Publication number Publication date
WO2017018737A1 (en) 2017-02-02
