KR20170000714A - Video display device and operating method thereof - Google Patents

Video display device and operating method thereof

Info

Publication number
KR20170000714A
Authority
KR
South Korea
Prior art keywords
screen
mirroring
mirroring screen
terminal
pointer
Prior art date
Application number
KR1020150090035A
Other languages
Korean (ko)
Inventor
허승현
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020150090035A priority Critical patent/KR20170000714A/en
Publication of KR20170000714A publication Critical patent/KR20170000714A/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/43615Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43637Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An image display device according to an embodiment of the present invention displays a mirroring screen based on mirroring data for the mirroring screen transmitted from a terminal, and displays a pointer on the displayed mirroring screen. The image display device moves the pointer within a region corresponding to the mirroring screen based on a user input for the displayed pointer, and performs an operation on the mirroring screen based on the user input for the pointer. Accordingly, the mirroring screen transmitted from the terminal can be easily controlled.

Description

TECHNICAL FIELD [0001] The present invention relates to a video display device and an operation method thereof.

BACKGROUND OF THE INVENTION The present invention relates to an image display apparatus and an operating method thereof, and more particularly, to an image display apparatus and an operating method thereof that display a mirroring screen transmitted from a terminal.

In recent years, digital TV services using wired or wireless communication networks have become popular. Digital TV services can provide various services that cannot be provided by existing analog broadcasting services.

Such digital TV services can be used not only on general TVs but also on terminals such as PCs and smartphones.

Recently, the proportion of users who use digital TV services through terminals such as PCs and smartphones has been increasing.

In the case of IPTV (Internet Protocol Television), which is one type of digital TV service, interactivity is provided that allows a user to actively select the type of program to watch, the viewing time, and the like. Based on this interactivity, the IPTV service may provide various additional services such as Internet search, home shopping, and online games.

Meanwhile, in recent years, the screen of a terminal can be displayed on the screen of a digital TV through a screen mirroring function.

With this screen mirroring technology, users can watch the screens of games, videos, photos, the Internet, and so on running on a smartphone or tablet PC on the large screen of a digital TV.

However, when the screen of the terminal is displayed on the digital TV through screen mirroring, the digital TV cannot use the touch input available on the terminal's touch screen and merely displays the terminal's screen as it is, so there is a problem in that it is difficult to operate the mirroring screen.

In addition, when the aspect ratio or the display orientation of the mirroring screen differs from that of the digital TV screen, it is difficult to operate the mirroring screen, and a leftover area such as a black area appears on the TV screen.

An object of the present invention is to easily operate a mirroring screen transmitted from a terminal.

According to an aspect of the present invention, there is provided a method of operating an image display device, the method including: displaying a mirroring screen based on mirroring data for the mirroring screen transmitted from a terminal; displaying a pointer on the displayed mirroring screen; moving the pointer within an area corresponding to the mirroring screen based on a user input for the displayed pointer; and performing an operation on the mirroring screen based on the user input for the pointer.

An image display apparatus according to an embodiment of the present invention includes a communication unit that receives mirroring data for a mirroring screen transmitted from a terminal; a display unit that displays the mirroring screen corresponding to the received mirroring data; and a control unit that displays a pointer on the mirroring screen, moves the pointer within an area corresponding to the mirroring screen based on a user input for the pointer, and performs an operation on the mirroring screen based on the user input for the pointer.
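The following minimal sketch, written in plain Kotlin, illustrates the four steps recited above (display the mirroring screen, show a pointer, move it only within the mirroring area, and perform an operation). All class and function names, such as MirroringData and sendOperationToTerminal, are assumptions introduced for illustration and are not defined in the patent.

```kotlin
// Illustrative sketch only; types and callbacks are assumed, not from the patent text.
data class MirroringData(val frame: ByteArray, val width: Int, val height: Int)
data class RemoteInput(val dx: Int, val dy: Int, val select: Boolean)

class MirroringController(private val sendOperationToTerminal: (Int, Int) -> Unit) {
    private var pointerX = 0
    private var pointerY = 0

    // Step 1: display the mirroring screen based on the received mirroring data.
    fun displayMirroringScreen(data: MirroringData) =
        println("Displaying ${data.width}x${data.height} mirroring screen")

    // Steps 2 and 3: show the pointer and move it only within the area
    // corresponding to the mirroring screen.
    fun handlePointerInput(input: RemoteInput, data: MirroringData) {
        pointerX = (pointerX + input.dx).coerceIn(0, data.width - 1)
        pointerY = (pointerY + input.dy).coerceIn(0, data.height - 1)
        // Step 4: perform an operation on the mirroring screen, here modeled as
        // forwarding the pointer position to the terminal when a select arrives.
        if (input.select) sendOperationToTerminal(pointerX, pointerY)
    }
}
```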

According to various embodiments of the present invention, various operations on the mirroring screen transmitted from the terminal can be facilitated, so that the user can conveniently use the screen mirroring function in the video display device.

FIG. 1 is a block diagram illustrating a configuration of a video display device according to an embodiment of the present invention.
FIG. 2 is a block diagram of a remote control device according to an embodiment of the present invention.
FIG. 3 shows an example of an actual configuration of a remote control device according to an embodiment of the present invention.
FIG. 4 is a block diagram of a terminal related to an embodiment of the present invention.
FIG. 5 is a ladder diagram illustrating a method of operating an image display device according to an embodiment of the present invention.
FIG. 6 is an exemplary view of a cursor movable area according to an embodiment of the present invention.
FIG. 7 is an exemplary view of black area utilization according to an embodiment of the present invention.
FIG. 8 is an exemplary view of black area utilization according to another embodiment of the present invention.
FIG. 9 is an exemplary view illustrating guide information about screen rotation of a mirroring screen according to an embodiment of the present invention.
FIG. 10 is an exemplary view illustrating screen rotation of a mirroring screen according to an embodiment of the present invention.
FIG. 11 is an exemplary view illustrating menu display for a mirroring screen according to an embodiment of the present invention.
FIG. 12 is an exemplary view of a swipe recognition area according to an embodiment of the present invention.
FIGS. 13 to 14 are exemplary views of a guide for swipe input according to an embodiment of the present invention.
FIG. 15 illustrates an upper bar area and a lower bar area in a mirroring screen according to an embodiment of the present invention.
FIG. 16 is an exemplary view showing an enlarged upper bar area in a mirroring screen according to an embodiment of the present invention.
FIG. 17 is an exemplary view illustrating image storage corresponding to a mirroring screen according to an embodiment of the present invention.
FIG. 18 is an exemplary view of a user input to a cursor according to an embodiment of the present invention.

Hereinafter, embodiments related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably merely for ease of drafting the specification, and do not in themselves have distinct meanings or roles.

Hereinafter, a screen output control method according to an embodiment of the present invention and an image display device using the same will be described in detail with reference to the accompanying drawings.

The video display device according to an embodiment of the present invention is, for example, an intelligent video display device that adds computer support functions to a broadcast reception function. While remaining faithful to the broadcast reception function, it is equipped with an Internet function, and can therefore provide a more convenient interface such as a touch screen or a spatial remote control. Also, with the support of a wired or wireless Internet function, it can be connected to the Internet and a computer, and can perform functions such as e-mail, web browsing, banking, or games. A standardized general-purpose OS can be used for these various functions.

Therefore, the video display device described in the present invention can freely add or delete various applications, for example, on a general-purpose OS kernel, so that various user-friendly functions can be performed. More specifically, the video display device may be, for example, a network TV, an HBBTV, a smart TV, an LED TV, an OLED TV, or the like, and may be applicable to a smartphone in some cases.

FIG. 1 is a block diagram illustrating a configuration of an image display device 100 according to an embodiment of the present invention.

Referring to FIG. 1, the image display device 100 includes a sound acquisition unit 110, an image acquisition unit 120, a broadcast reception unit 130, a network interface unit 133, an external device interface unit 135, a storage unit 140, a user input interface unit 150, a communication unit 160, a control unit 170, a display unit 180, an audio output unit 185, and a power supply unit 190.

The voice acquisition unit 110 can acquire voice.

The voice acquisition unit 110 may include at least one microphone and can acquire voice through the included microphone. The voice acquisition unit 110 may transmit the acquired voice to the control unit 170.

The image acquisition unit 120 can acquire an image.

The image acquisition unit 120 may include at least one camera and can acquire an image through the included camera.

The image acquisition unit 120 may transmit the acquired image to the control unit 170.

The broadcast receiver 130 may include a tuner 131, a demodulator 132, and a network interface 133.

The external device interface unit 135 may receive an application or an application list from an adjacent external device and may transmit the received application or application list to the control unit 170 or the storage unit 140.

The network interface unit 133 may provide an interface for connecting the video display device 100 to a wired / wireless network including the Internet network. The network interface unit 133 can transmit or receive data to another user or another electronic device via the network connected thereto or another network linked to the connected network.

Further, it is possible to transmit some content data stored in the video display device 100 to another user registered in advance in the video display device 100 or a selected user or selected one of the other electronic devices.

The network interface unit 133 can access a predetermined web page through the connected network or another network linked to the connected network. That is, it is possible to access a predetermined web page through a network and transmit or receive data with the server.

The network interface unit 133 may receive content or data provided by the content provider or the network operator. That is, the network interface unit 133 can receive contents such as a movie, an advertisement, a game, a VOD, a broadcast signal, and related information provided from a content provider or a network provider through a network.

In addition, the network interface unit 133 can receive update information and an update file of the firmware provided by the network operator, and can transmit data to the Internet or a content provider or a network operator.

The network interface unit 133 can select and receive a desired application among applications open to the public via the network.

The storage unit 140 stores programs for signal processing and control in the controller 170, and can store image-processed video, audio, or data signals.

The storage unit 140 may temporarily store video, audio, or data signals input from the external device interface unit 135 or the network interface unit 133, and may also store information about the image.

The storage unit 140 may store an application or an application list input from the external device interface unit 135 or the network interface unit 133.

The video display device 100 may reproduce a content file (a moving image file, a still image file, a music file, a document file, an application file, etc.) stored in the storage unit 140 and provide it to the user.

The user input interface unit 150 may transmit a signal input by the user to the control unit 170, or may transmit a signal from the control unit 170 to the user. For example, the user input interface unit 150 may receive and process control signals such as power on/off, channel selection, and screen setting from the remote control device 200, or may transmit control signals from the control unit 170 to the remote control device 200, according to various communication methods such as Bluetooth, Ultra Wideband (UWB), ZigBee, RF (Radio Frequency) communication, and infrared (IR) communication.

The user input interface unit 150 may transmit a control signal input from a local key (not shown) such as a power key, a channel key, a volume key, and a set value to the controller 170.

The communication unit 160 may include one or more modules that connect the video display device 100 with a network, with the terminal 300, or with a server (not shown). The communication unit 160 may perform wired or wireless communication with a server (not shown) or with a gateway (not shown).

For example, the communication unit 160 may include an Internet module for accessing the Internet, and may allow the video display device 100 to access the Internet through wired or wireless communication through the Internet module.

For example, the communication unit 160 may include a short-range communication module so that the video display device 100 can communicate wirelessly with other devices. The short-range communication module included in the communication unit 160 may use at least one of Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, WLAN (Wi-Fi), and Near Field Communication (NFC).

The video signal processed by the controller 170 may be input to the display unit 180 and displayed as an image corresponding to the video signal. The video signal processed by the controller 170 may also be input to an external output device through the external device interface unit 135.

The audio signal processed by the control unit 170 may be output as audio through the audio output unit 185. The audio signal processed by the control unit 170 may also be input to an external output device through the external device interface unit 135.

In addition, the control unit 170 can control the overall operation of the video display device 100.

Meanwhile, the control unit 170 can control the video display device 100 by a user command input through the user input interface unit 150 or by an internal program, and can connect to the network so that an application or an application list desired by the user can be downloaded into the video display device 100.

The control unit 170 may output a video or audio signal corresponding to the channel selected by the user through the display unit 180 or the audio output unit 185.

According to an external device video playback command received through the user input interface unit 150, the control unit 170 can control the external device interface unit 135 so that a video or audio signal from the external device is output through the display unit 180 or the audio output unit 185.

The control unit 170 may control the display unit 180 to display an image; for example, a broadcast image input through the tuner 131, an external input image input through the external device interface unit 135, an image input through the network interface unit, or an image stored in the storage unit 140 may be output on the display unit 180. In this case, the image output on the display unit 180 may be a still image or a moving image, and may be a 2D image or a 3D image.

In addition, the control unit 170 may control content stored in the video display device 100, received broadcast content, or external input content to be played back, and the content may take various forms such as a broadcast video, an external input video, a still image, a connected web screen, and a document file.

The control unit 170 may also control an application stored in the video display device 100 to be executed. The control unit 170 can output an image corresponding to the executed application to the display unit 180, and can output audio corresponding to the executed application to the audio output unit 185.

Meanwhile, the control unit 170 may process one or more tasks at the same time. Here, the task may refer to each of the contents reproduced by the control unit 170 and the application to be executed. When the control unit 170 processes a plurality of jobs in one application, each of the plurality of jobs may correspond to one task. The processing of the plurality of tasks by the control unit 170 may be referred to as multitasking.

The control unit 170 may adjust at least one of the screen size and the resolution of each of the one or more tasks output to the display unit 180. The controller 170 may convert the screen size or the resolution of each task output to the display unit 180 to a size or resolution different from that of the original output screen.

Here, the resolution may mean the number of pixels constituting an image or a screen within a certain area.

In addition, the control unit 170 can determine an appropriate screen mode for each task based on the usage status of each of the one or more tasks output to the display unit 180, and can change the screen of the task to the determined screen mode and output it to the display unit 180.

The control unit 170 may output the mirroring screen to the display unit 180 based on the mirroring data of the mirroring screen transmitted from the terminal 300 directly or through a network, and may output audio corresponding to the mirroring screen to the audio output unit 185.

The display unit 180 converts the video signal, data signal, and OSD signal processed by the control unit 170, or the video signal and data signal received from the external device interface unit 135, into R, G, and B signals to generate a driving signal.

Since the image display device 100 shown in FIG. 1 is only an embodiment of the present invention, some of the illustrated components may be integrated, added, or omitted depending on the specifications of the image display device 100 as actually implemented.

That is, two or more constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.

Meanwhile, unlike the configuration shown in FIG. 1, the video display device 100 may not include the tuner 131 and the demodulator 132, and may instead receive and reproduce an image through the network interface unit 133 or the external device interface unit 135.

For example, the video display device 100 may be divided into an image processing device, such as a set-top box, for receiving broadcast signals or content according to various network services, and a content playback device for playing back content input from the image processing device.

In this case, the operating method of the image display apparatus according to the embodiment of the present invention described below may be performed not only by the image display apparatus 100 described with reference to FIG. 1, but also by an image processing apparatus such as a separate set-top box, or by a content playback apparatus including the display unit 180 and the audio output unit 185.

Next, a remote control apparatus according to an embodiment of the present invention will be described with reference to FIGS. 2 to 3.

FIG. 2 is a block diagram of a remote control device according to an embodiment of the present invention, and FIG. 3 shows an example of an actual configuration of a remote control device according to an embodiment of the present invention.

Referring to FIG. 2, the remote control apparatus 200 may include a fingerprint recognition unit 210, a wireless communication unit 220, a user input unit 230, a sensor unit 240, an output unit 250, a power supply unit 260, a storage unit 270, a control unit 280, and a voice acquisition unit 290.

The wireless communication unit 220 transmits and receives signals to and from any one of the image display devices according to the embodiments of the present invention described above.

The remote control device 200 may include an RF module 221 capable of transmitting and receiving signals to and from the image display device 100 according to the RF communication standard, and an IR module 223 capable of transmitting and receiving signals to and from the image display device 100 according to the IR communication standard. In addition, the remote control device 200 may include a Bluetooth module 225 capable of transmitting and receiving signals to and from the image display device 100 according to the Bluetooth communication standard. The remote control device 200 may also include an NFC module 227 capable of transmitting and receiving signals to and from the image display device 100 according to the NFC (Near Field Communication) communication standard, and a WLAN module 229 capable of transmitting and receiving signals to and from the image display device 100.

The remote control device 200 transmits a signal including information on the motion of the remote control device 200 to the image display device 100 through the wireless communication unit 220.

Also, the remote control device 200 can receive a signal transmitted from the video display device 100 through the wireless communication unit 220, and can transmit commands for power on/off, channel change, volume change, and the like to the video display device 100 through the wireless communication unit 220 as necessary.

The user input unit 230 may include a keypad, a button, a touch pad, a wheel key, or a touch screen. The user can input a command related to the image display device 100 to the remote control device 200 by operating the user input unit 230. When the user input unit 230 includes a hard key button, the user can input a command related to the image display device 100 to the remote control device 200 by pushing the hard key button. This will be described with reference to FIG. 3.

Referring to FIG. 3, the remote control apparatus 200 may include a plurality of buttons. The plurality of buttons may include a fingerprint recognition button 212, a power button 231, a home button 232, a LIVE button 233, an external input button 234, a volume button 235, a voice recognition button 236, a channel button 237, a wheel key 238, and a cancel button 239.

The fingerprint recognition button 212 may be a button for recognizing the fingerprint of the user. In one embodiment, the fingerprint recognition button 212 is capable of a push operation and may receive a push operation and a fingerprint recognition operation.

The power button 231 may be a button for turning on and off the image display device 100 and the external device.

The home button 232 may be a button for displaying a basic menu of the image display device 100.

The LIVE button 233 may be a button for displaying a real time broadcast program.

The external input button 234 may be a button for displaying an image received from an external device.

The volume button 235 may be a button for controlling the audio output of the video display device 100.

The voice recognition button 236 may be a button for recognizing the voice of the user.

The channel button 237 may be a button for receiving a broadcast signal of a specific broadcast channel or a specific broadcast service.

The wheel key 238 may be a wheel-shaped key that can receive user input for one or more directions. For example, the wheel key 238 may be a wheel-shaped key that can receive user inputs in the up and down directions, or in the up, down, left, and right directions. Further, the wheel key 238 may additionally include a direction key. Here, the user input for the up and down directions of the wheel key 238 may be a user input for rotating the wheel key 238, and the user input for the left and right directions may be a user input for tilting the wheel key 238. The wheel key 238 may also receive a push input.
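For illustration, the wheel-key behavior described above can be summarized as a simple mapping from gestures to directional inputs. The enum names below, and treating a push as a select action, are assumptions rather than definitions taken from the patent.

```kotlin
// Illustrative sketch; names and the push-to-select mapping are assumptions.
enum class WheelGesture { ROTATE_UP, ROTATE_DOWN, TILT_LEFT, TILT_RIGHT, PUSH }
enum class DirectionalInput { UP, DOWN, LEFT, RIGHT, SELECT }

fun mapWheelGesture(gesture: WheelGesture): DirectionalInput = when (gesture) {
    WheelGesture.ROTATE_UP   -> DirectionalInput.UP     // rotating the wheel gives up/down input
    WheelGesture.ROTATE_DOWN -> DirectionalInput.DOWN
    WheelGesture.TILT_LEFT   -> DirectionalInput.LEFT   // tilting the wheel gives left/right input
    WheelGesture.TILT_RIGHT  -> DirectionalInput.RIGHT
    WheelGesture.PUSH        -> DirectionalInput.SELECT // push input; treating it as select is an assumption
}
```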

Fig. 2 will be described again.

When the user input unit 230 includes a touch screen, the user can input a command related to the image display device 100 to the remote control device 200 by touching a soft key of the touch screen.

In addition, the user input unit 230 may include various kinds of input means that can be operated by the user, such as a scroll key, a jog key, and a touch pad, and this embodiment does not limit the scope of the present invention.

The sensor unit 240 may include a gyro sensor 241 or an acceleration sensor 243, and the gyro sensor 241 may sense information about the motion of the remote control device 200.

For example, the gyro sensor 241 may sense information about the motion of the remote control device 200 with respect to the x, y, and z axes, and the acceleration sensor 243 may sense information such as the movement speed of the remote control device 200. The remote control device 200 may further include a distance measuring sensor capable of sensing the distance to the display unit 180 of the image display device 100.

The output unit 250 may output a video or audio signal corresponding to an operation of the user input unit 230 or corresponding to a signal transmitted from the image display device 100. Through the output unit 250, the user can recognize whether the user input unit 230 has been operated or whether the video display device 100 is being controlled.

For example, the output unit 250 may include an LED module 251 that lights up when the user input unit 230 is operated or when a signal is transmitted to or received from the image display device 100 through the wireless communication unit 220, an audio output module 255 for outputting sound, or a display module 257 for outputting an image.

Also, the power supply unit 260 supplies power to the remote control device 200, and if the remote control device 200 is not moved for a predetermined time, the power supply is interrupted, thereby reducing power waste. The power supply unit 260 may resume power supply when a predetermined key provided in the remote control device 200 is operated.

The storage unit 270 may store various kinds of programs and application data necessary for the control or operation of the remote control device 200. If the remote control device 200 wirelessly transmits and receives signals to and from the image display device 100 through the RF module 221, the remote control device 200 and the image display device 100 transmit and receive signals through a predetermined frequency band.

The control unit 280 of the remote control device 200 may store, in the storage unit 270, information on the frequency band and the like that enable wireless transmission and reception of signals with the image display device 100 paired with the remote control device 200, and may refer to this information.

The control unit 280 controls all matters related to the control of the remote control device 200. The control unit 280 may transmit a signal corresponding to a predetermined key operation of the user input unit 230, or a signal corresponding to the motion of the remote control device 200 sensed by the sensor unit 240, to the image display device 100.

Also, the voice acquisition unit 290 of the remote control device 200 can acquire voice.

The voice acquisition unit 290 may include at least one microphone (not shown) to acquire voice through the included microphone.

The voice acquisition unit 290 may transmit the acquired voice to the control unit 280.

Next, the structure of the terminal 300 according to an embodiment of the present invention will be described with reference to FIG. 4.

FIG. 4 is a block diagram of a terminal related to an embodiment of the present invention.

The terminal 300 may include a wireless communication unit 310, an audio/video (A/V) input unit 320, a user input unit 330, a sensing unit 340, an output unit 350, a memory 360, an interface unit 370, a controller 380, a power supply unit 390, and the like. The components shown in FIG. 4 are not essential, and a terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 310 may include one or more modules that enable wireless communication between the terminal 300 and a wireless communication system, between the terminal 300 and another terminal 300, between the terminal 300 and another device, or between the terminal 300 and an external server. In addition, the wireless communication unit 310 may include one or more modules for connecting the terminal 300 to one or more networks.

The wireless communication unit 310 may include at least one of a broadcast receiving module 311, a mobile communication module 312, a wireless Internet module 313, a short-range communication module 314, and a location information module 315.

The broadcast receiving module 311 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and also a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 312.

The broadcast-related information may exist in various forms, for example, in the form of an EPG (Electronic Program Guide) of DMB (Digital Multimedia Broadcasting) or an ESG (Electronic Service Guide) of DVB-H (Digital Video Broadcast-Handheld).

The broadcast receiving module 311 may receive digital broadcast signals using digital broadcasting systems such as DMB-T (Digital Multimedia Broadcasting-Terrestrial), DMB-S (Digital Multimedia Broadcasting-Satellite), MediaFLO (Media Forward Link Only), and ISDB-T (Integrated Services Digital Broadcast-Terrestrial). Of course, the broadcast receiving module 311 may be adapted to other broadcasting systems as well as the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 311 may be stored in the memory 360.

The mobile communication module 312 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The wireless Internet module 313 is a module for wireless Internet access, and may be built into or externally attached to the terminal 300. WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like can be used as wireless Internet technologies.

The short-range communication module 314 is a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, NFC (Near Field Communication), and the like can be used as short-range communication technologies.

Also, the short-range communication module 314 may transmit a magnetic signal.

The location information module 315 is a module for obtaining the location (or current location) of the terminal, and representative examples include a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module. For example, if the terminal utilizes a GPS module, the terminal can acquire its location using a signal transmitted from a GPS satellite. As another example, if the terminal utilizes a Wi-Fi module, it can acquire its location based on information of a wireless access point (AP) that transmits or receives wireless signals to or from the Wi-Fi module. If necessary, the location information module 315 may, additionally or alternatively, perform a function of another module of the wireless communication unit 310 in order to obtain data on the location of the terminal. The location information module 315 is a module used to obtain the location (or current location) of the terminal, and is not limited to a module that directly calculates or obtains the location of the terminal.

Referring to FIG. 4, the A/V (audio/video) input unit 320 is for inputting an audio signal or a video signal, and may include a camera 321 and a microphone 322. The camera 321 processes image frames, such as still images or moving images, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 351.

The image frames processed by the camera 321 may be stored in the memory 360 or transmitted to the outside through the wireless communication unit 310. Two or more cameras 321 may be provided depending on the usage environment.

The microphone 322 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form that can be transmitted to a mobile communication base station through the mobile communication module 312 and then output. The microphone 322 may implement various noise removal algorithms for removing noise generated in the process of receiving the external sound signal.

The input unit 320 may include a camera 321 or an image input unit for inputting a video signal, a microphone 322 or an audio input unit for inputting an audio signal, and a user input unit 323 (for example, a touch key, a mechanical key, and the like) for receiving information from the user. The voice data or image data collected by the input unit 320 may be analyzed and processed according to a user's control command.

The user input unit 330 may have the same configuration as the user input unit 323 described above.

The sensing unit 340 may include at least one sensor for sensing at least one of information in the terminal, surrounding environment information of the terminal, and user information. For example, the sensing unit 340 may include at least one of a proximity sensor 341, an illumination sensor 342, a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared sensor, a finger scan sensor, an ultrasonic sensor, a microphone 322, a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a heat detection sensor, a gas detection sensor, etc.), a chemical sensor (for example, an electronic nose, a healthcare sensor, a biometric sensor, etc.), and a fingerprint recognition sensor. Meanwhile, the terminal 300 disclosed in the present specification can combine and utilize information sensed by at least two of these sensors.

The output unit 350 is for generating output related to the visual, auditory, or tactile senses, and may include at least one of a display unit 351, a sound output module 352, an alarm unit 353, and a haptic module 354. The display unit 351 may form a mutual layer structure with a touch sensor or be formed integrally with it to realize a touch screen. Such a touch screen may function as a user input unit 323 that provides an input interface between the terminal 300 and the user, and may also provide an output interface between the terminal 300 and the user.

The display unit 351 displays (outputs) information processed by the terminal 300. For example, when the terminal is in a call mode, a UI (User Interface) or GUI (Graphic User Interface) associated with the call is displayed. When the terminal 300 is in a video call mode or a photographing mode, the captured and/or received image, or the corresponding UI or GUI, is displayed.

The display unit 351 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be transparent or light-transmissive so that the outside can be seen through them. Such a display can be referred to as a transparent display, and a typical example of the transparent display is a TOLED (Transparent OLED). The rear structure of the display unit 351 may also be of a light-transmissive type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 351 of the terminal body.

There may be two or more display units 351 according to the embodiment of the terminal 300. For example, in the terminal 300, a plurality of display portions may be spaced apart from one another or may be disposed integrally with each other, or may be disposed on different surfaces.

When the display unit 351 and a sensor for sensing a touch operation (hereinafter referred to as a "touch sensor") 344 form a mutual layer structure (hereinafter referred to as a "touch screen"), the display unit 351 can be used as an input device in addition to an output device. The touch sensor 344 may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor 344 may be configured to convert a change in pressure applied to a specific part of the display unit 351, or a change in capacitance generated at a specific part of the display unit 351, into an electrical input signal. The touch sensor 344 can be configured to detect not only the touched position and area but also the pressure and the capacitance at the time of the touch.

If there is a touch input to the touch sensor 344, the corresponding signal(s) is sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 380. In this way, the controller 380 can know which area of the display unit 351 has been touched.

In addition, the controller 380 can determine the type of the user's touch input based on the contact area, the pressure, and the capacitance at the time of the touch. Accordingly, the controller 380 can distinguish between, for example, a finger touch, a nail touch, and a multi-touch using a plurality of fingers.
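As a hedged illustration of the classification described above, the sketch below derives a touch type from the contact area, pressure, capacitance, and contact count. The TouchSample fields and the numeric thresholds are assumptions chosen only to make the example concrete.

```kotlin
// Illustrative sketch; fields and thresholds are assumptions, not values from the patent.
data class TouchSample(
    val contactArea: Float,   // normalized contact area at the time of touch
    val pressure: Float,      // normalized pressure
    val capacitance: Float,   // normalized capacitance change
    val contactCount: Int     // number of simultaneous contacts
)

enum class TouchType { FINGER, NAIL, MULTI_TOUCH, UNKNOWN }

fun classifyTouch(s: TouchSample): TouchType = when {
    s.contactCount > 1                           -> TouchType.MULTI_TOUCH
    s.contactArea > 0.5f && s.capacitance > 0.3f -> TouchType.FINGER // broad, conductive contact
    s.contactArea <= 0.5f && s.pressure > 0.2f   -> TouchType.NAIL   // small, harder contact
    else                                         -> TouchType.UNKNOWN
}
```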

A proximity sensor 341 may be disposed in an inner region of the terminal or the proximity of the touch screen. The proximity sensor 341 refers to a sensor that detects the presence of an object approaching a predetermined detection surface or an object existing in proximity to the proximity sensor 341 without mechanical contact using the force of an electromagnetic field or infrared rays. The proximity sensor 341 has a longer life span than the contact-type sensor and its utilization is also high.

Examples of the proximity sensor 341 include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. If the touch screen is capacitive, the proximity of the pointer is detected by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) 344 may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned over the touch screen without contacting it is referred to as a "proximity touch," and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch." The position at which the pointer makes a proximity touch on the touch screen means the position at which the pointer vertically corresponds to the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The sound output module 352 may output audio data received from the wireless communication unit 310 or stored in the memory 360 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 352 also outputs sound signals related to functions performed in the terminal 300 (e.g., a call signal reception tone, a message reception tone, etc.). The sound output module 352 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 353 outputs a signal for notifying the occurrence of an event of the terminal 300. Examples of events that occur in a terminal include receiving a call signal, receiving a message, inputting a key signal, and touch input. The alarm unit 353 may output a signal for informing occurrence of an event in a form other than a video signal or an audio signal, for example, vibration. The video signal or the audio signal may be output through the display unit 351 or the audio output module 352 so that they may be classified as a part of the alarm unit 353.

The haptic module 354 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 354 is vibration. The intensity and pattern of the vibration generated by the haptic module 354 are controllable. For example, different vibrations may be synthesized and output or sequentially output.

In addition to vibration, the haptic module 354 can generate various tactile effects, such as the effect of a pin arrangement moving vertically against the skin surface being contacted, the spraying or suction force of air through an injection or suction port, grazing of the skin surface, contact with an electrode, and the effect of reproducing a sensation of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 354 can be implemented not only to transmit the tactile effect through direct contact but also to allow the user to feel the tactile effect through the muscles of the fingers and arms. The haptic module 354 may include two or more haptic modules according to the configuration of the terminal 300.

In addition, the haptic module 354 may include a vibration element capable of generating vibration. For example, the haptic module 354 may include one or more vibration motors, and the vibration motors may have various forms such as a bar type, a coin type, and the like.

The haptic module 354 may be provided at various positions according to the shape of the terminal 300.

In addition, the memory 360 stores data supporting various functions of the terminal 300. The memory 360 may store a number of application programs (applications) that are driven on the terminal 300, and data and commands for the operation of the terminal 300. At least some of these application programs may be downloaded from an external server through wireless communication. Also, at least some of these application programs may exist on the terminal 300 from the time of shipment for the basic functions of the terminal 300 (for example, receiving and placing calls, receiving and sending messages). The application programs may be stored in the memory 360, installed on the terminal 300, and driven by the controller 380 to perform operations (or functions) of the terminal.

In addition, the memory 360 stores payment information for various payment means. For example, the memory 360 may store payment card information for payment such as a credit card or a check card, point card information for accumulating points or mileage, discount card information for discount, and the like.

The memory 360 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic disk, and an optical disk. The terminal 300 may operate in association with a web storage that performs the storage function of the memory 360 on the Internet.

The interface unit 370 serves as a path to the external devices connected to the terminal 300. The interface unit 370 receives data from an external device, receives power and delivers it to each component in the terminal 300, or transmits data in the terminal 300 to an external device. For example, the interface unit 370 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.

The identification module is a chip that stores various kinds of information for authenticating the usage rights of the terminal 300, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device provided with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 300 through the port.

When the terminal 300 is connected to an external cradle, the interface unit 370 may be a path through which power from the cradle is supplied to the terminal 300, or a path through which various command signals input from the cradle by the user are transmitted to the terminal. The various command signals or the power input from the cradle may serve as a signal for recognizing that the terminal is correctly mounted on the cradle.

The controller 380 typically controls the overall operation of the terminal, and performs, for example, control and processing related to voice calls, data communication, and video calls. The controller 380 may include a multimedia module 381 for multimedia playback. The multimedia module 381 may be implemented within the controller 380 or separately from the controller 380.

The control unit 380 can perform pattern recognition processing for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

In addition, the control unit 380 can transmit data on the screen displayed on the display unit 351 to another device. For example, the control unit 380 may transmit mirroring data for the screen displayed on the display unit 351 to the video display device 100 through the wireless communication unit 310. The control unit 380 may also receive an operation signal for the mirroring screen from the image display device 100, display a screen corresponding to the received operation signal on the display unit 351, and transmit mirroring data corresponding to the received operation signal to the video display device 100.
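The terminal-side behavior described above can be pictured with the short Kotlin sketch below: the terminal pushes mirroring data for its current screen, applies any operation signal received from the image display device 100 as if it were local input, and then pushes the updated screen. The OperationSignal type and the callbacks are assumed names, not APIs defined in this document.

```kotlin
// Illustrative sketch of the terminal-side mirroring loop; all names are assumptions.
data class OperationSignal(val x: Int, val y: Int, val action: String)

class MirroringSource(private val sendToDisplay: (ByteArray) -> Unit) {
    private var currentScreen: ByteArray = ByteArray(0)

    // Transmit mirroring data for the screen currently shown on the display unit 351.
    fun pushCurrentScreen(screen: ByteArray) {
        currentScreen = screen
        sendToDisplay(currentScreen)
    }

    // Handle an operation signal received from the image display device 100:
    // apply it as if it were local input, then push the updated screen.
    fun onOperationSignal(signal: OperationSignal, applyInput: (OperationSignal) -> ByteArray) {
        val updated = applyInput(signal)
        pushCurrentScreen(updated)
    }
}
```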

The power supply unit 390 receives external power and internal power under the control of the controller 380 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing functions. In some cases, such embodiments may be implemented by the controller 380.

According to a software implementation, embodiments such as procedures or functions may be implemented together with separate software modules that perform at least one function or operation. The software code may be implemented by a software application written in a suitable programming language. The software code may be stored in the memory 360 and executed by the controller 380.

The terminal 300 may be a portable terminal or a stationary terminal. Accordingly, the terminal 300 may be carried by the user directly, or may be mounted in a predetermined area.

An operation method of the image display apparatus according to the present invention will be described with reference to FIG. 5.

FIG. 5 is a ladder diagram illustrating a method of operating an image display device according to an embodiment of the present invention.

Referring to FIG. 5, the controller 170 of the video display device 100 generates a search signal for searching for the terminal 300 (S101), and transmits the generated search signal to the terminal 300 (S103).

The control unit 170 may generate a search signal for connecting with one or more terminals 300 directly or via a network. The control unit 170 may transmit the generated search signal to the terminal 300 through the communication unit 160. Here, the video display device 100 can transmit the generated search signal to the terminal 300 through various communication methods.

The terminal 300 receives the transmitted search signal, generates a response signal for the received search signal (S105), and transmits the generated response signal to the image display device 100 (S107).

The control unit 380 of the terminal 300 may receive the search signal through the wireless communication unit 310 and generate a response signal for the received search signal. Here, the response signal may mean a signal for connecting the terminal 300 and the video display device 100 directly or through a network. The control unit 380 can transmit the generated response signal to the video display device 100 through the wireless communication unit 310.

The image display apparatus 100 receives the transmitted response signal and pairs it with the terminal 300 corresponding to the received response signal (S109). The terminal 300 generates the first mirroring data for the screen displayed on the display unit 351 (S111), and transmits the generated first mirroring data to the video display device 100 (S113). The image display apparatus 100 displays a mirroring screen on the display unit 180 based on the transmitted mirroring data (S115).

The control unit 170 of the video display device 100 can perform a pairing operation in which the terminal 300 corresponding to the response signal is connected directly or via a network, based on the received response signal. Accordingly, the image display device 100 can receive the mirroring data for the mirroring screen from the paired terminal 300.

The terminal 300 may transmit the mirroring data for the mirroring screen to the paired image display device 100. Specifically, the terminal 300 may generate first mirroring data for the screen displayed on the display unit 351 and transmit the generated first mirroring data to the image display apparatus 100. Here, the mirroring screen may include a video, an image, and an application screen displayed on the display unit 351 of the terminal 300. Also, the terminal 300 can transmit the mirroring data to the image display device 100 even if the display unit 351 is turned off. The video display device 100 can display the mirroring screen on the display unit 180 based on the mirroring data transmitted from the paired terminal 300, and can output audio corresponding to the displayed mirroring screen to the audio output unit 185. Accordingly, the video display device 100 can display on the display unit 180 a mirroring screen that is the same as the screen displayed on the display unit 351 of the terminal 300. The video display device 100 may also transmit to the terminal 300 a signal for the user input to the mirroring screen obtained through the remote control device 200.
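As a rough illustration of the S101 to S115 sequence above, the following Kotlin sketch models the discovery, pairing, and mirroring exchange as simple messages. The Message types and the send callbacks are illustrative assumptions rather than the actual signaling defined by the patent or by any particular mirroring standard.

```kotlin
// Illustrative sketch of the ladder diagram; message formats are assumptions.
sealed interface Message
object SearchSignal : Message                               // S101 / S103
object ResponseSignal : Message                             // S105 / S107
data class MirroringFrame(val pixels: ByteArray) : Message  // S111 / S113

class DisplayDevice {
    var paired = false
        private set

    fun discover(send: (Message) -> Unit) = send(SearchSignal)   // S101, S103

    fun onMessage(msg: Message) {
        when (msg) {
            is ResponseSignal -> paired = true                   // S109: pairing
            is MirroringFrame -> if (paired) render(msg)         // S115: display the mirroring screen
            else -> Unit
        }
    }

    private fun render(frame: MirroringFrame) =
        println("Rendering mirroring frame of ${frame.pixels.size} bytes")
}

class Terminal(private val send: (Message) -> Unit) {
    fun onMessage(msg: Message) {
        when (msg) {
            is SearchSignal -> send(ResponseSignal)              // S105, S107: respond to the search
            else -> Unit
        }
    }

    fun mirrorScreen(pixels: ByteArray) = send(MirroringFrame(pixels))  // S111, S113
}
```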

The image display apparatus 100 acquires a user input for the mirroring screen displayed on the display unit 180 (S117), and performs an operation corresponding to the acquired user input (S119).

The control unit 170 may acquire various user inputs for the mirroring screen displayed on the display unit 180. For example, the control unit 170 can display the cursor 490 on the mirroring screen displayed on the display unit 180 and acquire a user input for the displayed cursor 490 through the remote control device 200. The control unit 170 can distinguish, within the display unit 180, the mirroring area where the mirroring screen is displayed from the black area where it is not displayed. The control unit 170 may move the cursor 490 only within the mirroring area where the mirroring screen is displayed.

This will be described with reference to FIG. 6.

FIG. 6 is an exemplary view of a cursor movable area according to an embodiment of the present invention.

Referring to FIG. 6, the control unit 170 may display the image 411 corresponding to the first mirroring data transmitted from the terminal 300 on the display unit 180. The control unit 170 may display in black the area of the screen of the display unit 180 other than the mirroring area in which the image 411 is displayed. The area to the left of the mirroring area in which the image 411 is displayed may be referred to as a first black area 501, and the area to the right of the mirroring area may be referred to as a second black area 502. The control unit 170 may distinguish the mirroring area where the mirroring screen is displayed from the black areas where it is not displayed based on the received first mirroring data. Alternatively, the control unit 170 may recognize the black areas displayed on the display unit 180 and thereby distinguish the mirroring area from the black areas. On the screen of the display unit 180, the control unit 170 can recognize the boundary between the mirroring area in which the image 411 is displayed and the first black area 501 as a first boundary 511, and the boundary between the mirroring area and the second black area 502 as a second boundary 512. The control unit 170 may display the cursor 490 on the display unit 180 in response to a user input received through the remote control device 200, and may allow the cursor 490 to move only within the mirroring area. That is, the control unit 170 may move the cursor 490 only within the mirroring area in which the image 411 is displayed and prevent the cursor 490 from being displayed in the first black area 501 or the second black area 502, so that the cursor 490 moves only between the first boundary 511 and the second boundary 512. In addition, when the control unit 170 obtains a user input for moving the cursor 490 out of the mirroring area, it may provide feedback in which the cursor 490 appears to move into the black area and then returns into the mirroring area.
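
The boundary handling described for FIG. 6 can be summarized as a simple clamping rule. The Kotlin sketch below is a minimal illustration, assuming pixel coordinates and an illustrative MirroringArea type; it is not the patented implementation.

```kotlin
// A minimal sketch: the cursor is kept between the first boundary 511 (left)
// and the second boundary 512 (right), so it never enters the black areas
// 501 and 502. The type and coordinate convention are illustrative.
data class MirroringArea(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun clampCursor(x: Int, y: Int, area: MirroringArea): Pair<Int, Int> {
    val clampedX = x.coerceIn(area.left, area.right)   // never past boundary 511 or 512
    val clampedY = y.coerceIn(area.top, area.bottom)
    return clampedX to clampedY
}
```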

Meanwhile, the control unit 170 may display an image related to the mirroring screen in the black area of the display unit 180.

This will be described with reference to FIGS. 7 and 8.

FIG. 7 is an exemplary view of black area utilization according to an embodiment of the present invention.

Referring to FIG. 7, the control unit 170 may display the mirroring screen 420 corresponding to the mirroring data transmitted from the terminal 300 in a central area of the display unit 180. Here, the mirroring screen 420 may be a first home screen of the terminal 300. The control unit 170 may display the cursor 490 on the mirroring screen 420. The control unit 170 may display a first screen 422 related to the mirroring screen 420 in an area of the display unit 180 corresponding to the black area on the left side of the mirroring screen 420. Here, the first screen 422 may be a second home screen of the terminal 300. The control unit 170 may display a second screen 423 related to the mirroring screen 420 in an area of the display unit 180 corresponding to the black area on the right side of the mirroring screen 420. Here, the second screen 423 may be a third home screen of the terminal 300. The second home screen and the third home screen may be transmitted from the terminal 300 to the image display device 100, and each may be a screen stored in the image display device 100 or a screen transmitted in real time. The first screen 422 and the second screen 423 may be the previous or next screen of the mirroring screen 420.

FIG. 8 is an exemplary view of black area utilization according to another embodiment of the present invention.

Referring to FIG. 8, the control unit 170 may display the mirroring screen 420 corresponding to the mirroring data transmitted from the terminal 300 in a central area of the display unit 180. Here, the mirroring screen 420 may be a first home screen of the terminal 300. The control unit 170 may display the cursor 490 on the mirroring screen 420. The control unit 170 may display a third screen 425 and a fourth screen 426 in an area of the display unit 180 corresponding to the black area on the left side of the mirroring screen 420. Here, the third screen 425 and the fourth screen 426 may each be a screen related to the mirroring screen 420 or a screen related to the video display device 100. The control unit 170 may display a fifth screen 427 and a sixth screen 428 in an area of the display unit 180 corresponding to the black area on the right side of the mirroring screen 420. Here, the fifth screen 427 and the sixth screen 428 may each be a screen related to the mirroring screen 420 or a screen related to the video display device 100.
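
A possible way to derive the usable black columns on either side of a centered mirroring screen is sketched below; the function and parameter names are assumptions introduced only for illustration.

```kotlin
// Sketch: compute the left and right black columns from the screen width and
// the horizontal extent of the centered mirroring screen, so related screens
// (e.g. other home screens) can be laid out in them. Names are illustrative.
fun blackColumns(screenWidth: Int, mirrorLeft: Int, mirrorRight: Int): Pair<IntRange, IntRange> {
    val leftBlack = 0 until mirrorLeft                      // column like the first black area 501
    val rightBlack = (mirrorRight + 1) until screenWidth    // column like the second black area 502
    return leftBlack to rightBlack
}
```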

Meanwhile, the control unit 170 may display guide information related to the mirroring screen displayed on the display unit 180. When the control unit 170 obtains a user input corresponding to the displayed guide information, it can perform an operation corresponding to the user input.

In one embodiment, the control unit 170 may display guide information on screen rotation of the mirroring screen displayed on the display unit 180. When the control unit 170 obtains a user input for rotating the mirroring screen, it rotates the displayed mirroring screen and displays the rotated mirroring screen on the display unit 180.

This will be described with reference to FIGS. 9 and 10.

FIG. 9 is an exemplary view illustrating guide information about screen rotation of a mirroring screen according to an embodiment of the present invention.

FIG. 10 is an exemplary diagram illustrating screen rotation of a mirroring screen according to an embodiment of the present invention.

Referring to FIG. 9, the control unit 170 may display the image 411 corresponding to the mirroring data transmitted from the terminal 300 on the display unit 180. When the image 411 is displayed in portrait orientation, the control unit 170 may display, in the guide window 481, guide information on screen rotation for displaying the image 411 in landscape orientation. For example, when the cursor 490 is positioned in the upper area 530, which is a partial area of the displayed image 411, the control unit 170 may display the guide information on screen rotation of the mirroring screen in the guide window 481.

Referring to FIG. 10, the control unit 170 may obtain, as a user input for rotating the mirroring screen, a user input that clicks the upper area 530, which is a partial area of the image 411 corresponding to the mirroring data displayed on the display unit 180, with the cursor 490 and drags it clockwise. Here, the user input for rotating the mirroring screen may be an input in which the cursor 490 reaches the second boundary 512 or an input that crosses the second boundary 512. Accordingly, the control unit 170 can rotate the mirroring screen displayed on the display unit 180. Thus, the control unit 170 may display the image 411 corresponding to the transmitted mirroring data on the display unit 180 in the horizontal direction.
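
The rotation gesture of FIGS. 9 and 10 (a click in the upper area 530 followed by a clockwise drag until the cursor reaches or crosses the second boundary 512) could be detected roughly as follows; the Point type and parameters are illustrative assumptions, not the patented logic.

```kotlin
// Hedged sketch: a drag that begins in the upper area 530 of the portrait
// image and ends at or beyond the second boundary 512 is treated as a
// request to rotate to landscape.
data class Point(val x: Int, val y: Int)

fun isRotateGesture(dragStart: Point, dragEnd: Point, upperAreaBottomY: Int, secondBoundaryX: Int): Boolean {
    val startsInUpperArea = dragStart.y <= upperAreaBottomY   // click began in area 530
    val reachesSecondBoundary = dragEnd.x >= secondBoundaryX  // drag reached or crossed boundary 512
    return startsInUpperArea && reachesSecondBoundary
}
```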

On the other hand, the control unit 170 may display a menu related to the mirroring screen on the display unit 180.

This will be described with reference to FIG. 11.

FIG. 11 is a diagram illustrating an example of menu display for a mirroring screen according to an embodiment of the present invention.

Referring to FIG. 11, the control unit 170 may display the image 411 corresponding to the mirroring data transmitted from the terminal 300 on the display unit 180, and may display one or more menus for the mirroring screen on the mirroring screen. For example, when the cursor 490 is positioned in the upper area 530, which is a partial area of the displayed image 411, the control unit 170 may display a slide show menu 481 for displaying a slide show of a plurality of images, a full screen menu 482 for displaying the mirroring screen on the full screen, and a screen rotation menu 483 for rotating the mirroring screen. Accordingly, when the control unit 170 acquires a user input for selecting the slide show menu 481 with the cursor 490, it can display on the display unit 180 a slide show of a plurality of images corresponding to the mirroring data. When the control unit 170 acquires a user input for selecting the full screen menu 482 with the cursor 490, it can display the mirroring screen corresponding to the mirroring data on the entire screen of the display unit 180. When the control unit 170 acquires a user input for selecting the screen rotation menu 483 with the cursor 490, it can display the image 411, which is displayed in the vertical direction on the mirroring screen corresponding to the mirroring data, in the horizontal direction.
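
A simple dispatch over the three menus described above might look like the following sketch; the enum values and handler functions are hypothetical placeholders rather than an API defined by the disclosure.

```kotlin
// Illustrative dispatch for the slide show, full screen, and rotation menus.
enum class MirroringMenu { SLIDE_SHOW, FULL_SCREEN, ROTATE }

fun onMenuSelected(menu: MirroringMenu) = when (menu) {
    MirroringMenu.SLIDE_SHOW -> startSlideShow()        // show the received images in sequence
    MirroringMenu.FULL_SCREEN -> enterFullScreen()      // scale the mirroring screen to the whole display
    MirroringMenu.ROTATE -> rotateMirroringScreen()     // show the portrait image in landscape
}

fun startSlideShow() { /* ... */ }
fun enterFullScreen() { /* ... */ }
fun rotateMirroringScreen() { /* ... */ }
```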

On the other hand, the control unit 170 may display or hide the guide information or the function menu on the display unit 180 according to a user setting. When the guide information or the function menu has been displayed a predetermined number of times, the control unit 170 may display on the display unit 180 a setting menu for no longer displaying the guide information or the function menu.

The above description of displaying the mirroring screen and of operations on the mirroring screen is not limited to these examples, and may be variously set according to the selection of the user or the designer.

See FIG. 5 again.

Referring again to FIG. 5, the image display apparatus 100 acquires an operation input on the displayed mirroring screen (S121) and transmits an operation signal for the acquired operation input to the terminal 300 (S123). The terminal 300 generates second mirroring data corresponding to the transmitted operation signal (S125) and transmits the generated second mirroring data to the image display device 100 (S127). The image display apparatus 100 displays the mirroring screen on the display unit 180 based on the transmitted second mirroring data (S129).

The image display device 100 can acquire an operation input on the mirroring screen displayed on the display unit 180 through the remote control device 200. For example, the video display device 100 can acquire a swipe input for the mirroring screen through the remote control device 200.

This will be described with reference to FIGS. 12 to 14.

FIG. 12 is an exemplary view of a swipe recognition area according to an embodiment of the present invention.

Referring to FIG. 12, the control unit 170 may display the image 411 corresponding to the mirroring data transmitted from the terminal 300 on the display unit 180. The control unit 170 can recognize parts of the outer area of the image 411 corresponding to the mirroring screen as swipe recognition areas corresponding to a swipe input. For example, the control unit 170 can recognize a part of the left outer portion of the image 411 corresponding to the mirroring screen as a first swipe area 537 and a part of the right outer portion as a second swipe area 538. In one embodiment, when the cursor 490 is positioned in the first swipe area 537 for a predetermined time, the control unit 170 recognizes this as a leftward swipe input, and when the cursor 490 is positioned in the second swipe area 538 for a predetermined time, the control unit 170 recognizes this as a rightward swipe input. In another embodiment, the control unit 170 recognizes a user input in which the cursor 490 clicks the first swipe area 537 as a leftward swipe input and a user input in which the cursor 490 clicks the second swipe area 538 as a rightward swipe input. In yet another embodiment, the control unit 170 recognizes a user input in which the cursor 490 moves a certain distance from the first swipe area 537 as a leftward swipe input and a user input in which the cursor 490 moves a certain distance from the second swipe area 538 as a rightward swipe input. Meanwhile, the video display device 100 can transmit a signal for the above-described swipe input to the terminal 300. The terminal 300 may generate second mirroring data for the mirroring screen corresponding to the swipe input and transmit the generated second mirroring data to the image display device 100. Accordingly, the video display device 100 can display the mirroring screen corresponding to the input swipe on the display unit 180.
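
Two of the recognition variants described above (dwell time and click in the first or second swipe area) can be expressed compactly as in the sketch below; the dwell threshold and names are illustrative assumptions.

```kotlin
// Sketch of dwell-based and click-based recognition for the first swipe area
// 537 (left) and second swipe area 538 (right). Threshold is illustrative.
enum class SwipeDirection { LEFT, RIGHT, NONE }

fun swipeByDwell(inLeftArea: Boolean, inRightArea: Boolean, dwellMs: Long, minDwellMs: Long = 500L): SwipeDirection = when {
    inLeftArea && dwellMs >= minDwellMs -> SwipeDirection.LEFT
    inRightArea && dwellMs >= minDwellMs -> SwipeDirection.RIGHT
    else -> SwipeDirection.NONE
}

fun swipeByClick(clicked: Boolean, inLeftArea: Boolean, inRightArea: Boolean): SwipeDirection = when {
    clicked && inLeftArea -> SwipeDirection.LEFT
    clicked && inRightArea -> SwipeDirection.RIGHT
    else -> SwipeDirection.NONE
}
```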

FIGS. 13 and 14 are diagrams illustrating examples of a guide for a swipe input according to an embodiment of the present invention.

Referring to FIG. 13, the image 411 corresponding to the mirroring data transmitted from the terminal 300 may be displayed on the display unit 180. The control unit 170 can recognize parts of the outer area of the image 411 corresponding to the mirroring screen as swipe recognition areas corresponding to a swipe input. For example, the control unit 170 can recognize a part of the left outer portion of the image 411 as the first swipe area 537 and a part of the right outer portion as the second swipe area 538. The control unit 170 can recognize a user input that moves the cursor 490 to the first swipe area 537 as a leftward swipe input and a user input that moves the cursor 490 to the second swipe area 538 as a rightward swipe input. The control unit 170 may display a guide for the swipe input on the display unit 180 in response to the movement of the cursor 490 to the first swipe area 537 or the second swipe area 538. For example, as shown in FIG. 13, the control unit 170 may display a ball-shaped indicator 471 corresponding to the swipe input in response to the cursor 490 moving toward the second swipe area 538. The control unit 170 may display the ball-shaped indicator 471 in an increasingly distorted form corresponding to the degree to which the cursor 490 has moved into the second swipe area 538, and can recognize the swipe input when the ball-shaped indicator 471 reaches its maximum distortion.
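
The ball indicator feedback could be driven by a distortion value that grows as the cursor advances into the swipe area, as in this minimal sketch; the 0.0 to 1.0 scale and the area coordinates are assumptions for illustration.

```kotlin
// Sketch: distortion grows with how far the cursor has entered the swipe
// area; the swipe is committed at the maximum distortion.
fun indicatorDistortion(cursorX: Int, swipeAreaStartX: Int, swipeAreaEndX: Int): Float {
    require(swipeAreaEndX > swipeAreaStartX)
    if (cursorX <= swipeAreaStartX) return 0f
    val progress = (cursorX - swipeAreaStartX).toFloat() / (swipeAreaEndX - swipeAreaStartX)
    return progress.coerceIn(0f, 1f)   // 1.0 corresponds to the maximally distorted indicator 471
}

fun shouldRecognizeSwipe(distortion: Float): Boolean = distortion >= 1f
```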

On the other hand, when the moving speed of the cursor 490 is equal to or higher than a predetermined speed, the control unit 170 may recognize the user input corresponding to the movement of the cursor 490 as a swipe input, and when the moving speed of the cursor 490 is lower than the predetermined speed, the control unit 170 may recognize it as an input for reducing or enlarging the mirroring screen.
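
The speed threshold can be read as a simple classifier, sketched below with an assumed threshold value.

```kotlin
// Sketch: fast cursor movement is treated as a swipe, slower movement as a
// reduce/enlarge (zoom) gesture. The threshold value is illustrative.
enum class CursorGesture { SWIPE, ZOOM }

fun classifyBySpeed(pixelsPerSecond: Float, swipeSpeedThreshold: Float = 800f): CursorGesture =
    if (pixelsPerSecond >= swipeSpeedThreshold) CursorGesture.SWIPE else CursorGesture.ZOOM
```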

The control unit 170 may recognize a user input to the upper bar area or the lower bar area of the terminal 300 on the mirroring screen in response to a user input through the remote control device 200. Here, the upper bar area may mean an area in which the upper bar is displayed on the display unit 351 of the terminal 300, and the lower bar area may mean an area in which the lower bar is displayed on the display unit 351 of the terminal 300.

This will be described with reference to FIG. 15.

FIG. 15 illustrates an upper bar area and a lower bar area in a mirroring screen according to an embodiment of the present invention.

Referring to FIG. 15, the control unit 170 may display the home screen corresponding to the mirroring data transmitted from the terminal 300 on the display unit 180. On the mirroring screen displayed on the display unit 180, the control unit 170 can recognize a part of the upper portion of the mirroring screen as an upper bar area 531 and a part of the lower portion of the mirroring screen as a lower bar area 532. In one embodiment, when the cursor 490 is positioned in the upper bar area 531 for a predetermined time, the control unit 170 recognizes this as a swipe input to the upper bar, and when the cursor 490 is positioned in the lower bar area 532 for a predetermined time, the control unit 170 recognizes this as a swipe input to the lower bar. In another embodiment, the control unit 170 recognizes a user input in which the cursor 490 clicks the upper bar area 531 as a swipe input to the upper bar and a user input in which the cursor 490 clicks the lower bar area 532 as a swipe input to the lower bar. In yet another embodiment, the control unit 170 recognizes a user input in which the cursor 490 moves a certain distance from the upper bar area 531 as a swipe input to the upper bar and a user input in which the cursor 490 moves a certain distance from the lower bar area 532 as a swipe input to the lower bar. Meanwhile, the video display device 100 can transmit a signal for the swipe input to the upper bar or the lower bar to the terminal 300. The terminal 300 may generate second mirroring data for the mirroring screen corresponding to the swipe input to the upper bar or the lower bar and transmit the generated second mirroring data to the image display device 100. Accordingly, the image display device 100 can display the mirroring screen corresponding to the input swipe to the upper bar or the lower bar on the display unit 180.
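
The upper bar area 531 and lower bar area 532 can be modeled as thin strips at the top and bottom of the mirroring screen, as in the following sketch; the strip height is an illustrative assumption.

```kotlin
// Sketch: a strip along the top of the mirroring screen maps to the
// terminal's upper bar, a strip along the bottom to its lower bar.
enum class BarTarget { UPPER_BAR, LOWER_BAR, NONE }

fun barTargetAt(cursorY: Int, mirrorTop: Int, mirrorBottom: Int, stripHeight: Int = 40): BarTarget = when {
    cursorY in mirrorTop..(mirrorTop + stripHeight) -> BarTarget.UPPER_BAR        // area 531
    cursorY in (mirrorBottom - stripHeight)..mirrorBottom -> BarTarget.LOWER_BAR  // area 532
    else -> BarTarget.NONE
}
```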

Meanwhile, for a user input to the upper bar or the lower bar on the mirroring screen, the control unit 170 may enlarge the area corresponding to the upper bar or the lower bar and display the enlarged area on the display unit 180.

This will be described with reference to FIG. 16.

FIG. 16 is an exemplary view showing an enlarged upper bar area in a mirroring screen according to an embodiment of the present invention.

Referring to FIG. 16, the control unit 170 may display the home screen corresponding to the mirroring data transmitted from the terminal 300 on the display unit 180. On the mirroring screen displayed on the display unit 180, the control unit 170 can recognize a part of the upper portion of the mirroring screen as the upper bar area 531 and a part of the lower portion of the mirroring screen as the lower bar area 532. The control unit 170 may enlarge and display the upper bar area 531 or the lower bar area 532 when the cursor 490 is positioned in the upper bar area 531 or the lower bar area 532, respectively. For example, as shown in FIG. 16, when the cursor 490 is positioned in the upper bar area 531 on the mirroring screen displayed on the display unit 180, the control unit 170 can enlarge and display the upper bar area 531 of the mirroring screen.

Meanwhile, the control unit 170 may recognize the direction key input for one direction of the remote control device 200 as a swipe input.

When the control unit 170 acquires a user input for dragging a part of the mirroring screen displayed on the display unit 180 in one direction, the control unit 170 may store an image or a video corresponding to the mirroring screen in the image display device 100.

This will be described with reference to FIG. 17.

FIG. 17 is an exemplary view illustrating image storage corresponding to a mirroring screen according to an embodiment of the present invention.

Referring to FIG. 17, the control unit 170 may display a first image 415, a second image 416, and a third image 417, which are a plurality of images corresponding to the mirroring data transmitted from the terminal 300, on the display unit 180. When the control unit 170 acquires a user input in which the cursor 490 drags a portion of the second image 416 in the upward direction, the control unit 170 may store the second image 416 in the storage unit 140.
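
The drag-up-to-save gesture of FIG. 17 could be detected with a simple vertical distance check, as sketched below; the threshold and the storage callback are assumptions for illustration.

```kotlin
// Sketch: dragging an image upward past a minimum distance stores it on the
// display device. Threshold and callback are illustrative.
fun isSaveGesture(dragStartY: Int, dragEndY: Int, minUpwardDistance: Int = 150): Boolean =
    (dragStartY - dragEndY) >= minUpwardDistance   // y decreases when the cursor moves up

fun onImageDragged(imageId: Int, dragStartY: Int, dragEndY: Int, storeImage: (Int) -> Unit) {
    if (isSaveGesture(dragStartY, dragEndY)) storeImage(imageId)   // e.g. persist to the storage unit 140
}
```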

When the terminal 300 operates in a sleep state, the control unit 170 may display the mirroring screen corresponding to the mirroring data transmitted from the terminal 300 on the display unit 180 and may obtain an input to the cursor 490 as a user input for waking the terminal 300 from the sleep state.

This will be described with reference to FIG. 18.

FIG. 18 is an exemplary view of a user input to a cursor according to an embodiment of the present invention.

Referring to FIG. 18, the control unit 170 may display the image 430 corresponding to the mirroring data transmitted from the terminal 300 on the display unit 180. When the terminal 300 transmitting the mirroring data operates in the sleep state, the control unit 170 may obtain an input to the cursor 490 through the remote control device 200 as a user input for waking the terminal 300 from the sleep state. Accordingly, the control unit 170 may transmit a signal for the user input for waking the terminal 300 from the sleep state to the terminal 300 through the communication unit 160. In response to the signal for waking from the sleep state, the control unit 380 of the terminal 300 may display on the display unit 351 a control screen including a progress bar for the video being output. The control unit 380 may generate mirroring data for the control screen and transmit the generated mirroring data to the video display device 100 through the wireless communication unit 310. Accordingly, as shown in FIG. 18, the image display device 100 can display on the mirroring screen a control screen including a progress bar for the video being output.
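
The wake-up exchange of FIG. 18 is sketched below in a highly simplified form, with hypothetical types standing in for the sleeping terminal and the forwarded cursor input; it is not the actual signaling used by the devices.

```kotlin
// Sketch: the display device forwards a wake-up request for the sleeping
// terminal, and the terminal responds by mirroring a control screen with a
// progress bar. Types and strings are illustrative.
class SleepingTerminal {
    var isAsleep: Boolean = true
        private set

    // Returns an identifier for the screen the terminal mirrors after waking.
    fun onWakeUpRequest(): String {
        isAsleep = false
        return "control-screen-with-progress-bar"
    }
}

fun forwardCursorInputWhileAsleep(terminal: SleepingTerminal): String? =
    if (terminal.isAsleep) terminal.onWakeUpRequest() else null
```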

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and the like, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). In addition, the computer may include the control unit 170 of the video display device. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

Claims (12)

A method of operating a video display device for displaying a mirroring screen transmitted from a terminal,
Displaying the mirroring screen on the basis of mirroring data for the mirroring screen transmitted from the terminal;
Displaying a pointer on the displayed mirroring screen;
Moving the pointer within an area corresponding to the mirroring screen based on a user input to the displayed pointer; And
And performing an operation on the mirroring screen based on a user input to the pointer
A method of operating a video display device.
The method according to claim 1,
The step of moving the pointer within an area corresponding to the mirroring screen
Dividing a screen of the image display device into an area in which the mirroring screen is displayed and an area in which the mirroring screen is not displayed;
And moving the pointer within an area for displaying the mirroring screen
A method of operating a video display device.
The method according to claim 1,
The step of performing an operation on the mirroring screen
Acquiring a user input for dragging an upper portion of the mirroring screen clockwise on the mirroring screen displayed in the vertical direction on the image display device;
And displaying the mirroring screen displayed in the vertical direction in a horizontal direction
A method of operating a video display device.
The method according to claim 1,
The step of performing an operation on the mirroring screen
Recognizing a user input for a partial area of the mirroring screen using the pointer as a swipe input;
And transmitting a signal for the recognized swipe input to the terminal
A method of operating a video display device.
The method according to claim 1,
The step of performing an operation on the mirroring screen
Recognizing a user input for a partial area of the mirroring screen using the pointer as an input corresponding to an upper bar or a lower bar of the screen of the terminal;
And transmitting to the terminal a signal for an input corresponding to the recognized top bar or bottom bar
A method of operating a video display device.
The method according to claim 1,
The step of moving the pointer within an area corresponding to the mirroring screen
Displaying an object corresponding to the pointer when the pointer is located at a certain distance from an area where the mirroring screen is not displayed in an area corresponding to the mirroring screen;
And deforming and displaying the object as the pointer approaches the area where the mirroring screen is not displayed
A method of operating a video display device.
An image display device for displaying a mirroring screen transmitted from a terminal, comprising:
A communication unit for receiving mirroring data on a mirroring screen transmitted from the terminal;
A display unit for displaying a mirroring screen corresponding to the received mirroring data; And
A control unit configured to display a pointer on the mirroring screen, move the displayed pointer within an area corresponding to the mirroring screen based on a user input to the displayed pointer, and perform an operation on the mirroring screen based on a user input to the pointer
Video display device.
The image display device according to claim 7,
The control unit
Distinguishes, on the screen of the display unit, an area in which the mirroring screen is displayed from an area in which the mirroring screen is not displayed, and
Moves the pointer within the area in which the mirroring screen is displayed
Video display device.
The image display device according to claim 7,
The control unit
When a user input for dragging an upper portion of the mirroring screen clockwise is acquired in the mirroring screen displayed in the vertical direction on the screen of the display unit, displays the mirroring screen displayed in the vertical direction in the horizontal direction
Video display device.
The image display device according to claim 7,
The control unit
Recognizes a user input for a partial area of the mirroring screen using the pointer as a swipe input and transmits a signal for the recognized swipe input to the terminal through the communication unit
Video display device.
The image display device according to claim 7,
The control unit
Recognizes a user input for a partial area of the mirroring screen using the pointer as an input corresponding to an upper bar or a lower bar of the screen of the terminal and transmits a signal for the input corresponding to the recognized upper bar or lower bar to the terminal
Video display device.
The image display device according to claim 7,
The control unit
Displays an object corresponding to the pointer on the display unit when the displayed pointer is located within a certain distance of the area in which the mirroring screen is not displayed in the area corresponding to the mirroring screen, and deforms and displays the object as the pointer approaches the area in which the mirroring screen is not displayed
Video display device.
KR1020150090035A 2015-06-24 2015-06-24 Video display device and operating method thereof KR20170000714A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150090035A KR20170000714A (en) 2015-06-24 2015-06-24 Video display device and operating method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150090035A KR20170000714A (en) 2015-06-24 2015-06-24 Video display device and operating method thereof

Publications (1)

Publication Number Publication Date
KR20170000714A true KR20170000714A (en) 2017-01-03

Family

ID=57797320

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150090035A KR20170000714A (en) 2015-06-24 2015-06-24 Video display device and operating method thereof

Country Status (1)

Country Link
KR (1) KR20170000714A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190035414A (en) * 2017-09-26 2019-04-03 엘지전자 주식회사 Wireless device and operating method thereof
WO2020197012A1 (en) * 2019-03-25 2020-10-01 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US10949160B2 (en) 2019-03-25 2021-03-16 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US11341882B2 (en) 2019-05-28 2022-05-24 Samsung Electronics Co., Ltd. Electronic device, method, and computer readable medium for providing screen sharing service through external electronic device
CN113362782A (en) * 2020-03-03 2021-09-07 瑞昱半导体股份有限公司 Display device and image display method

Similar Documents

Publication Publication Date Title
CN108632444B (en) Mobile terminal and control method thereof
CN105404412B (en) Portable terminal and control method thereof
CN106506935B (en) Mobile terminal and control method thereof
CN106067834B (en) Wearable device and control method thereof
US10001910B2 (en) Mobile terminal and controlling method thereof for creating shortcut of executing application
KR102188267B1 (en) Mobile terminal and method for controlling the same
US10628014B2 (en) Mobile terminal and control method therefor
KR102216246B1 (en) Mobile terminal and method for controlling the same
EP2637086B1 (en) Mobile terminal
US10567567B2 (en) Electronic device and method for controlling of the same
KR101781909B1 (en) Mobile terminal and method for controlling the same
US20170277499A1 (en) Method for providing remark information related to image, and terminal therefor
US20200371668A1 (en) Application icon display method, terminal and computer readable storage medium
KR20170062327A (en) Rollable mobile terminal and control method thereof
KR20170088691A (en) Mobile terminal for one-hand operation mode of controlling paired device, notification and application
CN108415641B (en) Icon processing method and mobile terminal
US20170075517A1 (en) Mobile terminal and method for controlling the same
US20150207920A1 (en) Mobile terminal and method of controlling the mobile terminal
CN105468261B (en) Mobile terminal and control method thereof
KR20170035153A (en) Image display apparatus and operating method for the same
US20140282204A1 (en) Key input method and apparatus using random number in virtual keyboard
US20130106915A1 (en) Mobile terminal and method of controlling mobile terminal
US11592970B2 (en) Mobile terminal for displaying notification user interface (UI) and control method thereof
EP2947556A1 (en) Method and apparatus for processing input using display
KR20170035678A (en) Mobile terminal and method of controlling the same