KR102039486B1 - Image display apparatus, and method for operating the same - Google Patents


Info

Publication number
KR102039486B1
Authority
KR
South Korea
Prior art keywords
screen
displayed
dynamic
user
input
Prior art date
Application number
KR1020130011284A
Other languages
Korean (ko)
Other versions
KR20140098517A (en)
Inventor
강형석
유영곤
노성재
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020130011284A
Priority to US13/922,707
Publication of KR20140098517A
Application granted
Publication of KR102039486B1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47: End-user applications
    • H04N21/472: End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205: End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223: Cameras
    • H04N21/4227: Remote input by a user located remotely from the client device, e.g. at work
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312: Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/442: Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44204: Monitoring of content usage, e.g. the number of times a movie has been viewed, copied or the amount which has been watched
    • H04N21/44213: Monitoring of end-user related data
    • H04N21/44218: Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • H04N21/45: Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/458: Scheduling content for creating a personalised stream, e.g. by combining a locally stored advertisement with an incoming stream; Updating operations, e.g. for OS modules; time-related management operations

Abstract

The present invention relates to an image display apparatus and an operation method thereof. An operation method of an image display apparatus according to an exemplary embodiment of the present invention may include displaying a home screen including at least one card object containing a content list, displaying a dynamic screen in the home screen when there is a dynamic screen display input, and moving and displaying the dynamic screen within the home screen when there is a dynamic screen movement input. As a result, the user's ease of use can be improved.

Description

Image display apparatus, and method for operating the same

The present invention relates to an image display apparatus and an operation method thereof, and more particularly, to an image display apparatus and an operation method thereof that can improve user convenience.

The image display device is a device having a function of displaying an image that a user can watch. The user can watch broadcasts through the image display device. The image display device displays, on a display, the broadcast that the user selects from among the broadcast signals transmitted from broadcasting stations. Currently, broadcasting is shifting from analog to digital worldwide.

Digital broadcasting refers to broadcasting for transmitting digital video and audio signals. Digital broadcasting is more resistant to external noise than analog broadcasting, so it has less data loss, is advantageous for error correction, has a higher resolution, and provides a clearer picture. In addition, unlike analog broadcasting, digital broadcasting is capable of bidirectional services.

SUMMARY OF THE INVENTION An object of the present invention is to provide an image display apparatus and an operation method thereof which can improve user convenience.

According to an aspect of the present invention, there is provided a method of operating an image display apparatus, the method including displaying a home screen including at least one card object containing a content list, displaying a dynamic screen in the home screen when there is a dynamic screen display input, and moving and displaying the dynamic screen within the home screen when there is a dynamic screen movement input.

In addition, an image display device according to an embodiment of the present invention for achieving the above object includes a network interface unit for exchanging data with a server, a display for displaying a home screen having at least one card object containing a content list, and a control unit that controls the display to display a dynamic screen in the home screen when there is a dynamic screen display input and to move the dynamic screen within the home screen when there is a dynamic screen movement input.

According to an embodiment of the present invention, the image display apparatus displays the dynamic screen in the home screen according to the dynamic screen display input, and if there is a dynamic screen movement input, moves the dynamic screen in the home screen. As a result, the user can easily watch a desired video.

On the other hand, when there is a dynamic screen enlargement input, the dynamic screen can be enlarged and displayed in the home screen, whereby the visibility of the dynamic screen can be improved.

Meanwhile, an object including a screen movement item or a screen magnification item can be displayed for the displayed dynamic screen, whereby the user can easily move or enlarge the dynamic screen.

Meanwhile, the dynamic screen may be displayed over the card object having the lowest usage count or usage time, in an area other than where the pointer corresponding to the movement of the remote controller is displayed on the home screen, or in an area other than the user's gaze direction. Accordingly, by displaying the dynamic screen outside the card objects of high user interest, the user's convenience may be improved.
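The placement rule described above can be sketched as follows. This is an illustrative sketch only: the function name, the data layout of a card object, and the tie-breaking rule are assumptions for illustration, not taken from the patent.

```python
# Hypothetical sketch: place the dynamic screen over the least-used card
# object, while avoiding whichever card the remote-control pointer is on.

def choose_dynamic_screen_region(card_objects, pointer_pos=None):
    """card_objects: list of dicts with 'region' (x, y, w, h) and 'usage_count'.
    pointer_pos: optional (x, y) position of the pointer on the home screen."""
    def contains(region, point):
        x, y, w, h = region
        px, py = point
        return x <= px < x + w and y <= py < y + h

    # Prefer card objects the pointer is not over, then pick the least used.
    candidates = [c for c in card_objects
                  if pointer_pos is None or not contains(c["region"], pointer_pos)]
    if not candidates:  # pointer covers every card: fall back to all of them
        candidates = card_objects
    least_used = min(candidates, key=lambda c: c["usage_count"])
    return least_used["region"]

cards = [
    {"region": (0, 0, 400, 300), "usage_count": 12},
    {"region": (400, 0, 400, 300), "usage_count": 3},
    {"region": (800, 0, 400, 300), "usage_count": 7},
]
# The pointer sits over the least-used card, so the next-least-used is chosen.
print(choose_dynamic_screen_region(cards, pointer_pos=(500, 100)))
```

The same skeleton extends to the gaze-direction variant: replace `pointer_pos` with an estimated gaze point and the exclusion test is unchanged.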

FIG. 1A is a diagram illustrating an image display device according to an embodiment of the present invention.
FIG. 1B is a view illustrating a first surface of the remote control device of FIG. 1.
FIG. 1C is a view illustrating a second surface of the remote control device of FIG. 1.
FIG. 2 is an internal block diagram of the image display device of FIG. 1.
FIG. 3 is an internal block diagram of the controller of FIG. 2.
FIGS. 4 to 5 are diagrams illustrating various examples of a structure diagram of a smart system platform in the image display apparatus of FIG. 2.
FIG. 6A is a diagram illustrating an operation method using the first surface of the remote control device of FIG. 1B.
FIG. 6B is a diagram illustrating an operation method using the second surface of the remote control device of FIG. 1C.
FIG. 7 is an internal block diagram of the remote control device of FIG. 1.
FIG. 8 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention.
FIGS. 9A to 14C are views referred to for describing the operating method of FIG. 8.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffixes "module" and "unit" for components used in the following description are merely given in consideration of ease of preparation of the present specification, and do not impart any particular meaning or role by themselves. Therefore, the "module" and "unit" may be used interchangeably.

Meanwhile, the video display device described in the present specification is an intelligent video display device in which a computer support function is added to the broadcast reception function. While faithful to the broadcast reception function, it additionally has an Internet function, and thus can provide a more convenient interface such as a handwriting-type input device, a touch screen, or a 3D pointing device. In addition, being connected to the Internet and to computers through a wired or wireless Internet function, it can perform functions such as e-mail, web browsing, banking, or gaming. A standardized general-purpose operating system can be used for these various functions.

That is, in the image display apparatus described in the present invention, various applications can be freely added or deleted on the general-purpose OS kernel, and thus various user-friendly functions can be performed. For example, it may be a smart TV.

FIG. 1A is a diagram illustrating an image display device according to an embodiment of the present invention.

Referring to the drawings, the image display apparatus 100 according to an embodiment of the present invention is a device for displaying an image and includes a display (180 in FIG. 1B). The image display apparatus 100 may include a camera 195 for photographing a user.

In the drawing, the camera 195 is illustrated as being disposed above the image display apparatus 100, but various examples are possible. Meanwhile, unlike the drawing, the image display apparatus 100 and the camera may be separate devices.

The video display device 100 may exchange data with each external device through a network.

The image display apparatus 100 may exchange data with an adjacent external device, for example, the home appliance 670, the home server 650, the mobile terminal 600, or the like. And, predetermined content data can be shared. Here, the home appliance 670 may be a concept including a set top box, an audio device, a refrigerator, a cleaner, an air conditioner, a washing machine, a cooking appliance, and the like.

The image display apparatus 100 may exchange data with external servers 600a, 600b, 600c, ... through the network 690. The external servers 600a, 600b, 600c, ... may be content providers that provide various contents.

Meanwhile, unlike the drawing, the video display device 100 may exchange data with the mobile terminal 600 via the network 690.

The image display apparatus 100 may operate in response to a remote control signal from the remote control apparatus 200. To this end, the image display apparatus 100 and the remote control apparatus 200 may exchange data through a pairing process.

In particular, the image display apparatus 100 according to an embodiment of the present invention may, through such data exchange, display a pointer corresponding to the movement of the remote control apparatus 200, or display characters input via the character keys of the remote control apparatus 200.

Meanwhile, the image display apparatus 100 described in the present specification may include a TV receiver, a monitor, a projector, a notebook computer, a digital broadcasting terminal, and the like.

According to an embodiment of the present invention, the image display apparatus 100 may be a mobile image display apparatus as well as a fixed image display apparatus.

FIG. 1B is a view showing a first surface of the remote control device of FIG. 1, and FIG. 1C is a view showing a second surface of the remote control device of FIG. 1.

First, referring to FIG. 1B, operation keys such as a power key 202 may be disposed on the first surface (front surface) 201 of the remote control apparatus.

Referring to the various operation keys, the power key 202 is used to turn on / off the image display apparatus 100. The home key 204 is used for displaying a home screen when the home screen of the image display apparatus 100 is set. The search key 206 may be used to display a search box on the image display apparatus 100 or to perform a search when inputting a search word.

The four-way key 210 is used to move a pointer or cursor up, down, left, and right; the upper key 210c, the lower key 210d, the left key 210b, and the right key 210a may be integrally formed. Meanwhile, a wheel key 220 may be disposed at the center of the four-way key 210.

The wheel key 220 is used to move a screen or to move items displayed on the image display device 100. The wheel key 220 may be operated up and down, and accordingly the screen or the items of the image display apparatus 100 may be scrolled up and down.

The back key 222 is used to return to the previous screen or the previous item displayed on the image display apparatus 100. The menu key 224 is used to display the settings menu of the video display device 100. The pointer key 225 is used to display a pointer on the video display device 100.

The volume key 230 is used for adjusting the volume, and the channel key 240 is used for adjusting the channel.

The 3D key 235 may be used to convert a 2D image displayed on the image display apparatus 100 into a 3D image or to display a list of 3D images that can be displayed on the image display apparatus 100.

Next, the PIP key 241 is a key for displaying a plurality of images on the video display device 100. By the operation of the PIP key 241, a plurality of images may be displayed on the display 180 in the form of a picture in picture. Alternatively, a plurality of images may be arranged in parallel and displayed in parallel.

Meanwhile, any one of the plurality of images may be displayed floating, with its position variable. In this case, such a PIP image may be referred to as a dynamic screen image.
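A floating PIP whose position is variable must stay within the display bounds as it is moved. The following minimal sketch (names and dimensions are assumptions, not from the patent) shows one way a movement input could be applied and clamped:

```python
# Hypothetical sketch: apply a movement input to a floating PIP window,
# clamping its top-left corner so the window stays fully on screen.

def move_dynamic_screen(pos, delta, screen=(1920, 1080), pip=(480, 270)):
    """pos: current (x, y) top-left of the PIP; delta: movement input (dx, dy).
    screen / pip: display and PIP sizes in pixels (assumed values)."""
    x = max(0, min(pos[0] + delta[0], screen[0] - pip[0]))
    y = max(0, min(pos[1] + delta[1], screen[1] - pip[1]))
    return (x, y)

# A move toward the bottom-right corner is clamped at the screen edge.
print(move_dynamic_screen((1400, 900), (200, 200)))
```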

Meanwhile, in the drawings, a pointer key for displaying a pointer, a guide key for displaying a guide, a silent key, a color key, and the like are further illustrated.

Next, referring to FIG. 1C, the second surface (rear surface) 251 of the remote control apparatus 200 may be the opposite surface of the first surface (front surface) 201 of the remote control apparatus 200. On the second side (back side) 251 of the remote control apparatus 200, a text key 260 and a display 270 may be disposed.

The character key 260 may include numeric keys 262 and English keys 264. In addition, the character key 260 may further include various keys such as an enter key, a function key, and a space bar key.

The display 270 may display a character input by the character key 260.

On the other hand, when the character key 260 is input, the remote control device 200 transmits the character key information to the image display device 100.

The remote control apparatus 200 may transmit coordinate information corresponding to the movement of the remote control apparatus 200 to the image display apparatus 100. As a result, a pointer corresponding to the movement of the remote control apparatus 200 may be displayed on the display of the image display apparatus. As such, since the corresponding pointer is moved and displayed according to the movement in the 3D space, this may be referred to as a 3D pointing device.
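The mapping from the 3D pointing device's movement to on-screen pointer coordinates can be sketched as integrating angular rates into pointer deltas. This is an illustrative sketch only; the gain constant, axis convention, and function name are assumptions, not the patent's implementation:

```python
# Hypothetical sketch: convert remote-control angular rates (e.g. from a
# gyro sensor) into a new on-screen pointer position.

def update_pointer(pointer, gyro_rate, dt, gain=600.0, screen=(1920, 1080)):
    """pointer: current (x, y); gyro_rate: (yaw_rate, pitch_rate) in rad/s;
    dt: sampling interval in seconds; gain: assumed pixels per radian."""
    x = pointer[0] + gyro_rate[0] * gain * dt
    y = pointer[1] - gyro_rate[1] * gain * dt  # pitching up moves pointer up
    # Keep the pointer on screen.
    x = max(0.0, min(x, screen[0] - 1))
    y = max(0.0, min(y, screen[1] - 1))
    return (x, y)

p = (960.0, 540.0)                             # start at screen center
p = update_pointer(p, (0.5, 0.0), dt=0.02)     # a small yaw to the right
print(p)
```

In practice the image display apparatus would run such an update for each coordinate packet received from the remote control during pairing-based data exchange.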

FIG. 2 is an internal block diagram of the image display device of FIG. 1.

Referring to FIG. 2, the image display apparatus 100 according to an exemplary embodiment of the present invention may include a broadcast receiver 105, a network interface 130, an external device interface 135, a storage 140, a user input interface unit 150, a controller 170, a display 180, an audio output unit 185, a power supply unit 190, and a camera 195. Among these, the broadcast receiver 105 may include a tuner 110 and a demodulator 120. Meanwhile, the broadcast receiving unit 105 may further include the network interface unit 130.

The tuner 110 selects an RF broadcast signal corresponding to a channel selected by a user or all pre-stored channels among RF (Radio Frequency) broadcast signals received through an antenna. In addition, the selected RF broadcast signal is converted into an intermediate frequency signal or a baseband video or audio signal.

For example, if the selected RF broadcast signal is a digital broadcast signal, it may be converted into a digital IF signal (DIF); if it is an analog broadcast signal, it may be converted into an analog baseband video or audio signal (CVBS/SIF).

Meanwhile, the tuner 110 may sequentially select, from among the RF broadcast signals received through the antenna, the RF broadcast signals of all broadcast channels stored through a channel memory function, and convert them into intermediate frequency signals or baseband video or audio signals.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

The demodulator 120 may output a stream signal TS after performing demodulation and channel decoding. In this case, the stream signal may be a signal multiplexed with a video signal, an audio signal, or a data signal.

The stream signal output from the demodulator 120 may be input to the controller 170. After performing demultiplexing, image / audio signal processing, and the like, the controller 170 outputs an image to the display 180 and outputs audio to the audio output unit 185.
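The stream signal described here is a transport stream in which video, audio, and data signals are multiplexed; in digital broadcasting this is typically an MPEG-2 transport stream, where each 188-byte packet begins with a sync byte (0x47) and a 13-bit packet identifier (PID) telling the demultiplexer which elementary stream the packet belongs to. As a simplified illustration of the first demultiplexing step (not the patent's implementation), the 4-byte TS packet header can be parsed like this:

```python
# Illustrative sketch: parse the 4-byte MPEG-2 transport stream packet
# header to recover the PID used for demultiplexing.

def parse_ts_header(packet: bytes):
    """Return (pid, payload_unit_start) for one 188-byte TS packet."""
    if len(packet) < 4 or packet[0] != 0x47:      # 0x47 is the TS sync byte
        raise ValueError("not a TS packet")
    pusi = bool(packet[1] & 0x40)                 # payload unit start indicator
    pid = ((packet[1] & 0x1F) << 8) | packet[2]   # 13-bit packet identifier
    return pid, pusi

# A minimal synthetic packet: sync byte, PUSI set, PID 0x0100, zero payload.
pkt = bytes([0x47, 0x41, 0x00, 0x10]) + bytes(184)
print(parse_ts_header(pkt))   # (256, True)
```

A demultiplexer then routes packets by PID to the video decoder, audio decoder, or data processor.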

The external device interface unit 135 may connect the external device to the image display device 100. To this end, the external device interface unit 135 may include an A / V input / output unit (not shown).

The external device interface unit 135 may be connected by wire or wirelessly to an external device such as a DVD (Digital Versatile Disk) player, a Blu-ray player, a game device, a camera, a camcorder, a computer (laptop), or a set top box, and may perform input/output operations with the external device.

The A/V input/output unit may include a USB terminal, a CVBS (Composite Video Banking Sync) terminal, a component terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like, so that video and audio signals of an external device can be input to the video display device 100.

In addition, the external device interface unit 135 may be connected through at least one of the various set top boxes and the various terminals described above, and perform input / output operations with the set top box.

The network interface unit 130 provides an interface for connecting the image display apparatus 100 to a wired/wireless network including the Internet. For example, the network interface unit 130 may receive, through the network, content or data provided by an Internet provider, a content provider, or a network operator.

Meanwhile, the network interface unit 130 may access a predetermined web page through a connected network or another network linked to the connected network. That is, by accessing a predetermined web page through the network, it is possible to send or receive data with the server. In addition, content or data provided by a content provider or a network operator may be received.

In addition, the network interface unit 130 may select and receive a desired application from among applications that are open to the public through the network.

Meanwhile, the network interface unit 130 may include a wired communication unit (not shown) or a wireless communication unit (not shown).

The wireless communication unit may perform near field communication with other electronic devices. The image display device 100 may be networked with other electronic devices according to communication standards such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Digital Living Network Alliance (DLNA).

The storage 140 may store a program for the processing and control of each signal in the controller 170, or may store signal-processed video, audio, or data signals.

In addition, the storage 140 may perform a function for temporarily storing an image, audio, or data signal input from the external device interface 135 or the network interface 130. In addition, the storage 140 may store information on a predetermined broadcast channel through a channel storage function.

In addition, the storage 140 may store an application or a list of applications input from the external device interface 135 or the network interface 130.

The image display apparatus 100 may reproduce and provide a content file (video file, still image file, music file, document file, application file, etc.) stored in the storage 140 to a user.

FIG. 2 illustrates an embodiment in which the storage 140 is provided separately from the controller 170, but the scope of the present invention is not limited thereto. The storage 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the controller 170 or transmits a signal from the controller 170 to the user.

For example, the user input interface unit 150 may transmit to the controller 170 a user input signal, such as power on/off, channel selection, or screen setting, received from the remote controller 200; may transmit to the controller 170 a user input signal from a local key (not shown) such as a power key, a channel key, a volume key, or a setting key; may transmit to the controller 170 a user input signal from a sensor unit (not shown) for sensing a user's gesture; or may transmit a signal from the controller 170 to the sensor unit (not shown).

The controller 170 may demultiplex a stream input through the tuner 110, the demodulator 120, or the external device interface unit 135, or process the demultiplexed signals, to generate and output signals for video or audio output.

The image signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the image signal. In addition, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 135.

The audio signal processed by the controller 170 may be audio output to the audio output unit 185. In addition, the voice signal processed by the controller 170 may be input to the external output device through the external device interface unit 135.

Although not shown in FIG. 2, the controller 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 3.

In addition, the controller 170 may control the overall operation of the image display apparatus 100. For example, the controller 170 may control the tuner 110 to select an RF broadcast corresponding to a channel selected by the user or a pre-stored channel.

In addition, the controller 170 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150. In particular, the user may access the network to download the desired application or application list into the image display apparatus 100.

For example, the controller 170 controls the tuner 110 to input a signal of a selected channel according to a predetermined channel selection command received through the user input interface unit 150. Then, the video, audio, or data signal of the selected channel is processed. The controller 170 may output the channel information selected by the user together with the processed video or audio signal through the display 180 or the audio output unit 185.

As another example, the controller 170 may, according to an external device image playback command received through the user input interface unit 150, output a video or audio signal input from an external device, for example a camera or a camcorder, through the external device interface unit 135, on the display 180 or the audio output unit 185.

The controller 170 may control the display 180 to display an image. In this case, the image displayed on the display 180 may be a still image or a video, and may be a 2D image or a 3D image.

The controller 170 may generate and display a 3D object with respect to a predetermined 2D object in an image displayed on the display 180. For example, the object may be at least one of a connected web screen (newspaper, magazine, etc.), an EPG (Electronic Program Guide), various menus, widgets, icons, still images, videos, and text.

The controller 170 may recognize a location of a user based on an image photographed by a photographing unit (not shown). For example, the distance (z-axis coordinate) between the user and the image display apparatus 100 may be determined. In addition, the x-axis coordinates and the y-axis coordinates in the display 180 corresponding to the user position may be determined.
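One common way such a z-axis distance can be estimated from a single 2D camera image is the pinhole-camera relation between a face's known physical width and its apparent width in pixels. The sketch below is a hypothetical illustration with assumed constants (focal length in pixels, average face width), not the method claimed in the patent:

```python
# Hypothetical sketch: pinhole-model estimate of the user's z-axis distance
# from the apparent face width in a captured image.

def estimate_user_distance(face_px, focal_px=1000.0, face_m=0.16):
    """face_px: detected face width in pixels.
    focal_px: assumed camera focal length in pixels.
    face_m: assumed real-world face width in meters.
    Returns the estimated distance in meters (z = f * W / w)."""
    return focal_px * face_m / face_px

# A face 80 px wide under these assumed constants implies ~2 m away.
print(round(estimate_user_distance(80.0), 2))
```

The x- and y-axis coordinates mentioned above would come directly from the face's position in the image, scaled to display coordinates.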

Meanwhile, when entering an application view item, the controller 170 may control to display an application or a list of applications that can be downloaded from the image display apparatus 100 or from an external network.

The controller 170 may control an application downloaded from an external network to be installed and run, together with various user interfaces. In addition, according to a user's selection, the controller 170 may control an image related to the executed application to be displayed on the display 180.

The controller 170 may receive a user image captured by the camera 195. In addition, based on the captured user image, user recognition may be performed and the recognized user may be controlled to log in to the image display apparatus 100. The controller 170 may provide a service for each logged-in user.

Alternatively, the controller 170 may recognize the gesture of the user based on the user image captured by the camera 195. In particular, the controller 170 may recognize a face and a hand of the user from the captured image and recognize whether the gesture is a specific gesture.

The display 180 converts the image signal, data signal, or OSD signal processed by the controller 170, or the image signal, data signal, etc. received from the external device interface unit 135, into R, G, and B signals to generate a drive signal.

The display 180 may be a PDP, an LCD, an OLED, a flexible display, or a 3D display.

The display 180 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 185 receives a signal processed by the controller 170 and outputs the audio signal.

The power supply unit 190 supplies the corresponding power throughout the image display apparatus 100. In particular, power may be supplied to the controller 170, which may be implemented in the form of a System On Chip (SOC), to the display 180 for displaying an image, and to the audio output unit 185 for audio output.

To this end, the power supply unit 190 may include a converter (not shown) for converting AC power into DC power. Meanwhile, for example, when the display 180 is implemented as a liquid crystal panel having a plurality of backlight lamps, an inverter (not shown) capable of PWM operation may be further provided for variable-brightness driving or dimming.
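PWM dimming of such backlight lamps amounts to mapping a brightness setting to the duty cycle driven by the inverter. A hypothetical sketch of that mapping (the brightness scale, duty floor, and function name are assumptions):

```python
# Hypothetical sketch: map a 0-100 brightness setting to a PWM duty cycle
# for the backlight inverter, with a small floor so the lamps never fully
# extinguish and a ceiling of 100% duty.

def backlight_duty_cycle(brightness, min_duty=0.05):
    """brightness: user setting in [0, 100]. Returns duty cycle in [min_duty, 1.0]."""
    duty = brightness / 100.0
    return max(min_duty, min(duty, 1.0))

print(backlight_duty_cycle(40))   # 0.4
```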

The camera 195 may photograph the user and transmit the photographed image to the controller 170 in the image display apparatus 100. Although the camera 195 of FIG. 1A illustrates one camera, a plurality of cameras may be provided. The camera 195 may be a 2D camera or a 3D camera.

The remote control apparatus 200 transmits a user input to the user input interface unit 150. To this end, the remote control apparatus 200 may use RF (Radio Frequency) communication, infrared (IR) communication, Bluetooth, Ultra Wideband (UWB), ZigBee, and the like.

In addition, the remote control apparatus 200 may receive an image, audio, or data signal output from the user input interface unit 150, display it on the remote control apparatus 200, or output audio or vibration.

Meanwhile, the block diagram of the image display apparatus 100 shown in FIG. 2 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 that is actually implemented. That is, two or more components may be combined into one component as needed, or one component may be divided into two or more components. In addition, the function performed in each block is for explaining an embodiment of the present invention, and the specific operations or devices do not limit the scope of the present invention.

Meanwhile, unlike the configuration shown in FIG. 2, the image display apparatus 100 may omit the tuner 110 and the demodulator 120 shown in FIG. 2, and may instead receive and reproduce a broadcast image through the network interface unit 130 or the external device interface unit 135.

FIG. 3 is an internal block diagram of the controller of FIG. 2.

Referring to the drawing, the controller 170 according to an embodiment of the present invention may include a demultiplexer 310, an image processor 320, an OSD generator 340, a mixer 350, a frame rate converter 355, and a formatter 360. In addition, it may further include a voice processor (not shown) and a data processor (not shown).

The demultiplexer 310 demultiplexes an input stream. For example, when an MPEG-2 TS is input, it may be demultiplexed and separated into video, audio, and data signals, respectively. Here, the stream signal input to the demultiplexer 310 may be a stream signal output from the tuner 110, the demodulator 120, or the external device interface unit 135.
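The PID-based separation performed by the demultiplexer 310 can be illustrated with the following sketch. This is only a minimal illustration, not the claimed implementation: the PID values below are hypothetical (a real stream announces its PIDs in PAT/PMT tables), and only the 188-byte packet header fields needed for routing are parsed.

```python
# Minimal sketch of MPEG-2 TS demultiplexing by PID (hypothetical PID map).
VIDEO_PID, AUDIO_PID, DATA_PID = 0x100, 0x101, 0x102  # assumed; real PIDs come from PAT/PMT

def make_ts_packet(pid, payload=b""):
    """Build a toy 188-byte TS packet: sync byte 0x47, then the 13-bit PID."""
    header = bytes([0x47, (pid >> 8) & 0x1F, pid & 0xFF, 0x10])
    return (header + payload).ljust(188, b"\x00")

def demultiplex(ts_packets):
    """Separate TS packets into video, audio, and data streams by PID."""
    streams = {"video": [], "audio": [], "data": []}
    route = {VIDEO_PID: "video", AUDIO_PID: "audio", DATA_PID: "data"}
    for pkt in ts_packets:
        if pkt[0] != 0x47:                      # every TS packet starts with sync byte 0x47
            continue                            # skip packets that lost sync
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]   # PID = low 13 bits of bytes 1-2
        if pid in route:
            streams[route[pid]].append(pkt)
    return streams
```

In this sketch the three output lists play the role of the video, audio, and data signals handed on to the image processor, voice processor, and data processor, respectively.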

The image processor 320 may perform image processing of the demultiplexed image signal. To this end, the image processor 320 may include an image decoder 325 and a scaler 335.

The image decoder 325 decodes the demultiplexed video signal, and the scaler 335 performs scaling so that the decoded video signal can be output at the resolution of the display 180.

The video decoder 325 may include decoders of various standards.

On the other hand, the video signal decoded by the image processor 320 is input to the mixer 350.

The processor 330 may control overall operations in the image display apparatus 100 or the controller 170. For example, the processor 330 may control the tuner 110 to control tuning of an RF broadcast corresponding to a channel selected by a user or a previously stored channel.

In addition, the processor 330 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

In addition, the processor 330 may perform data transmission control with the network interface unit 135 or the external device interface unit 130.

In addition, the processor 330 may control operations of the demultiplexer 310, the image processor 320, the OSD generator 340, and the like in the controller 170.

The OSD generator 340 generates an OSD signal according to a user input or on its own. For example, based on a user input signal or a control signal, it may generate a signal for displaying various types of information on the screen of the display 180 as graphics or text. The generated OSD signal may include various data such as a user interface screen, various menu screens, widgets, and icons of the image display apparatus 100.

For example, the OSD generator 340 may generate a signal for displaying broadcast information based on subtitles or EPGs of a broadcast image.

Meanwhile, since the OSD generator 340 generates an OSD signal or a graphic signal, it may also be referred to as a graphics processing unit.

The mixer 350 may mix the OSD signal generated by the OSD generator 340 and the decoded image signal processed by the image processor 320. The mixed signal is provided to the formatter 360. Since the decoded broadcast video signal or the external input signal and the OSD signal are mixed, the OSD may be overlaid and displayed on the broadcast video or the external input video.

The frame rate converter (FRC) 355 may convert the frame rate of the input video. Meanwhile, the frame rate converter 355 may also output the input video as it is, without additional frame rate conversion.
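As a hedged illustration of the two behaviors just described, the sketch below raises a frame rate by simple frame repetition, or passes the input through unchanged when no conversion is needed. Real FRC hardware typically uses motion-compensated interpolation rather than plain repetition; this sketch only illustrates the pass-through versus convert distinction.

```python
def convert_frame_rate(frames, in_hz, out_hz):
    """Convert a frame sequence from in_hz to out_hz by frame repetition,
    or output it as-is when no conversion is needed (sketch only)."""
    if out_hz == in_hz:
        return list(frames)                 # pass-through: no additional conversion
    if out_hz % in_hz != 0:
        raise ValueError("this sketch handles integer multiples only")
    factor = out_hz // in_hz                # e.g. 120 // 60 == 2
    return [f for f in frames for _ in range(factor)]
```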

The formatter 360 receives the output signal of the frame rate converter 355, changes the format of the signal to suit the display 180, and outputs it. For example, it may output R, G, and B data signals, which may be output as low voltage differential signaling (LVDS) or mini-LVDS.

The formatter 360 may change the format of the 3D video signal or convert the 2D video into a 3D video.

The voice processing unit (not shown) in the controller 170 may perform voice processing of the demultiplexed voice signal. To this end, the voice processing unit (not shown) may include various decoders.

Also, the voice processing unit (not shown) in the controller 170 may process bass, treble, volume control, and the like.

The data processor (not shown) in the controller 170 may perform data processing of the demultiplexed data signal. For example, when the demultiplexed data signal is an encoded data signal, it may be decoded. The encoded data signal may be EPG (Electronic Program Guide) information including broadcast information such as a start time and an end time of a broadcast program broadcast on each channel.

Meanwhile, a block diagram of the controller 170 shown in FIG. 3 is a block diagram for one embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specification of the controller 170 that is actually implemented.

In particular, the frame rate converter 355 and the formatter 360 may not be provided in the controller 170 but may be provided separately.

FIGS. 4 and 5 are diagrams illustrating various examples of a platform structure of the image display apparatus of FIG. 2.

The platform of the image display apparatus 100 according to an embodiment of the present invention may include OS-based software in order to perform the above-described various operations.

First, referring to FIG. 4, the platform of the image display apparatus 100 according to another embodiment of the present invention may be designed as a detachable platform in which a legacy system platform 400 and a smart system platform 405 are separated. The OS kernel 410 may be commonly used in the legacy system platform 400 and the smart system platform 405.

The legacy system platform 400 may include a driver 420, middleware 430, and an application layer 450 on the OS kernel 410, and the smart system platform 405 may include a library 435, a framework 440, and an application layer 455 on the OS kernel 410.

The OS kernel 410 is the core of the operating system. When the image display apparatus 100 is driven, it may perform at least one of driving hardware drivers, security of the hardware and processor in the image display apparatus 100, efficient management of system resources, memory management, provision of an interface to hardware by hardware abstraction, multiprocessing, and schedule management according to multiprocessing. The OS kernel 410 may further provide power management and the like.

The hardware driver in the OS kernel 410 may include, for example, at least one of a display driver, a Wi-Fi driver, a Bluetooth driver, a USB driver, an audio driver, a power manager, a binder driver, a memory driver, and the like.

In addition, the hardware driver in the OS kernel 410 is a driver for a hardware device in the OS kernel 410 and may include a character device driver, a block device driver, and a network device driver. Because the block device driver transmits data in units of a specific block, it may need a buffer large enough to hold one block; the character device driver transmits data in a basic data unit, that is, a character unit, and therefore may not need a buffer.
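The buffering difference between the two driver types can be illustrated with a small sketch (the class names and block size are illustrative, not part of the kernel described above): the block device driver accumulates data until a full block is available, while the character device driver forwards each unit immediately.

```python
class CharDeviceDriver:
    """Transfers data in the basic unit (a character), so no buffer is needed."""
    def __init__(self, sink):
        self.sink = sink
    def write(self, unit):
        self.sink.append(unit)              # forwarded immediately

class BlockDeviceDriver:
    """Transfers data in blocks, so a buffer of one block size is needed."""
    def __init__(self, sink, block_size):
        self.sink = sink
        self.block_size = block_size
        self.buffer = []
    def write(self, unit):
        self.buffer.append(unit)
        if len(self.buffer) == self.block_size:
            self.sink.append(list(self.buffer))   # flush one whole block
            self.buffer.clear()
```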

The OS kernel 410 may be implemented as a kernel based on various operating systems (OS), such as Unix-based (e.g., Linux) or Windows-based systems. In addition, the OS kernel 410 may be an open OS kernel and may be general-purpose, usable in other electronic devices.

The driver 420 is located between the OS kernel 410 and the middleware 430 and, together with the middleware 430, drives the devices for the operation of the application layer 450. For example, the driver 420 may include drivers for a micom, a display module, a graphics processing unit (GPU), a frame rate converter (FRC), a general-purpose input/output pin (GPIO), HDMI, a system decoder or demultiplexer (SDEC), a video decoder (VDEC), an audio decoder (ADEC), a personal video recorder (PVR), or an inter-integrated circuit (I2C) in the image display apparatus 100. These drivers operate in conjunction with the hardware drivers in the OS kernel 410.

In addition, the driver 420 may further include a driver of the remote control apparatus 200, particularly, a 3D pointing device to be described later. The driver of the 3D pointing device may be provided in various ways in the OS kernel 410 or the middleware 430 in addition to the driver 420.

The middleware 430 may be located between the OS kernel 410 and the application layer 450 and may serve as an intermediary so that data can be exchanged between different hardware or software. As a result, a standardized interface can be provided, various environments can be supported, and the system can interoperate with other tasks.

Examples of the middleware 430 in the legacy system platform 400 may include MHEG (Multimedia and Hypermedia information coding Experts Group) and ACAP (Advanced Common Application Platform) middleware, which are data-broadcasting-related middleware; PSIP or SI middleware, which are broadcast-information-related middleware; and DLNA middleware, which is peripheral-communication-related middleware.

The application layer 450 on the middleware 430, that is, the application layer in the legacy system platform 400, may include, for example, user interface applications for various menus in the image display apparatus 100. The application layer 450 on the middleware 430 may be editable by a user's selection and may be updated through a network. By using the application layer 450, the user can enter a desired menu among various user interfaces according to an input of the remote control apparatus 200 while watching a broadcast image.

In addition, the application layer 450 in the legacy system platform 400 may further include at least one of a TV guide application, a Bluetooth application, a reservation application, a digital video recorder (DVR) application, and a hotkey application.

Meanwhile, the library 435 in the smart system platform 405 is located between the OS kernel 410 and the framework 440 and may form the basis of the framework 440. For example, the library 435 may include SSL (Secure Socket Layer), a security-related library; WebKit, a web-engine-related library; libc (C library); and Media Framework, a media-related library for video and audio formats. The library 435 may be written based on C or C++. In addition, it may be exposed to developers through the framework 440.

The library 435 may include a runtime 437 having a core Java library and a virtual machine (VM). This runtime 437, together with the library 435, forms the basis of the framework 440.

The virtual machine (VM) may be a plurality of instances, that is, a virtual machine capable of performing multitasking. Each virtual machine (VM) may be allocated and executed according to each application in the application layer 455. In this case, a binder driver (not shown) in the OS kernel 410 may operate for scheduling or interconnection among the plurality of instances.

The binder driver and the runtime 437 may connect a Java-based application to a C-based library.

The library 435 and the runtime 437 may correspond to the middleware of the legacy system.

Meanwhile, the framework 440 in the smart system platform 405 includes the programs on which the applications in the application layer 455 are based. The framework 440 may be compatible with any application and may allow reuse, movement, or exchange of components. The framework 440 may include a support program, a program that ties other software components together, and the like. For example, it may include a resource manager, an activity manager related to the activities of applications, a notification manager, and a content provider that manages sharing of information between applications. The framework 440 may be written based on Java (JAVA).

The application layer 455 on the framework 440 includes various programs that can be driven and displayed in the image display apparatus 100. For example, it may include a core application including at least one of email, short message service (SMS), calendar, map, browser, and the like. The application layer 455 may be written based on JAVA.

In addition, the application layer 455 may be divided into an application 465 that is stored in the image display apparatus 100 and cannot be deleted by the user, and an application 475 that is downloaded and stored through an external device or a network and can be freely installed or deleted by the user.

Through the applications in this application layer 455, by Internet connection, services such as Internet telephony, video on demand (VOD), web album, social networking service (SNS), location based service (LBS), map, web search, and application search may be performed. In addition, various functions such as games and schedule management may be performed.

Next, referring to FIG. 5, the platform of the image display apparatus 100 according to an exemplary embodiment of the present invention is an integrated platform, including an OS kernel 510, a driver 520, middleware 530, a framework 540, and an application layer 550.

Compared with FIG. 4, in FIG. 5 the library 435 of FIG. 4 is omitted, and the application layer 550 is provided as an integrated layer. The driver 520 and the framework 540 correspond to those of FIG. 4.

Meanwhile, the library 435 of FIG. 4 can be regarded as being merged into the middleware 530. That is, the middleware 530 may include, as middleware under the legacy system, the data-broadcasting-related middleware MHEG or ACAP, the broadcast-information-related middleware PSIP or SI, and the peripheral-communication-related DLNA middleware; and, as middleware under the image display device system, SSL (Secure Socket Layer), a security-related library; WebKit, a web-engine-related library; libc; and Media Framework, a media-related library. It may further include the above-described runtime.

The application layer 550 may include, as applications under the legacy system, a menu-related application, a TV guide application, a reservation application, and the like, and, as applications under the image display device system, email, SMS, calendar, map, browser, and the like.

Meanwhile, the application layer 550 may be divided into an application 565 that is stored in the image display apparatus 100 and cannot be deleted by the user, and an application 575 that is downloaded and stored through an external device or a network and can be freely installed or deleted by the user.

The platforms of FIGS. 4 and 5 described above can be used in a variety of other electronic devices as well as in the image display apparatus.

Meanwhile, the platform of FIGS. 4 and 5 may be loaded in the above-described storage 140 or the controller 170 or may be loaded in a separate processor (not shown).

FIG. 6A is a diagram illustrating an operation method using the first surface of the remote control device of FIG. 1B.

Referring to the drawing, FIG. 6A illustrates that the pointer 205 is displayed in response to the movement of the remote control apparatus 200 with the first surface 201 of the remote control apparatus 200 facing upward.

First, FIG. 6A (a) illustrates that a pointer 205 corresponding to the remote control apparatus 200 is displayed at a predetermined position of the display 180.

The user can move or rotate the remote control device 200 up and down, left and right (FIG. 6A (b)), and back and forth (FIG. 6A (c)). The pointer 205 displayed on the display 180 of the image display device is displayed in response to the movement of the remote controller 200. The remote control apparatus 200 may be referred to as a spatial remote controller or a 3D pointing device because the pointer 205 is moved and displayed according to the movement in the 3D space as shown in the figure.

FIG. 6A(b) illustrates that when the user moves the remote control apparatus 200 to the left side, the pointer 205 displayed on the display 180 of the image display apparatus also moves to the left side correspondingly.

Information about the movement of the remote control device 200 detected through the sensor of the remote control device 200 is transmitted to the image display device. The image display apparatus 100 may calculate the coordinates of the pointer 205 from the information about the movement of the remote control apparatus 200. The image display apparatus 100 may display the pointer 205 to correspond to the calculated coordinates.
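The coordinate calculation described above might look like the following sketch, where motion deltas reported by the remote control apparatus are scaled by an assumed gain and clamped to an assumed screen size; both values are illustrative and do not come from the specification.

```python
def update_pointer(pos, motion, gain=10.0, screen=(1920, 1080)):
    """Compute a new pointer coordinate from remote-control motion deltas.
    gain (pixels per motion unit) and screen resolution are illustrative."""
    x = pos[0] + motion[0] * gain
    y = pos[1] + motion[1] * gain
    # keep the pointer inside the display bounds
    x = max(0, min(screen[0] - 1, x))
    y = max(0, min(screen[1] - 1, y))
    return (x, y)
```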

FIG. 6A(c) illustrates a case in which a user moves the remote control device 200 away from the display 180 while pressing a specific key in the remote control device 200. As a result, the selection area in the display 180 corresponding to the pointer 205 may be zoomed in and enlarged. On the contrary, when the user moves the remote controller 200 to be closer to the display 180, the selection area in the display 180 corresponding to the pointer 205 may be zoomed out and reduced. On the other hand, when the remote control device 200 moves away from the display 180, the selection area is zoomed out, and when the remote control device 200 approaches the display 180, the selection area may be zoomed in.

On the other hand, while pressing a specific key in the remote control device 200, the recognition of the vertical movement can be excluded. That is, when the remote control device 200 moves away from or near the display 180, the up, down, left and right movements are not recognized, and only the front and back movements can be recognized. In a state where a specific button in the remote controller 200 is not pressed, only the pointer 205 moves according to the up, down, left, and right movements of the remote controller 200.
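The mode-dependent handling of remote motion described in the last two paragraphs can be sketched as follows. The sketch adopts one of the two zoom mappings mentioned above (away from the display zooms out, toward it zooms in); the opposite mapping is equally possible per the text, and all names are illustrative.

```python
def interpret_motion(key_pressed, dx, dy, dz):
    """While the specific key is held, only front/back motion (dz) is used,
    as zoom; otherwise only up/down/left/right motion moves the pointer."""
    if key_pressed:
        if dz > 0:
            return ("zoom", "out")          # remote moves away from the display
        if dz < 0:
            return ("zoom", "in")           # remote moves toward the display
        return ("none", None)
    return ("move", (dx, dy))               # dz ignored when the key is not pressed
```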

Meanwhile, the moving speed or the moving direction of the pointer 205 may correspond to the moving speed or the moving direction of the remote control apparatus 200.

FIG. 6B is a diagram illustrating an operation method using the second surface of the remote control device of FIG. 1C.

Referring to FIG. 6B, character key input on the remote control apparatus 200 is illustrated in a state in which the second surface 251 of the remote control apparatus 200 faces upward and the first surface 201 faces the ground.

For example, when a first English key 281, a second English key 282, and a third English key 283 among the character keys are sequentially input, the remote control apparatus 200 transmits the corresponding key information to the image display apparatus 100. Accordingly, the image display apparatus may display the corresponding characters ("abc") 715 on the display window 710.

Meanwhile, the corresponding character "abc" may be displayed on the display 270 of the remote control apparatus 200 according to the input of the corresponding character.

FIG. 7 is an internal block diagram of the remote control device of FIG. 1.

Referring to the drawing, the remote control apparatus 200 may include a wireless communication unit 820, a user input unit 830, a sensor unit 840, an output unit 850, a power supply unit 860, a storage unit 870, and a controller 880.

The wireless communication unit 820 transmits and receives signals to and from any one of the image display apparatuses according to the embodiments of the present invention described above. Among the image display apparatuses according to the exemplary embodiments of the present invention, one image display apparatus 100 will be described as an example.

In this embodiment, the wireless communication unit 820 may include an RF module 821 capable of transmitting and receiving a signal with the image display device 100 according to the RF communication standard. In addition, the wireless communication unit 820 may include an IR module 823 capable of transmitting and receiving signals with the image display device 100 according to the IR communication standard.

In the present embodiment, the remote control apparatus 200 transmits a signal containing information on the movement of the remote control apparatus 200 to the image display apparatus 100 through the RF module 821.

In addition, the remote control apparatus 200 may receive a signal transmitted from the image display apparatus 100 through the RF module 821. In addition, the remote control apparatus 200 may transmit a command regarding power on / off, channel change, volume change, etc. to the image display apparatus 100 through the IR module 823 as necessary.

In the present embodiment, the user input unit 830 may include an operation key input unit 832 capable of performing operation key input and a text key input unit 834 capable of performing character key input.

The operation key input unit 832 may include various operation keys disposed on the front surface 201 of the remote control apparatus 200 as described with reference to FIG. 1B. For example, the operation key input unit 832 may include a power key 202, a home key 204, a search key 206, a four-direction key 210, a wheel key 222, a previous key 222, a menu key 224, a volume key 230, a 3D key 235, a channel key 240, and the like.

Meanwhile, as described in the description of FIG. 1C, the character key input unit 834 may include various character keys disposed on the rear surface 251 of the remote control apparatus 200. For example, the character key input unit 834 may include a numeric key 262, an English key 264, and the like.

The user may manipulate the user input unit 830 to input a related command for remotely controlling the image display apparatus 100. When the user input unit 830 includes a hard key button, the user may input a command related to the image display apparatus 100 to the remote control apparatus 200 through a push operation of the hard key button. When the user input unit 830 includes a touch screen, a user may input a command related to the image display apparatus 100 to the remote controller 200 by touching a soft key of the touch screen. In addition, the user input unit 830 may include various kinds of input means that the user can operate, such as a scroll key or a jog key, and the present embodiment does not limit the scope of the present invention.

The sensor unit 840 may detect motion information of the remote control device and output it. To this end, the sensor unit 840 may include a gyro sensor 841 or an acceleration sensor 843.

The gyro sensor 841 may sense movement information of the remote controller 200. For example, the gyro sensor 841 may sense movement information of the remote controller 200 based on the x, y, and z axes.

The acceleration sensor 843 may sense information about a moving speed of the remote controller 200. For example, the acceleration sensor 843 may sense the moving speed information of the remote controller 200 based on the x, y, and z axes.

On the other hand, the sensor unit 840 may further include a distance measuring sensor, thereby sensing the distance to the display 180.

Meanwhile, the motion information output from the sensor unit 840 may include movement information of the remote control apparatus 200 from the gyro sensor 841, moving-speed information of the remote control apparatus 200 from the acceleration sensor 843, and, in addition, the distance information.
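The composition of the motion information output by the sensor unit 840 can be sketched as a simple record; the field and function names below are hypothetical illustrations, not terms from the specification.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MotionInfo:
    """Motion information assembled by the sensor unit (sketch)."""
    gyro: Tuple[float, float, float]    # x, y, z movement (gyro sensor 841)
    accel: Tuple[float, float, float]   # x, y, z moving speed (acceleration sensor 843)
    distance: Optional[float] = None    # optional distance-measuring sensor reading

def build_motion_info(gyro_xyz, accel_xyz, distance=None):
    return MotionInfo(tuple(gyro_xyz), tuple(accel_xyz), distance)
```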

The output unit 850 may output a video or audio signal corresponding to a manipulation of the user input unit 830 or corresponding to a signal transmitted from the image display apparatus 100. The user may recognize whether the user input unit 830 is manipulated or whether the image display apparatus 100 is controlled through the output unit 850.

For example, the output unit 850 may include an LED module 851 that is turned on when the user input unit 830 is manipulated or a signal is transmitted to or received from the image display apparatus 100 through the wireless communication unit 825, a vibration module 853 generating vibration, a sound output module 855 for outputting sound, or a display module 857 for outputting an image.

The power supply unit 860 supplies power to the remote control device 200. The power supply unit 860 may reduce power waste by stopping the power supply when the remote controller 200 does not move for a predetermined time. The power supply unit 860 may resume power supply when a predetermined key provided in the remote control apparatus 200 is operated.

The storage unit 870 may store various types of programs, application data, and the like required for controlling or operating the remote control apparatus 200. If the remote control apparatus 200 transmits and receives signals wirelessly with the image display apparatus 100 through the RF module 821, the remote control apparatus 200 and the image display apparatus 100 transmit and receive signals through a predetermined frequency band. The controller 880 of the remote control apparatus 200 may store, in the storage unit 870, information on the frequency band and the like for wirelessly transmitting and receiving signals with the image display apparatus 100 paired with the remote control apparatus 200, and may refer to it.

The controller 880 controls various items related to the control of the remote controller 200.

The controller 880 may transmit, to the image display apparatus 100 through the wireless communication unit 825, predetermined key manipulation information of the user input unit 830 or information corresponding to the movement of the remote control apparatus 200 sensed by the sensor unit 840.

The image display apparatus 100, in particular the user input interface unit 150, receives the key manipulation information or motion information. To this end, the user input interface unit 150 may include a wireless communication unit 811.

The wireless communication unit 811 may include an RF module 812 for RF communication with the remote control device 200 and an IR module 812 for IR communication with the remote control device 200.

The user input interface unit 150 may further include a coordinate value calculator 815 that calculates coordinates of a pointer using information corresponding to the movement of the remote controller 200.

Meanwhile, the pointer coordinate calculation may be performed by the controller 170 instead of the coordinate value calculator 815. To this end, the user input interface unit 150 may transmit information corresponding to the movement of the remote control apparatus 200 to the controller 170.

FIG. 8 is a flowchart illustrating a method of operating an image display apparatus according to an exemplary embodiment of the present invention, and FIGS. 9A to 14C are views referred to for describing the method of FIG. 8.

Referring to the drawing, the video display device 100 displays a home screen (S820).

When the image display apparatus 100 is turned on and is set to display the home screen, the home screen may be automatically displayed upon power-on. Alternatively, when there is a home-screen display input based on the remote control apparatus 200, a local key (not shown), a user voice, a user gesture, or the like, the controller 170 may display the home screen on the display 180.
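The two paths for showing the home screen, automatic display on power-up when so configured, and display on an explicit input, can be captured in a small sketch with illustrative names:

```python
def should_show_home(powered_on, auto_home_enabled, home_input_received):
    """Show the home screen on power-up when configured to do so, or when a
    home-display input (remote key, local key, voice, gesture) arrives."""
    return (powered_on and auto_home_enabled) or home_input_received
```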

FIG. 9A illustrates an example of a home screen 900 of the image display apparatus.

The home screen 900 may include a dashboard area 902 made up of card objects each having its own content, and a launcher bar area 903 provided with frequently used application items.

The dashboard area 902 may be arranged to display card objects, which may be movable and replaceable or switchable. In addition, in the dashboard area 902, an object (not shown) indicating a logged-in user and time information may be displayed.

FIG. 9A illustrates examples of card objects, such as a Live TV card object 920, a premium apps card object 940, and a 3D world card object 960.

Meanwhile, FIG. 9B illustrates another example of card objects. When the screen switching object 935 in FIG. 9A is selected and the screen is switched, a home screen 901 may be displayed in which the dashboard area 902 includes a video card object 950, a shared content card object (smart share card object) 970, a my interest card object 990, and the like.

In the live TV card object 920, a live broadcast image 910 may be displayed. In addition, broadcast information 912 related to the live broadcast, an external input object 914 for selecting an external input, a setting object 916 for inputting settings related to the image display apparatus, and an advertisement image 915 may be displayed.

The live broadcast video 910 may be a broadcast video extracted from the live broadcast signal received by the broadcast receiver 105 described above. In particular, the live broadcast signal may be received through the tuner 110 or the network interface unit 130 in the broadcast receiver 105.

The broadcast information 912 related to the live broadcast may be program guide information in a broadcast signal or broadcast information separately received from a broadcast station server or other server through a network.

Meanwhile, when the external input object 914 is selected, a screen for selecting an external input image input through one of the various input terminals of the image display apparatus 100, such as an HDMI terminal, a VOD terminal, a component terminal, an RGB terminal, or an antenna terminal, may be displayed.

When the setting object 916 for inputting settings related to the image display apparatus is selected, a setting screen for making various settings related to the image display apparatus may be displayed.

The advertisement image 915 may be based on an advertisement included in a broadcast signal, an advertisement provided by a network provider that provides a network to the image display apparatus 100, or an advertisement provided through the network by a broadcasting station that provides a broadcast image to the image display apparatus 100.

The premium apps card object 940 may include a card object name 942 and a content list 945. In particular, the content list 945 may include content items provided by the manufacturer of the image display apparatus 100. In the drawing, application items are exemplified as content items, and these application items may be application items pre-installed in the image display apparatus 100. When any one application item in the content list 945 is selected, the corresponding application is executed and the executed screen may be displayed on the display 180.

Meanwhile, unlike the drawing, content items other than application items may also be displayed.

Meanwhile, content items, particularly application items, in the premium apps card object 940 may be displayed so as to be distinguished from other application items when the entire application list is displayed.

The 3D world card object 960 may include a card object name 962 and a content list 965. In particular, the content list 965 may include 3D content items. Here, the 3D content items may be shortcut items for executing 3D content. When any 3D content item in the content list 965 is selected, the corresponding 3D content may be played and a playback screen may be displayed on the display 180.

Meanwhile, FIG. 9A shows three card objects 920, 940, and 960; when there are additional card objects, partial areas of the card objects to be additionally displayed may also be shown on the display 180, as illustrated in FIG. 9A. This allows the user to intuitively recognize that there are additional card objects to browse.

The video card object 950 (tube card object) of FIG. 9B may include a card object name 952, an object 954 indicating content list switching, and a content list 955a corresponding to a given sub-category.

In particular, the content list 955a may include content items stored in a server operated by a video provider.

Among the plurality of content items stored in the server operated by the video provider, a sub-category related to genre, date, region, age, gender, or the like may be set so that predetermined content items are displayed in the video card object 950. In addition, a separate content list may be configured for each sub-category.
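The sub-category grouping described above is not specified in code in this disclosure; the following Python sketch illustrates one possible implementation of filtering a server's content items by a configurable sub-category. All identifiers (ContentItem, filter_by_subcategory, the sample items) are invented here for illustration.

```python
# Illustrative sketch, not part of the patent disclosure: filtering the
# content items of a video card object by a configurable sub-category.
from dataclasses import dataclass

@dataclass
class ContentItem:
    title: str
    genre: str
    views: int

def filter_by_subcategory(items, subcategory, value):
    """Return only the items whose attribute `subcategory` equals `value`."""
    return [item for item in items if getattr(item, subcategory) == value]

items = [
    ContentItem("Clip A", "news", 120),
    ContentItem("Clip B", "sports", 950),
    ContentItem("Clip C", "news", 40),
]

# A "news" sub-category yields a separate content list for the card object.
news_list = filter_by_subcategory(items, "genre", "news")
print([i.title for i in news_list])  # ['Clip A', 'Clip C']
```

Switching the sub-category (for example to most-viewed or trending items) would simply select a different filter or sort key over the same item collection.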

The content list displayed in the video card object 950 of FIG. 9B is a content list 955a including recently featured video items. That is, a first content list corresponding to a first sub-category is illustrated as being displayed.

In addition, when the object 954 indicating content list switching is selected, the content list in the video card object 950 may be switched to a content list including most-viewed video items, a content list including trending video items, or a content list including 3D content items.

A smart share card object 970 of FIG. 9B may include a card object name 972, an object 974 indicating content list switching, and a content list 975a corresponding to a given sub-category.

In particular, the content list 975a may be a content list including content items stored in an adjacent electronic device.

Among the plurality of content items stored in the adjacent electronic device, a sub-category related to genre, date, and the like may be set so that predetermined content items are displayed in the shared content card object 970. In addition, a separate content list may be configured for each sub-category.

The content list displayed in the shared content card object 970 of FIG. 9B is illustrated as a content list 975a including new content items.

In addition, when the object 974 indicating content list switching is selected, the content list in the shared content card object 970 may be switched to a content list including movie content items, a content list including photo content items, or a content list including music content items.

The interest card object 990 of FIG. 9B may include a card object name 992, an object 994 for setting content of interest, and a content list 995a corresponding to a given interest item.

In particular, the content list 995a may be a content list including content items provided by a specific server.

Among the plurality of content items provided by a specific server, a sub-category related to genre, date, and the like may be set so that predetermined content items are displayed in the interest card object 990. In addition, a separate content list may be configured for each sub-category.

The content list in the interest card object 990 displayed in FIG. 9B illustrates a content list 995a including world news items.

Meanwhile, when a top story item and a business item are set in addition to the world news items by selection of the object 994 for setting content of interest, the content list displayed in the interest card object 990 may be a content list including world news items, a content list including top story news items, or a content list including business news items.

The launcher bar area 903 may include an application list 980 with frequently used application items.

In the launcher bar area 903, the application list 980 may be displayed as it is, without change, even when the home screen is switched, for example, from the home screen 900 of FIG. 9A to the home screen 901 of FIG. 9B.

Meanwhile, FIG. 9C illustrates application items displayed in the launcher bar area 903 of the home screen 901. From left to right, a widget item 981, a notification item 982, an application full view item 983, a TV item, a 3D item, a prime time item (TV & movie item), a movie play item, a video item, an application store item, a web item, a search item, and the like are sequentially displayed.

Some of these items may be edited or deleted, but some other items may be items that cannot be edited or deleted.

For example, a video item, an application store item, a web item, a search item, etc. may be non-deletable items, and a TV item, 3D item, etc. may be an item that can be edited or deleted.

FIG. 9C illustrates that the application full view item 983 in the application list displayed in the launcher bar area 903 of the home screen 900 is selected based on the pointer 205, which is displayed in response to movement of the remote controller 200.

Accordingly, as shown in FIG. 9D, the application list screen 986 having a plurality of application items may be displayed on the video display device 100 as a full screen.

In this case, when there are additional application items to be displayed, partial areas of the application items to be additionally displayed may also be shown on the display 180. In the drawing, a partial area of an application item to be additionally displayed is illustrated in the lower area of the application list screen 986. This allows the user to intuitively recognize that there are additional application items to browse.

On the other hand, unlike the drawing, it is also possible to display a partial region of the application item to be additionally displayed in the right region of the application list screen 986.

Meanwhile, selection or focusing of an application item may be moved not only by the pointer based on the movement of the remote control apparatus 200, but also by a direction key input of the remote control apparatus 200.

In the drawing, the cursor 988 is located at a predetermined application item according to the direction key input of the remote control apparatus 200. On the other hand, when there is a direction key input of the remote control device 200, a pointer based on the movement of the remote control device 200 may not be displayed.

Meanwhile, for moving the screen, a scroll bar (not shown) may be displayed on one area of the display.

Next, FIG. 9E illustrates an application list screen 987 by screen movement. In the application list screen 986 of FIG. 9D, when there is a screen movement input in a downward direction, the application list screen 987 of FIG. 9E may be displayed. The screen movement input at this time may be a pointer input, a direction key input, a scroll bar input, or the like. In the drawing, the cursor 989 is located at a predetermined application item according to the direction key input of the remote control apparatus 200.

Meanwhile, after operation 820 of FIG. 8 (that is, after displaying the home screen), the following steps may be performed.

The image display apparatus 100 determines whether there is a dynamic screen display input and, if so, displays the dynamic screen (S830).

Next, the image display apparatus 100 may determine whether there is a dynamic screen movement input while the dynamic screen is displayed (S835), and if so, move and display the dynamic screen (S840).

Meanwhile, after operation S830, the image display apparatus 100 determines whether there is a dynamic screen enlargement input while the dynamic screen is displayed (S845), and if so, enlarges and displays the dynamic screen (S850).
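The input handling of steps S830 through S850 can be sketched as a small state holder in Python. This is a minimal sketch, not the actual device firmware; the class, event names, and the rule that move/enlarge inputs are ignored while the dynamic screen is hidden are assumptions for illustration.

```python
# Hypothetical sketch of steps S830-S850: a display input shows the
# dynamic screen, and later move/enlarge inputs act on it only while
# it is displayed.
class DynamicScreen:
    def __init__(self):
        self.visible = False
        self.pos = (0, 0)
        self.scale = 1.0

    def handle(self, event, arg=None):
        if event == "display":            # S830: show the dynamic screen
            self.visible = True
        elif not self.visible:
            return                        # move/enlarge ignored when hidden
        elif event == "move":             # S835/S840: move and display
            dx, dy = arg
            self.pos = (self.pos[0] + dx, self.pos[1] + dy)
        elif event == "enlarge":          # S845/S850: enlarge and display
            self.scale *= 1.5

screen = DynamicScreen()
screen.handle("move", (10, 0))   # ignored: not displayed yet
screen.handle("display")
screen.handle("move", (10, 0))
screen.handle("enlarge")
print(screen.pos, screen.scale)  # (10, 0) 1.5
```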

In a state where the home screen is displayed, the PIP key 241 in the remote controller 200 may be operated to display a dynamic screen.

The dynamic screen described herein is a screen that can be moved freely within the display screen, and may be, for example, a broadcast video screen or a widget screen. Hereinafter, the dynamic screen will be described mainly with reference to the broadcast video screen.

For example, in the home screen 901 where the live broadcast image 910 is not displayed, when the user is browsing the card objects 950, 970, and 990 and wants to watch a live broadcast image, operating the PIP key 241 of the remote controller 200 causes the live broadcast video 1010 to be displayed, as shown in FIG. 10A.

The live broadcast video 1010 may be a broadcast video of the same channel as the live broadcast video 910 displayed on the previous home screen 900 of FIG. 9A. Accordingly, the user can immediately watch the desired live broadcast video without switching the home screen.

Meanwhile, the live broadcast video 1010 is preferably displayed in an area of the screen other than areas displaying important information. For example, as shown in FIG. 10A, it is preferably displayed in an area other than the interest card object 990, whose content list is automatically updated at predetermined time intervals.

Alternatively, the live broadcast image 1010 may be displayed superimposed on the card object having the lowest use frequency, the smallest number of uses, or the least use time among the plurality of card objects. In other words, since it is superimposed on the card object of least importance among the plurality of card objects, content lists of high user interest can be viewed without interruption.
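Selecting the least-important card object as the overlay target, as described above, could be implemented along these lines. The statistics tracked and the field names are assumptions; the disclosure only names use frequency, number of uses, and use time as possible criteria.

```python
# Illustrative sketch: choosing the card object over which to superimpose
# the dynamic screen, by lowest use count (falling back to least use time).
def pick_overlay_card(cards):
    """cards: list of dicts with 'name', 'use_count', 'use_seconds'."""
    return min(cards, key=lambda c: (c["use_count"], c["use_seconds"]))

cards = [
    {"name": "video", "use_count": 12, "use_seconds": 3400},
    {"name": "smart_share", "use_count": 2, "use_seconds": 150},
    {"name": "interest", "use_count": 2, "use_seconds": 90},
]
# Tie on use_count is broken by the smaller use time.
print(pick_overlay_card(cards)["name"])  # interest
```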

Alternatively, when the user uses the remote control apparatus 200 to display the pointer 205 in one region of the image display apparatus 100, the live broadcast image 1010 may be disposed near the region.

Or, conversely, when the pointer 205 is displayed in an area of the image display device 100 as the user uses the remote control device 200, the user may be interested in that area; thus, the live broadcast video 1010 may be disposed in an area other than that area.

FIG. 10A illustrates that, when the PIP key 241 of the remote controller 200 is operated while the pointer 205 is displayed in the launcher bar area 903, the live broadcast video 1010 is displayed superimposed on the content list in the video card object 950.

Meanwhile, the live broadcast image 1010 displayed in the home screen may be displayed in a partially transparent state. As a result, an area within the home screen can be kept from being completely hidden.

Meanwhile, unlike the drawing, the live broadcast video may be displayed between the card objects, or in the upper area of the card objects, so as not to cover the card objects in the home screen. The size of the live broadcast video displayed in this way may be smaller than that of FIG. 10A. That is, the size of the live broadcast video may vary in correspondence with the size of the empty space in the screen.

Meanwhile, when there is a 2D-to-3D switching input while the live broadcast video 1010 of FIG. 10A is displayed on the home screen, the controller 170 of the video display device 100 may switch the displayed video to a 3D video. A depth may be added to each object so that the user perceives the object as protruding. In this case, the depth of the live broadcast video may be set to be the largest, so that it appears to protrude more than the other objects in the screen.

FIG. 10B illustrates that the pointer 205, displayed in response to the movement of the remote controller 200, moves within the live broadcast image 1010 while the live broadcast image 1010 is displayed in the video card object 950.

Accordingly, the object 1020 for moving and enlarging the live broadcast image 1010 may be displayed. In the figure, the object 1020 is displayed superimposed on the live broadcast video 1010. The object 1020 may be displayed transparently, so that the user can adjust the movement or enlargement of the live broadcast video while still viewing the live broadcast video 1010.

Meanwhile, unlike the drawing, the object 1020 for moving and enlarging may be displayed separately from the live broadcast image 1010. For example, it may be displayed near the position of the pointer 205 displayed in correspondence with the movement of the remote control apparatus 200. In this case, the object 1020 for movement, enlargement, and the like is preferably displayed transparently so as not to block other card objects.

In the drawing, the object 1020 includes an upper-left moving item 1021, an upper-right moving item 1022, a lower-left moving item 1023, a lower-right moving item 1024, an exit item 1025, a full screen viewing item 1026, and a screen magnification item 1027.
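The items of the object 1020 listed above map naturally onto a small action dispatcher. The following sketch is illustrative only; the corner coordinates, state fields, and the rule that the magnification item toggles with the reduction item 1028 are assumptions drawn from the surrounding description.

```python
# Hypothetical mapping from the items of object 1020 to actions on the
# dynamic screen; positions and the action table are invented here.
CORNERS = {
    "upper_left": (0.0, 0.0),   # item 1021
    "upper_right": (1.0, 0.0),  # item 1022
    "lower_left": (0.0, 1.0),   # item 1023
    "lower_right": (1.0, 1.0),  # item 1024
}

def apply_item(state, item):
    """state: dict with 'corner' and 'mode'. Returns the updated state."""
    if item in CORNERS:
        state["corner"] = item
    elif item == "exit":            # item 1025: stop the dynamic screen
        state["mode"] = "hidden"
    elif item == "full_screen":     # item 1026
        state["mode"] = "full"
    elif item == "enlarge":         # item 1027 (toggles with reduce, 1028)
        state["mode"] = "enlarged" if state["mode"] != "enlarged" else "normal"
    return state

state = {"corner": "lower_right", "mode": "normal"}
apply_item(state, "upper_left")   # like selecting item 1021
apply_item(state, "enlarge")      # like selecting item 1027
print(state)  # {'corner': 'upper_left', 'mode': 'enlarged'}
```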

FIG. 10C illustrates that the upper left moving item 1021 in the object 1020 for moving, enlarging, etc. is selected. Accordingly, as shown in FIG. 10C, the live broadcast video 1010 is disposed and displayed in the upper left area of the home screen 901.

Next, when the screen magnification item 1027 in the object 1020 for moving, enlarging, etc., displayed as shown in FIG. 10A, is selected, an enlarged live broadcast video may be displayed in the home screen 901. In this case, the screen magnification item 1027 in the object 1020 may be converted to the screen reduction item 1028 and displayed as illustrated.

Meanwhile, when the display area of the live broadcast video 1010 is determined according to the direction of the user's gaze, if the user is interested in the entire home screen area and fixes his or her gaze on the entire area, the display of the live broadcast video 1010 may be stopped.

Alternatively, when the user presses the PIP key 241 of the remote controller 200 once more, the display of the live broadcast video 1010 may be stopped.

FIG. 10E illustrates that, as the display of the live broadcast video is stopped, the live broadcast video is no longer displayed on the home screen 901, and an icon 1017 for viewing the live broadcast video is displayed instead. Such an icon 1017 is preferably displayed in an empty space area of the home screen.

9A to 10E illustrate the selection of an object or the like using the pointer 205 indicating the movement of the remote control apparatus 200, but the present disclosure is not limited thereto and various modifications are possible.

For example, an object or the like may be selected using the direction key and the confirmation key of the remote control apparatus 200. As another example, an object or the like may be selected in response to a user gesture based on the captured image of the user captured by the camera 195. As another example, the user's voice may be recognized, and an object or the like may be selected based on the recognized voice content.

The image display apparatus 100 may display a hand pointer 506 corresponding to the hand 505 of the user 1104 based on the captured image of the user captured by the camera 195, and may then perform an operation corresponding to the user's gesture.

For example, when the live broadcast video 1010 is displayed in the lower left area of the home screen by the dynamic screen display input, and the hand gesture of the user 1104 corresponds to an upward movement as shown in FIG. 10F, the video display device 100 may interpret it as a command to move the live broadcast video 1010 upward, and move and display the live broadcast video 1010 from the lower left area to the upper left area as shown in the drawing. In this case, the hand pointer 506 may remain displayed on the moved live broadcast image 1010.

As another example, the remote control apparatus 200 having a microphone (not shown) may receive a user's voice and transmit the received voice to the image display apparatus 100. When the user utters the voice 503, "Move the dynamic screen to the upper left," while the home screen shown in FIG. 10G is displayed, the remote control apparatus 200 collects the voice 503 and transmits it to the image display apparatus 100. The image display apparatus 100 may analyze the user's voice content and recognize a command to move the dynamic screen to the upper left through a voice recognition function. Accordingly, as shown in FIG. 10G, the live broadcast video 1010 displayed in the lower left region may be moved to and displayed in the upper left region.
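Once the voice recognition function has produced text, mapping that text to a dynamic screen command might look like the following. This is a sketch only: the phrase patterns and the parsing function are invented, since the disclosure names the commands only in prose.

```python
# Sketch of mapping recognized voice text to a dynamic screen command;
# the phrase patterns and return shape are assumptions for illustration.
import re

DIRECTIONS = ["upper left", "upper right", "lower left", "lower right"]

def parse_move_command(text):
    """Return ('corner', direction), ('device', name), or None."""
    text = text.lower()
    if "move the dynamic screen" not in text:
        return None
    for d in DIRECTIONS:
        if d in text:
            return ("corner", d)
    m = re.search(r"to the (\w+)$", text)  # e.g. "... to the tablet"
    if m:
        return ("device", m.group(1))
    return None

print(parse_move_command("Move the dynamic screen to the upper left"))
print(parse_move_command("Move the dynamic screen to the tablet"))
```

The second form anticipates the tablet movement command described below for FIG. 10H-style commands, where the target is another device rather than a screen corner.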

Meanwhile, the dynamic screen displayed on the image display device may be moved and displayed to another electronic device by a movement display command of the user.

For example, when the user utters the voice 502, "Move the dynamic screen to the tablet," while the home screen shown in FIG. 10H is displayed, the remote controller 200 may collect the voice 502 and transmit it to the image display apparatus 100. The image display apparatus 100 may analyze the user's voice content and recognize a command to move and display the dynamic screen on the tablet through a voice recognition function.

For the movement display, the video display device 100 first performs pairing with the tablet. After pairing is completed, if the displayed live broadcast video is being received through the network, the corresponding network address is passed to the tablet through the network interface unit 130. Accordingly, the tablet can display the live broadcast video 1011 as shown in the figure.

On the other hand, when the live broadcast video displayed on the video display device 100 is received through the tuner 110 or the like, the video display device 100 converts the live broadcast video into a stream and transmits the converted stream to the tablet through the network interface unit 130. Accordingly, the tablet can display the live broadcast video 1011 as shown in the figure.
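The two hand-off paths above (forwarding a network address versus re-streaming a tuner source) can be sketched as a simple decision. All classes and method names here are invented for illustration; the disclosure does not specify an API.

```python
# Illustrative sketch of the hand-off decision: a network-received video
# is passed on by address, while a tuner video is re-streamed to the
# paired tablet. FakeTablet stands in for the paired device.
def hand_off(source, tablet):
    """source: dict describing the current video; tablet: paired device."""
    if source["via"] == "network":
        # Network-received video: just forward its network address.
        tablet.play_url(source["url"])
        return "forwarded_address"
    # Tuner-received video: convert to a stream and transmit it.
    stream = f"stream-of-channel-{source['channel']}"
    tablet.play_stream(stream)
    return "restreamed"

class FakeTablet:
    def __init__(self):
        self.played = None
    def play_url(self, url):
        self.played = ("url", url)
    def play_stream(self, stream):
        self.played = ("stream", stream)

t = FakeTablet()
print(hand_off({"via": "tuner", "channel": 5}, t))        # restreamed
print(hand_off({"via": "network", "url": "rtsp://example"}, t))
```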

Meanwhile, a command such as a selection input to the video display device may be made not only by the pointer of the remote control device, but also by a direction key and a confirmation key, a user gesture, a user voice, and the like. Hereinafter, the description is based mainly on the pointer indicating the movement of the remote controller, but as described above, it may be extended to the direction key and confirmation key, user gesture, user voice, and the like.

11A to 11E are similar to FIGS. 10A to 10E in that the dynamic screen is displayed on the home screen, except that the live broadcast card object 920 is displayed on the screen 900.

FIG. 11A illustrates that, when the PIP key 241 of the remote controller 200 is operated while the home screen 900 is displayed, the live broadcast video 1110 displayed in the live broadcast card object 920 in the home screen 900 is separated and displayed.

In this case, in the area where the live broadcast image was displayed in the live broadcast card object 920, the object 1020 for moving or enlarging the live broadcast image may be displayed instead. FIG. 11A illustrates that the upper-right moving item 1022 in the object 1020 is selected and the live broadcast video 1110 is moved to and displayed in the upper right region of the home screen 900. In this case, the live broadcast video 1110 may be a broadcast video of the same channel as the live broadcast video previously displayed in the live broadcast card object 920.

FIG. 11B illustrates that the bottom right moving item 1024 in the object 1020 for moving, enlarging, etc. is selected. Accordingly, as shown in FIG. 11B, the live broadcast video 1110 is disposed and displayed in the lower right area of the home screen 900. In the drawing, the 3D related card object 960 and the application list 980 are shown to be partially overlapped.

11C illustrates that the upper left moving item 1021 in the object 1020 for moving, enlarging, etc. is selected. Accordingly, as shown in FIG. 11C, the live broadcast video 1110 is disposed and displayed in the upper left area of the home screen 900. In this case, the live broadcast image 1110 may be partially overlapped with the object 1020 for moving and enlarging. In this case, at least one of the live broadcast image 1110 or the object 1020 may be displayed transparently so as not to cover the other one.

11D illustrates that the lower left moving item 1023 in the object 1020 for moving, enlarging, etc. is selected. Accordingly, as shown in FIG. 11D, the live broadcast video 1110 is disposed and displayed in the lower left area of the home screen 900.

11E illustrates that the screen magnification item 1027 in the object 1020 for moving, enlarging, etc. is selected. Accordingly, as shown in FIG. 11E, the enlarged live broadcast video 1115 is displayed in the home screen 900. In this case, the screen magnification item 1027 in the object 1020 may be converted to the screen reduction item 1028 and displayed as illustrated.

Meanwhile, when the PIP key 241 of the remote controller 200 is operated while a live broadcast image is displayed in the live broadcast card object 920, a live broadcast image separate from the displayed live broadcast image may be displayed. The additionally displayed live broadcast image may be a broadcast image of a channel different from the displayed broadcast image.

FIGS. 10A to 11E illustrate movement and enlarged display of the displayed live broadcast video 1010 or 1110. Unlike the movement or enlargement display described above, when there is a card object to be updated, the live broadcast video may be moved to or enlarged in an area other than that card object; it may be moved to or enlarged in an area other than the card object with the highest use frequency, number of uses, or use time; or it may be moved to or enlarged in an area other than the area where the pointer is displayed.

For example, in FIG. 10B, even if the upper-right moving item 1022 is selected, the live broadcast video is preferably displayed not in the interest card object 990, which is updated, but superimposed on the adjacent shared content card object 970. As a result, visibility of the information area important to the user is ensured despite the display of the live broadcast video.

FIG. 12 shows an example in which, when the PIP key 241 of the remote controller 200 is operated while the live broadcast video of channel 5 is displayed in the live broadcast card object 920, a separate live broadcast video 1210 different from that video is displayed.

In this case, the object 1220 for moving and enlarging the live broadcast image 1210 may be displayed. In the drawing, the object 1220 is superimposed on the live broadcast image 1210, but it may also be displayed separately.

Meanwhile, when the live broadcast image 1210 and the object 1220 for movement and enlargement overlap, it is preferable that one of the two is displayed transparently.

Meanwhile, receiving live broadcast videos of a plurality of channels may require a plurality of tuners. Alternatively, one tuner may be used to receive the live broadcast video of one channel while the live broadcast video of another channel is received through the network interface unit 135. It is also possible to receive the live broadcast videos of a plurality of channels entirely through the network interface unit 135.

Meanwhile, after photographing the user through the camera 195 and recognizing the user's gaze direction from the photographed user image, the live broadcast image 1010 may be controlled to be arranged in an area other than the user's gaze direction. Since the user's attention is concentrated on the area corresponding to the gaze direction, the live broadcast video 1010 may be displayed in another area so as not to disturb it.

In this case, the live broadcast video 1010 may be floated and displayed in response to the movement of the user's line of sight.

FIG. 12 illustrates that a separate live broadcast image 1210, different from the live broadcast image of channel 5 in the live broadcast card object 920, is displayed when the PIP key 241 of the remote controller 200 is operated once. When the PIP key 241 of the remote controller 200 is operated twice, yet another live broadcast image may be displayed. In this case, the size of the live broadcast video 1210 displayed first may be smaller than the size of the newly displayed live broadcast video. In particular, the size of the live broadcast image 1210 displayed first may be sequentially reduced, and finally it may be displayed as the icon 1017 for replaying the live broadcast video, shown in FIG. 10E. As a result, the user can concentrate more on the newly displayed live broadcast video.

FIG. 13 illustrates an example of recognizing the line of sight of the user 1300 through the camera 195 while the home screen 901 is displayed. The controller 170 may calculate the user's face movement direction, pupil movement direction, and the like from the photographed user image, and estimate the user's gaze direction based on these.

When the user's gaze direction is estimated, the live broadcast image 1010 may be displayed in an area other than the gaze direction. In FIG. 13, the user's gaze direction is toward the left, not toward the right where the live broadcast video 1010 is displayed. As a result, the live broadcast video 1010 can be displayed without disturbing the user's viewing.
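Combining the face and pupil movement cues named above into a placement decision might be sketched as follows. The weighting of the two cues and the left/right side model are pure assumptions for illustration; the disclosure describes the estimation only qualitatively.

```python
# Sketch of gaze-opposite placement: estimate the gaze side from face and
# pupil horizontal movement, then place the dynamic screen on the other
# side. Weights and the 1-D side model are arbitrary illustration choices.
def estimate_gaze(face_dx, pupil_dx):
    """Combine face and pupil horizontal movement into a gaze side."""
    score = 0.6 * face_dx + 0.4 * pupil_dx   # negative = looking left
    return "left" if score < 0 else "right"

def place_dynamic_screen(face_dx, pupil_dx):
    """Return the side of the screen where the dynamic screen goes."""
    gaze = estimate_gaze(face_dx, pupil_dx)
    return "right" if gaze == "left" else "left"

# As in FIG. 13: the user looks left, so the video goes to the right.
print(place_dynamic_screen(-0.8, -0.3))  # right
```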

In this case, the object 1220 for moving and enlarging may be displayed overlapping the live broadcast image 1010, and it is preferable that at least one of the live broadcast image 1010 and the object 1220 is displayed transparently.

FIG. 14A illustrates that, while the live broadcast video 1010 is displayed on the home screen 901, the application full view item 983 in the application list is selected based on the pointer 205 displayed in response to the movement of the remote controller 200.

Accordingly, as shown in FIG. 14B, the application list screen 1405 having a plurality of application items may be displayed on the video display device 100 as a full screen. In this case, the live broadcast video 1410 may be displayed at the same location and the same size as displayed on the home screen 901.

Meanwhile, as shown in FIG. 14C, the size of the live broadcast video 1415 displayed in the application list screen 1405 may be larger than that of FIG. 14A. Also, unlike the drawing, the display position may be changed.

Since the amount of information displayed on the home screen 901 of FIG. 14A is large, the live broadcast video is displayed small. However, since the application list screen 1405 of FIG. 14C displays a small amount of information, that is, provides a lot of empty space, the live broadcast image 1415 may be enlarged and displayed. Accordingly, visibility of the live broadcast video 1415 may be improved.
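Varying the dynamic screen size with the free space of the current screen, as in FIGS. 14A through 14C, could follow a simple rule such as the one below. The linear scaling, the base size, and the clamp bounds are assumptions made for illustration; the disclosure only states that the size may vary with the empty space.

```python
# Sketch: scale the dynamic screen size linearly with the empty-space
# ratio of the current screen, clamped to a sensible range.
def dynamic_screen_size(base=(320, 180), empty_ratio=0.2,
                        min_scale=1.0, max_scale=2.0):
    """Return a (width, height) PIP size for the given empty-space ratio."""
    ratio = max(0.0, min(1.0, empty_ratio))
    scale = min_scale + (max_scale - min_scale) * ratio
    w, h = base
    return int(w * scale), int(h * scale)

print(dynamic_screen_size(empty_ratio=0.1))  # busy home screen -> small
print(dynamic_screen_size(empty_ratio=0.8))  # sparse app list -> larger
```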

Meanwhile, the image display apparatus 100 according to an exemplary embodiment of the present invention may highlight an object or menu when it is selected or focused. Although not shown in FIGS. 9A to 14C, the outline of the selected or focused object or menu may be thickened, or at least one of its size, color, and luminance may be changed and displayed. This allows the user to intuitively recognize the selection or the focusing.

The image display device and the method of operating the same according to the present invention are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operating method of the image display device of the present invention can be implemented as processor-readable code on a processor-readable recording medium provided in the image display device. The processor-readable recording medium includes all kinds of recording devices that store data that can be read by the processor. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disk, and optical data storage devices, and it may also be implemented in the form of a carrier wave such as transmission over the Internet. The processor-readable recording medium can also be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

In addition, although preferred embodiments of the present invention have been shown and described above, the present invention is not limited to the specific embodiments described; various modifications can be made by those skilled in the art to which the invention belongs without departing from the gist of the invention claimed in the claims, and these modifications should not be understood individually from the technical spirit or prospect of the present invention.

Claims (20)

Displaying a home screen including at least one card object including a contents list;
Displaying a dynamic screen in the home screen when there is a dynamic screen display input; And
And moving the dynamic screen in the home screen when there is the dynamic screen movement input.
Wherein, when there is the dynamic screen display input in a state in which a live broadcast video is displayed in a first area within the home screen, the dynamic screen is displayed in a second area separately from the live broadcast video.
The method of claim 1,
And displaying the dynamic screen in the home screen in a magnified manner when the dynamic screen enlargement input is received.
The method of claim 1,
The dynamic screen,
comprises a live video screen or a widget screen.
The method of claim 1,
When the card object including the updated content list is displayed on the home screen, the dynamic screen is displayed in a region other than the card object including the updated content list. .
The method of claim 1,
The dynamic screen,
is superimposed on the card object having the lowest use frequency or the least use time among the card objects in the home screen.
The method of claim 1,
The dynamic screen,
is displayed, when a pointer corresponding to the movement of the remote control apparatus is displayed in the home screen, in a region other than the pointer.
The method of claim 1,
And displaying an object including a screen movement item or a screen magnification item with respect to the displayed dynamic screen.
The method of claim 1,
And the object for moving or enlarging is displayed overlapping with the dynamic screen.
The method of claim 8,
The object for moving or enlarging is superimposed on the dynamic screen when a pointer corresponding to the movement of a remote controller is displayed on the dynamic screen.
A method of operating an image display apparatus, the method comprising:
Displaying a home screen including at least one card object containing a content list;
Displaying a dynamic screen within the home screen when a dynamic screen display input is received; and
Moving the dynamic screen within the home screen when a dynamic screen movement input is received,
wherein, when the dynamic screen and an object for moving or enlarging overlap each other, the dynamic screen or the object for moving or enlarging is displayed transparently.
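The transparency behavior in this claim, drawing the move/enlarge object translucently only while it overlaps the dynamic screen, can be sketched as an alpha decision. The function names, the `(x, y, w, h)` tuple convention, and the 0.5 alpha value are assumptions for illustration, not the claimed implementation.

```python
def overlaps(a, b):
    """Axis-aligned overlap test; rectangles are (x, y, w, h) tuples."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return not (ax + aw <= bx or bx + bw <= ax or
                ay + ah <= by or by + bh <= ay)


def control_alpha(dynamic_rect, control_rect, translucent=0.5):
    """Alpha for the move/enlarge control: translucent while it
    overlaps the dynamic screen, fully opaque otherwise."""
    return translucent if overlaps(dynamic_rect, control_rect) else 1.0
```

A control drawn inside the dynamic screen's bounds gets the translucent alpha; one beside it stays opaque.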
The method of claim 1, further comprising:
Photographing a user; and
Recognizing a gaze direction of the user based on the photographed image,
wherein the dynamic screen is displayed in an area of the home screen other than the recognized user gaze direction.
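The gaze-avoidance rule in this claim can be sketched as a quadrant heuristic: given a recognized gaze point, place the dynamic screen in the diagonally opposite quadrant of the home screen. The function name and coordinate convention are hypothetical; real gaze recognition from a camera image is outside this sketch.

```python
def area_opposite_gaze(screen_w, screen_h, gaze_x, gaze_y, win_w, win_h):
    """Place the dynamic screen in the quadrant diagonally opposite
    the user's gaze point, keeping it out of the watched region.
    Returns the window rectangle as an (x, y, w, h) tuple."""
    x = 0 if gaze_x >= screen_w / 2 else screen_w - win_w
    y = 0 if gaze_y >= screen_h / 2 else screen_h - win_h
    return (x, y, win_w, win_h)
```

A gaze in the bottom-right of a 1920x1080 screen sends the dynamic screen to the top-left corner, and vice versa.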
The method of claim 1, further comprising:
Photographing a user; and
Recognizing a gaze direction of the user based on the photographed image,
wherein the dynamic screen is displayed floating in correspondence with the recognized movement direction of the user's gaze.
delete
The method of claim 1,
wherein, when the dynamic screen display input is received while the live broadcast video is displayed in the first area of the home screen, the displayed live broadcast video is moved to and displayed in the second area as the dynamic screen.
A method of operating an image display apparatus, the method comprising:
Displaying a home screen including at least one card object containing a content list;
Displaying a dynamic screen within the home screen when a dynamic screen display input is received;
Moving and displaying the dynamic screen within the home screen when a dynamic screen movement input is received; and
Switching from the home screen to display an application screen,
wherein the size or display position of the dynamic screen displayed in the application screen differs from the size or display position of the dynamic screen displayed in the home screen.
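The per-screen geometry in this claim, where the dynamic screen keeps a different size and position on the application screen than on the home screen, can be sketched as a layout lookup table. The table contents and names are hypothetical placeholders, not values from the patent.

```python
# Hypothetical per-screen layouts: the dynamic screen carries its own
# position and size for the home screen and for an application screen.
DYNAMIC_SCREEN_LAYOUTS = {
    "home": {"pos": (1440, 810), "size": (480, 270)},
    "app":  {"pos": (20, 20),    "size": (320, 180)},
}


def layout_for(screen_name):
    """Look up the dynamic screen geometry for the active screen."""
    return DYNAMIC_SCREEN_LAYOUTS[screen_name]
```

Switching screens then amounts to re-querying the table and redrawing the dynamic screen with the new geometry.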
An image display apparatus comprising:
A network interface unit configured to exchange data with a server;
A display configured to display a home screen including at least one card object containing a content list, and to display a dynamic screen within the home screen when a dynamic screen display input is received; and
A control unit configured to move and display the dynamic screen within the home screen when a dynamic screen movement input is received,
wherein, when the dynamic screen display input is received while a live broadcast video is displayed in a first area within the home screen, the dynamic screen is displayed in a second area separately from the live broadcast video.
The apparatus of claim 16,
wherein the control unit controls the dynamic screen to be displayed enlarged within the home screen when a dynamic screen enlargement input is received.
The apparatus of claim 16, further comprising:
A user input interface unit configured to receive movement information or a predetermined key input from a remote control device,
wherein the control unit controls the dynamic screen to be displayed in an area other than the pointer when a pointer corresponding to movement of the remote control device is displayed in the home screen.
The apparatus of claim 16,
wherein the control unit controls an object including a screen movement item or a screen enlargement item for the displayed dynamic screen to be displayed overlapping the dynamic screen.
The apparatus of claim 16, further comprising:
A camera configured to photograph a user,
wherein the control unit recognizes a gaze direction of the user based on an image photographed by the camera, and controls the dynamic screen to be displayed in an area of the home screen other than the recognized user gaze direction.
KR1020130011284A 2013-01-25 2013-01-31 Image display apparatus, and method for operating the same KR102039486B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020130011284A KR102039486B1 (en) 2013-01-31 2013-01-31 Image display apparatus, and method for operating the same
US13/922,707 US10031637B2 (en) 2013-01-25 2013-06-20 Image display apparatus and method for operating the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130011284A KR102039486B1 (en) 2013-01-31 2013-01-31 Image display apparatus, and method for operating the same

Publications (2)

Publication Number Publication Date
KR20140098517A KR20140098517A (en) 2014-08-08
KR102039486B1 true KR102039486B1 (en) 2019-11-01

Family

ID=51745293

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130011284A KR102039486B1 (en) 2013-01-25 2013-01-31 Image display apparatus, and method for operating the same

Country Status (1)

Country Link
KR (1) KR102039486B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108073266B (en) * 2017-12-15 2020-12-04 麒麟合盛网络技术股份有限公司 Screen locking method and device of mobile terminal

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR0169307B1 (en) * 1995-12-28 1999-03-20 배순훈 Sub-picture position auto-shifting apparatus at an end of broadcasting program of multi-tv
KR20020068134A (en) * 2001-02-20 2002-08-27 엘지전자주식회사 Method for displaying PIP of a digital TV
KR20100125853A (en) * 2009-05-22 2010-12-01 엘지전자 주식회사 Apparatus and method for controlling contents pip video
KR101730422B1 (en) * 2010-11-15 2017-04-26 엘지전자 주식회사 Image display apparatus and method for operating the same

Also Published As

Publication number Publication date
KR20140098517A (en) 2014-08-08

Similar Documents

Publication Publication Date Title
US10200738B2 (en) Remote controller and image display apparatus having the same
US9182890B2 (en) Image display apparatus and method for operating the same
US10031637B2 (en) Image display apparatus and method for operating the same
US9363570B2 (en) Broadcast receiving apparatus for receiving a shared home screen
CN113259741B (en) Demonstration method and display device for classical viewpoint of episode
KR102058041B1 (en) Image display apparatus, and method for operating the same
KR20110120132A (en) Apparatus for controlling an image display device and method for operating the same
KR20110129715A (en) Image display device and method for operating the same
KR102104438B1 (en) Image display apparatus, and method for operating the same
US9398324B2 (en) Image display apparatus and operation method thereof
KR101847616B1 (en) Image display apparatus, server and method for operating the same
KR102046642B1 (en) Image display apparatus, and method for operating the same
KR101916437B1 (en) Mobile apparatus, image display apparatus, server and method for operating the same
KR20110129714A (en) Image display device and method for operating the same
KR102056165B1 (en) Apparatus for receiving broadcasting and method for operating the same
KR102110532B1 (en) Image display apparatus, and method for operating the same
KR102039486B1 (en) Image display apparatus, and method for operating the same
US9542008B2 (en) Image display apparatus and method for operating the same
KR20130079926A (en) Image display apparatus, server and method for operating the same
KR102281839B1 (en) Apparatus for providing Image
KR20110134090A (en) Image display apparatus and method for operating the same
KR102104445B1 (en) Apparatus for receiving broadcasting and method for operating the same
KR102141046B1 (en) Method for operating Image display apparatus
KR20110128537A (en) Image display device and method for operating the same
KR20160037449A (en) Method for operating and apparatus for providing Image

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant