KR101626310B1 - Image Display Device and Operating Method for the Same - Google Patents


Info

Publication number
KR101626310B1
KR101626310B1 KR1020100026932A
Authority
KR
South Korea
Prior art keywords
osd
osd object
signal
image
eye image
Prior art date
Application number
KR1020100026932A
Other languages
Korean (ko)
Other versions
KR20110107667A (en)
Inventor
주영선
김운영
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020100026932A priority Critical patent/KR101626310B1/en
Priority to US12/959,706 priority patent/US20110227911A1/en
Publication of KR20110107667A publication Critical patent/KR20110107667A/en
Application granted granted Critical
Publication of KR101626310B1 publication Critical patent/KR101626310B1/en

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Human Computer Interaction (AREA)

Abstract

The present invention relates to an image display apparatus and an operating method thereof. A method of operating an image display apparatus according to an exemplary embodiment of the present invention is performed by an image display apparatus capable of three-dimensionally displaying two or more OSD objects. The method includes displaying a first OSD object and displaying a second OSD object having a depth different from that of the first OSD object, wherein at least one of the first OSD object and the second OSD object is composed of a plurality of viewpoint images, and its depth is varied by the interval between the plurality of viewpoint images. The method further includes receiving a user setting signal for at least one of the first OSD object and the second OSD object, and changing at least one of the depth, display position, and display state of the first OSD object or the second OSD object according to the user setting signal. The display states of OSD objects displayed in multiple layers can thereby be selectively controlled according to the user's convenience.

Description

Technical Field: The present invention relates to an image display device and an operation method thereof.

The present invention relates to an image display apparatus and an operation method thereof. More particularly, the present invention relates to an image display apparatus or image display method capable of displaying a plurality of objects in multiple layers.

A video display device is a device having a function of displaying an image that a user can view. The user can view the broadcast through the video display device. A video display device displays a broadcast selected by a user among broadcast signals transmitted from a broadcast station on a display. Currently, broadcasting is being converted from analog broadcasting to digital broadcasting all over the world.

Digital broadcasting refers to broadcasting in which digital video and audio signals are transmitted. Compared to analog broadcasting, digital broadcasting is strong against noise and has low data loss, is advantageous for error correction, has high resolution, and provides a clear screen. Also, unlike analog broadcasting, digital broadcasting is capable of bidirectional service.

Recently, various contents that can be provided through stereoscopic images and stereoscopic images are being actively studied, and stereoscopic image technology is becoming more common and practical in various other environments and technologies as well as computer graphics. Also, the stereoscopic image can be transmitted in the digital broadcasting described above, and a device for reproducing the stereoscopic image is also under development.

Accordingly, an object of the present invention is to provide an image display apparatus, and an operation method thereof, that can display various images such as a user interface image and a broadcast image in multiple layers, and can vary the visual effect of an image according to the depth value of each layer.

It is another object of the present invention to provide an image display apparatus, and an operation method thereof, in which a user can arbitrarily control the depth, display position, and display state of an image displayed in multiple layers, or of the objects constituting the image.

According to an aspect of the present invention, there is provided a method of operating an image display device capable of three-dimensionally displaying two or more OSD objects, the method comprising: displaying a first OSD object; displaying a second OSD object having a depth different from that of the first OSD object, wherein at least one of the first OSD object or the second OSD object is composed of a plurality of viewpoint images, and its depth is varied by the interval between the plurality of viewpoint images; receiving a user setting signal for at least one of the first OSD object and the second OSD object; and changing at least one of the depth, display position, and display state of the first OSD object or the second OSD object according to the user setting signal.

According to another aspect of the present invention, there is provided an image display apparatus capable of displaying two or more OSD objects, including: a controller which generates a first OSD object and a second OSD object having a depth different from that of the first OSD object, wherein at least one of the first OSD object or the second OSD object is composed of a plurality of viewpoint images, and its depth is varied by the interval between the plurality of viewpoint images; a display unit for displaying the first OSD object and the second OSD object; and a user input unit for inputting a user setting signal for at least one of the first OSD object and the second OSD object, wherein the controller changes at least one of the depth, display position, and display state of the first OSD object or the second OSD object according to the user setting signal.
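The relationship described above, in which the interval between the plurality of viewpoint images varies the perceived depth, can be sketched as follows. This is an illustrative sketch only: the function name, the linear use of pixel disparity, and the crossed-disparity sign convention are assumptions made for exposition, not the patent's actual implementation.

```python
def viewpoint_positions(base_x: int, disparity_px: int) -> tuple[int, int]:
    """Return x-coordinates for the left-eye and right-eye copies of an
    OSD object, given a nominal screen position and a disparity interval.

    Crossed-disparity convention (an assumption): shifting the left-eye
    image to the right and the right-eye image to the left makes the
    object appear to protrude toward the viewer; zero disparity places
    the object at screen depth.
    """
    half = disparity_px // 2
    return base_x + half, base_x - half  # (left_eye_x, right_eye_x)

# Two OSD objects at different depths: the second uses a larger interval
# between its viewpoint images, so it is perceived as closer to the user.
osd1 = viewpoint_positions(base_x=100, disparity_px=4)
osd2 = viewpoint_positions(base_x=100, disparity_px=12)
```

A user setting signal that changes the depth of an OSD object would then amount to re-rendering its viewpoint images with a different `disparity_px` value.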

According to an embodiment of the present invention, there is provided an image display apparatus, and an operation method thereof, capable of effectively presenting a plurality of images or objects displayed in multiple layers by using depth and the visual effects produced by varying it. Also, according to the embodiment of the present invention, the user can freely set the depth of an image displayed in multiple layers, or of the objects constituting those layers.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a view illustrating a video display device system according to an embodiment of the present invention.
FIG. 2 is an internal block diagram of a video display device according to an embodiment of the present invention.
FIG. 3 is an internal block diagram of the controller 170 of the image display apparatus according to an embodiment of the present invention.
FIG. 4 is a diagram showing examples of 3D video signal formats capable of implementing 3D video.
FIG. 5 illustrates various scaling schemes of a 3D video signal according to an embodiment of the present invention.
FIG. 6 is a view showing variation in the depth of a 3D image or a 3D object according to an embodiment of the present invention.
FIG. 7 is a view showing how the depth of an image or the like is controlled according to an embodiment of the present invention.
FIGS. 8 and 9 are views illustrating an image display apparatus and a remote control apparatus according to an embodiment of the present invention.
FIG. 10 is a flowchart illustrating an operation method of an image display apparatus according to an embodiment of the present invention.
FIGS. 11 to 13 illustrate a plurality of OSD objects displayed according to an embodiment of the present invention.

Hereinafter, the present invention will be described in detail with reference to the drawings.

The suffixes "module" and "part" for components used in the following description are given merely for convenience of description and do not carry special significance or roles in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 is a view illustrating a video display device system according to an embodiment of the present invention.

The image display apparatus 100 according to an embodiment of the present invention can communicate with the broadcasting station 210, the network server 220, or the external device 230.

The image display apparatus 100 may receive a broadcast signal including a video signal transmitted from the broadcasting station 210. The image display apparatus 100 may process the video signal, audio signal, or data signal included in the broadcast signal so as to be suitable for output from the image display apparatus 100, and can then output an image or sound based on the processed signal.

The image display apparatus 100 can communicate with the network server 220. The network server 220 is a device capable of transmitting and receiving signals to and from the video display device 100 through an arbitrary network. For example, the network server 220 may be a mobile phone terminal that can be connected to the image display apparatus 100 through a wired or wireless base station. In addition, the network server 220 may be a device capable of providing content to the video display device 100 through the Internet network. The content provider can provide the content to the video display device 100 using the network server.

The image display apparatus 100 can communicate with the external device 230. The external device 230 is a device capable of transmitting and receiving signals directly to and from the image display device 100 by wire or wirelessly. For example, the external device 230 may be a media storage device or a playback device used by a user; that is, the external device 230 may be a camera, a DVD or Blu-ray player, a personal computer, or the like.

The broadcasting station 210, the network server 220, or the external device 230 may transmit a signal including a video signal to the video display device 100. The image display apparatus 100 can display an image based on the video signal included in the input signal. The video display apparatus 100 may also transmit a signal received from the broadcasting station 210 or the network server 220 to the external device 230, and may transmit a signal received from the external device 230 to the broadcasting station 210 or the network server 220. That is, in addition to directly reproducing video content itself, the video display device 100 can relay the content included in the signals transmitted among the broadcasting station 210, the network server 220, and the external device 230.

FIG. 2 is an internal block diagram of an image display apparatus according to an embodiment of the present invention.

Referring to FIG. 2, an image display apparatus 100 according to an exemplary embodiment of the present invention includes a broadcast signal receiving unit 110, a network interface unit 120, an external device input/output unit 130, a remote control device interface unit 140, a storage unit 150, a control unit 170, a display unit 180, and an audio output unit 185.

The broadcast signal receiving unit 110 can receive, among the RF (Radio Frequency) broadcast signals received through an antenna from the broadcasting station (see 210 in FIG. 1), the RF broadcast signal corresponding to a channel selected by the user or the RF broadcast signals of all previously stored channels. The broadcast signal receiving unit 110 converts the received RF broadcast signal into an intermediate frequency signal or a baseband video or audio signal, and outputs the converted signal to the controller 170.

In addition, the broadcast signal receiver 110 may receive an RF broadcast signal of a single carrier according to the ATSC (Advanced Television Systems Committee) scheme or an RF broadcast signal of multiple carriers according to the DVB (Digital Video Broadcasting) scheme. The broadcast signal receiving unit 110 may sequentially select the RF broadcast signals of all the broadcast channels stored through the channel memory function among the received RF broadcast signals, and convert them into an intermediate frequency signal or a baseband video or audio signal. This makes it possible to show a thumbnail list including a plurality of thumbnail images corresponding to the broadcast channels on the display unit 180. Accordingly, the broadcast signal receiving unit 110 can receive the RF broadcast signal of the selected channel, or of all the stored channels, sequentially or periodically.

The network interface unit 120 provides an interface for connecting the video display apparatus 100 to a wired / wireless network including the Internet network or a network server (see 220 in FIG. 1).

The network interface unit 120 may include a wireless communication unit capable of wirelessly connecting the video display device 100 to the Internet. For wireless Internet access, WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like may be used.

The network interface unit 120 may receive content or data provided by a content provider or a network operator through a network. That is, it is possible to receive contents such as broadcasts, games, VOD, broadcasting signals, and the related information provided from a contents provider through a network. In addition, the update information and the update file of the firmware provided by the network operator can be received.

In addition, the network interface unit 120 may be connected to a communication network capable of video or voice communication. The communication network may mean a broadcasting communication network connected through a LAN, a public telephone network, a mobile communication network, or the like.

The external device input / output unit 130 can connect the external device (see 230 in FIG. 1) and the image display device 100. To this end, the external device input / output unit 130 may include an A / V input / output unit or a wireless communication unit.

The external device input/output unit 130 can be connected to an external device such as a DVD (Digital Versatile Disc) player, a Blu-ray player, a game device, a camera, a camcorder, or a computer (notebook computer), by wire or wirelessly. The external device input/output unit 130 transmits the video, audio, or data signal input from the outside through the connected external device to the control unit 170 of the video display device 100. Also, the control unit 170 can output the processed video, audio, or data signal to the connected external device.

The A/V input/output unit is a module allowing the video and audio signals of an external device to be input to the video display device 100, and may include an Ethernet terminal, a USB terminal, a CVBS (Composite Video Banking Sync) terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, and a D-SUB terminal.

Further, the wireless communication unit can perform wireless communication with another external device. The image display apparatus 100 can be connected to an external device according to a communication standard such as Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), or ZigBee.

Also, the external device input / output unit 130 may be connected to various set-top boxes via at least one of the various terminals described above to perform input / output operations with the set-top box.

For example, when the set-top box is a set-top box for IP (Internet Protocol) TV, the external device input/output unit 130 can transmit the video, audio, or data signal processed in the IPTV set-top box to the control unit 170. In addition, the controller 170 can transmit the processed signals to the IPTV set-top box.

Meanwhile, the IPTV may include ADSL-TV, VDSL-TV, FTTH-TV, and the like depending on the type of the transmission network, and may include TV over DSL, Video over DSL, TV over IP (BTV), and the like. In addition, IPTV may also mean an Internet TV capable of accessing the Internet, or a full-browsing TV.

The remote control device interface unit 140 may include a wireless communication unit capable of wirelessly transmitting and receiving signals to and from the remote control device 200, and a coordinate calculation unit capable of calculating the coordinates of a pointer corresponding to the motion of the remote control device 200. The remote control device interface unit 140 can wirelessly transmit and receive signals to and from the remote control device 200 through an RF module, and can receive signals transmitted by the remote control device 200 through an IR module according to the IR communication standard.

The coordinate calculation unit of the remote control device interface unit 140 can correct hand tremor or errors in the signal corresponding to the motion of the remote control device 200 received through the wireless communication unit of the remote control device interface unit 140. After correcting the hand tremor or errors, the coordinate calculation unit can calculate the coordinates of the pointer to be displayed on the display of the image display device 100.
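As a rough illustration of the correction step described above, a moving average over recent pointer samples is one simple way to suppress hand tremor. The class name, window size, and the choice of a moving-average filter are hypothetical assumptions for illustration; the patent does not specify the correction algorithm.

```python
from collections import deque


class PointerSmoother:
    """Illustrative hand-tremor filter: averages the last few raw pointer
    coordinates reported by the remote control device, so small jitters
    are damped before the pointer is drawn on the display."""

    def __init__(self, window: int = 5):
        # Only the most recent `window` samples are kept.
        self.samples = deque(maxlen=window)

    def update(self, x: float, y: float) -> tuple[float, float]:
        """Add a raw sample and return the smoothed pointer coordinates."""
        self.samples.append((x, y))
        n = len(self.samples)
        sx = sum(p[0] for p in self.samples)
        sy = sum(p[1] for p in self.samples)
        return sx / n, sy / n
```

A real coordinate calculation unit would likely use a more sophisticated filter (and account for sensor error models), but the interface is the same: raw motion samples in, corrected pointer coordinates out.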

The signal that the remote control device 200 transmits to the image display device 100 through the remote control device interface unit 140 is output to the control unit 170 of the image display device 100. The control unit 170 can determine, from the signal transmitted by the remote control device 200, information about the motion or key operation of the remote control device 200, and can generate and output various control signals for controlling the operation of the image display apparatus 100 accordingly.

As another example, the remote control device 200 may itself calculate the pointer coordinates corresponding to its motion and output them to the remote control device interface unit 140. In this case, the remote control device interface unit 140 can transmit the information on the received pointer coordinates to the control unit 170 without separate tremor or error correction.

The storage unit 150 may store a video signal input to the video display device 100, together with the audio signal and data signal related to it. For example, a moving image storage command may be input to the image display apparatus 100 while it is reproducing a moving image based on a broadcast signal; the image display apparatus 100 may then store at least part of the moving image being reproduced in the storage unit 150. When a stored moving picture playback command is input, the video display device 100 may refer to the video signal, and to the audio and data signals related to it, stored in the storage unit 150, and can reproduce the moving image based on the referenced signals.

The controller 170 controls the overall operation of the image display apparatus 100. The control unit 170 may receive signals transmitted by the remote control device 200 or other types of control command input means. In addition, a command can be input through the local key provided in the image display apparatus 100. The controller 170 determines a command included in the received signal or a command corresponding to a local key operation and controls the image display apparatus 100 to correspond to the command.

For example, when the user inputs a predetermined channel selection command, the control unit 170 controls the broadcast signal receiving unit 110 to receive the broadcast signal provided through the selected channel. The control unit 170 can also process the video and audio signals of the selected channel and output them to the display unit 180 or the audio output unit 185. In addition, information on the channel selected by the user may be output through the display unit 180 or the audio output unit 185 together with the video and audio signals.

The control unit 170 may process the video signal or the audio signal based on the information included in a data signal received together with the video signal or the audio signal. For example, the controller 170 may determine the format of a video signal input to the video display device 100 using the data signal associated with it, and process the video signal so that it fits the determined format.

The controller 170 may generate, from a data signal related to the video signal, an OSD (On Screen Display) signal for displaying an OSD related to the image generated based on the video signal. In addition, the controller 170 may generate a graphic user interface through which the user can check information related to the image display apparatus 100 or input a control command to the image display apparatus 100.

The user can input another type of video or audio output command through the remote control device 200 or another kind of control command input means. For example, the user may wish to view a camera or camcorder video signal input through the external device input/output unit 130 instead of the broadcast signal. In this case, the control unit 170 can process the video or audio signal input through the USB input unit of the external device input/output unit 130 so that it is output through the display unit 180 or the audio output unit 185.

The control unit 170 of the present embodiment can process a video signal so that a 2D or 3D video signal input from the outside can be displayed on the display unit 180. In addition, the controller 170 may process the image signal so that the generated graphic user interface is three-dimensionally displayed on the display unit 180. A detailed description of the control unit 170 will be given later with reference to FIG. 3.

The display unit 180 converts the video signal, data signal, and OSD signal processed by the control unit 170, or the video signal and data signal received through the external device input/output unit 130, into R, G, and B signals to generate a driving signal. The display unit 180 may display a screen according to the generated driving signal. The display unit 180 may be a PDP, an LCD, an OLED, or a flexible display. Also, the image display apparatus 100 and the display unit 180 according to the embodiment of the present invention are capable of 3D display.

The 3D display can be divided into an additional display method and a single display method depending on the way in which the 3D image is recognized by the user.

In the single display method, a 3D image is implemented on the display alone, without a separate auxiliary device. A user viewing a display that uses the single display method can view a 3D image without an additional device (e.g., polarized glasses). The single display method includes the lenticular method, the parallax barrier method, and the like.

The additional display method is a method of implementing a 3D image using an auxiliary device. Additional display methods include a head-mounted display (HMD) system, a glasses system, and the like. Further, polarized glasses, shutter glasses, spectral filters, and the like can be applied to the glasses used in the glasses system.

On the other hand, the display unit 180 may be implemented as a touch screen, so that it functions as an input device as well as an output device.

The audio output unit 185 receives a signal processed by the control unit 170, for example a stereo signal, a 3.1-channel signal, or a 5.1-channel signal, and outputs it as sound. The audio output unit 185 may be implemented by various types of speakers.

FIG. 3 is an internal block diagram of the controller 170 of the image display apparatus according to an embodiment of the present invention.

The controller 170 may include a demodulator 171, a demultiplexer 172, a decoder 173, an OSD generator 174, and a formatter 175. The demodulator 171 may demodulate the broadcast signal received by the broadcast signal receiver 110.

For example, the demodulator 171 may receive the digital IF signal DIF converted by the broadcast signal receiver 110 and perform a demodulation operation. The demodulator 171 may also perform channel decoding. For this, the demodulator 171 may include a convolution decoder, a deinterleaver, and a Reed-Solomon decoder to perform convolutional decoding, deinterleaving, and Reed-Solomon decoding.

The demodulator 171 can demodulate and decode the channel and output the stream signal TS. The stream signal may be a signal in which a video signal, a voice signal, or a data signal is multiplexed. For example, the stream signal may be an MPEG-2 TS (Transport Stream) multiplexed with an MPEG-2 standard video signal, a Dolby AC-3 standard audio signal, or the like. Specifically, the MPEG-2 TS may include a header of 4 bytes and a payload of 184 bytes.
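The 4-byte header and 184-byte payload mentioned above make up the standard 188-byte MPEG-2 TS packet (ISO/IEC 13818-1). A minimal parser for that header can illustrate the structure; the function itself is an illustrative sketch and not part of the patent.

```python
def parse_ts_header(packet: bytes) -> dict:
    """Parse the 4-byte header of a 188-byte MPEG-2 TS packet.

    Byte 0 is the sync byte 0x47; bytes 1-3 carry the error indicator,
    payload-unit-start flag, PID, scrambling control, adaptation-field
    flags, and continuity counter. The remaining 184 bytes are payload.
    """
    if len(packet) != 188 or packet[0] != 0x47:
        raise ValueError("not a valid TS packet")
    b1, b2, b3 = packet[1], packet[2], packet[3]
    return {
        "transport_error": bool(b1 & 0x80),
        "payload_unit_start": bool(b1 & 0x40),
        "pid": ((b1 & 0x1F) << 8) | b2,       # 13-bit packet identifier
        "scrambling": (b3 >> 6) & 0x03,
        "adaptation_field": bool(b3 & 0x20),
        "has_payload": bool(b3 & 0x10),
        "continuity_counter": b3 & 0x0F,
    }
```

The demultiplexer described below separates video, audio, and data by grouping payloads according to this PID field.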

The demodulator 171 may be separately provided according to the ATSC scheme and the DVB scheme. The stream signal output from the demodulator 171 may be input to the demultiplexer 172.

The demultiplexer 172 demultiplexes the received stream signal, for example, the MPEG-2 TS into a video signal, a voice signal, and a data signal. The stream signal input to the demultiplexer 172 may be a stream signal output from the demodulator 171, the network interface unit 120, and the external device input / output unit 130.

The demultiplexed data signal may be an encoded data signal. The encoded data signal may include EPG (Electronic Program Guide) information containing broadcast information such as the name, start time, and end time of the programs broadcast on each channel. For example, the EPG information may be ATSC-PSIP (ATSC Program and System Information Protocol) information in the case of the ATSC scheme, and DVB-SI (DVB Service Information) information in the case of the DVB scheme.

The decoder unit 173 can decode the demultiplexed signal. The decoder unit 173 of the present embodiment may include a video decoder 173a for decoding the demultiplexed video signal and a scaler 173b for adjusting the resolution of the decoded video signal so that it can be output from the video display device 100.

The OSD generation unit 174 may generate an OSD signal so that an object is displayed on the display unit 180 as an OSD. The OSD may display information related to the image shown on the display unit 180. In addition, the OSD may include a user interface for receiving a control signal or a user command for controlling the operation of the image display apparatus 100.

The OSD generation unit 174 according to the embodiment of the present invention can extract a thumbnail image corresponding to the playback point of content that is being played or can be played in the image display apparatus 100. The OSD generating unit 174 may generate an OSD signal so that a 3D object including the extracted thumbnail image can be recognized by the user, and output the generated OSD signal to the formatter 175.

The formatter 175 can determine the format of the input video signal by referring to the data signal related to the video signal. The formatter 175 converts the input video signal into a format suitable for the display unit 180 and outputs the converted video signal to the display unit 180.

The video display device 100 of this embodiment can display a 3D image on the display unit 180. In this case, the formatter 175 may generate a 3D video signal according to a predetermined format suitable for displaying the input video signal on the display unit 180. In an embodiment of the present invention, the 3D video signal may include a left eye image signal and/or a right eye image signal. As described above, in the embodiment of the present invention, a left eye image and a right eye image are used to implement a 3D image: the left eye image signal is displayed as the left eye image, and the right eye image signal as the right eye image. The formatter 175 outputs the generated 3D video signal to the display unit 180, and the display unit 180 displays a 3D image based on it.

In the present embodiment, the image display apparatus 100 may display the OSD as a 3D object according to the OSD signal generated by the OSD generating unit 174. The formatter 175 converts the OSD signal generated by the OSD generating unit 174 so that the plurality of viewpoint images constituting the 3D object, for example the left eye image and the right eye image, take a format that can be displayed on the display unit 180, and outputs the converted 3D video signal to the display unit 180.

When the video display apparatus 100 has a separate user interface generating unit, it may further include a mixer capable of mixing the video signals output from the decoder unit 173 and the OSD generating unit 174 with the user interface video signal output from the user interface generating unit. Such a mixer may be provided in the formatter 175.

FIG. 4 is a diagram showing examples of 3D video signal formats capable of implementing a 3D image. The 3D video signal format may be determined according to the way the left eye image and the right eye image generated to implement the 3D image are arranged.

The 3D image may be a multiple view image. The user can view the multiple view image through the left eye and the right eye. The user can feel the stereoscopic effect of the 3D image through the difference of the images detected through the left eye and the right eye. A multi-view image for implementing a 3D image is composed of a left eye image that the user can recognize through the left eye and a right eye image that can be recognized through the right eye according to an embodiment of the present invention.

As shown in FIG. 4A, the manner in which the left eye image and the right eye image are arranged left and right is referred to as a side by side format. As shown in FIG. 4B, a method of arranging the left eye image and the right eye image up and down is referred to as a top / down format. As shown in FIG. 4C, the method of arranging the left eye image and the right eye image in a time-division manner is called a frame sequential format. As shown in FIG. 4D, a method of mixing the left eye image and the right eye image line by line is referred to as an interlaced format. As shown in FIG. 4E, a method of mixing the left eye image and the right eye image by box is called a Checker Box format.
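The side-by-side and top/down arrangements described above can be sketched as simple frame-splitting operations. The function names and the list-of-rows frame representation are assumptions made for illustration; a real formatter would operate on decoded pixel buffers.

```python
def split_side_by_side(frame):
    """Split a side-by-side packed frame (a list of pixel rows) into the
    left-eye image (left half) and right-eye image (right half),
    corresponding to the arrangement of FIG. 4a."""
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    return left, right


def split_top_down(frame):
    """Split a top/down packed frame into the left-eye image (top half)
    and right-eye image (bottom half), corresponding to FIG. 4b."""
    half = len(frame) // 2
    return frame[:half], frame[half:]
```

The frame sequential, interlaced, and checker box formats differ only in whether the two viewpoint images alternate by frame, by line, or by pixel block, but the same idea applies: the formatter must recover the two viewpoint images from a single packed signal, or pack them into the arrangement the display unit supports.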

A video signal included in a signal input from the outside to the image display apparatus 100 may be a 3D image signal capable of implementing a 3D image. In addition, the graphic user interface video signal generated to display information related to the video display device 100 or to input a command related to the video display device 100 may be a 3D video signal. The formatter 175 may mix the 3D video signal included in the signal inputted from the outside to the video display device 100 and the graphic user interface 3D video signal and output the mixed signal to the display unit 180.

The formatter 175 can determine the format of the mixed 3D video signal by referring to the related data signal. The formatter 175 may process the 3D video signal to fit the determined format and output the processed 3D video signal to the display unit 180. When the 3D image signal formats that the display unit 180 can output are limited, the formatter 175 converts the received 3D image signal into a 3D image signal format that the display unit 180 can output, and outputs the converted signal to the display unit 180.

The OSD generation unit 174 may generate an OSD (On Screen Display) signal. Specifically, the OSD generation unit 174 may generate a signal for displaying various kinds of information on the screen of the display unit 180 as a graphic or text, based on at least one of a video signal and a data signal, or on a user input signal input through a remote control device or another kind of control command input means. The OSD generating unit 174 may also generate a signal for displaying a graphic or text through which a control command can be input to the image display apparatus 100. The generated OSD signal may be output to the display unit 180 together with the processed image signal and the processed data signal.

The OSD signal, as a signal generated for graphic or text display, may include information on a user interface screen, various menu screens, widgets, icons, and the like that the display unit 180 can display. The OSD generation unit 174 may generate the OSD signal as a 2D video signal or a 3D video signal. The OSD signal generated by the OSD generating unit 174 may include a graphical user interface 3D video signal that is mixed with another video signal in the formatter 175.

The display unit 180 may display the object according to the OSD signal generated by the OSD generating unit 174. The object of this embodiment may be one of a volume control button, a channel control button, an image display device control menu, an icon, a navigation tab, a scroll bar, a progress bar, a text box, and a window.

The user can recognize information on the image display apparatus 100, or information on the image displayed on the image display apparatus 100, through the object displayed on the display unit 180. In addition, a command can be input to the image display apparatus 100 through the object displayed on the image display apparatus 100. Meanwhile, a 3D object in the present specification is an object to which a stereoscopic effect is applied so as to have a stereoscopic feel. The 3D object may be a PIP image, an EPG indicating broadcast program information, various menus of the video display device, a widget, an icon, and the like.

FIG. 5 is a diagram illustrating various scaling methods of a 3D image signal according to an exemplary embodiment of the present invention. Referring to FIG. 5, scaling or tilting of a 3D image signal or a 3D object will be described.

The controller 170, or a module for image processing such as a scaler included in the controller 170, may enlarge or reduce a 3D object 510 in the 3D image signal, or the 3D image signal as a whole, at a certain ratio (513 in FIG. 5(a)). This is a general function of image processing modules such as a scaler.

In addition, the controller 170 may generate or transform the screen into a polygon such as a trapezoid or a parallelogram in order to display an image rotated at a certain angle or inclined in a certain direction. A video signal already processed into a parallelogram or trapezoid shape for display of a tilted or rotated screen may also be received. When a 3D image signal or a 3D object corresponding to an OSD is generated in the controller 170 and the corresponding 3D image signal is output to the display, the controller 170 may display the 3D object in a trapezoid shape (516) as shown in FIG. 5(b) or in a parallelogram shape (519) as shown in FIG. 5(c).

An image received from a broadcasting station (see 210 in FIG. 1), a network server (see 230 in FIG. 1), or an external input device (see 230 in FIG. 1) may be enlarged or reduced as shown in FIG. 5(a), or may be generated or processed into the trapezoid 516 or parallelogram 519 shape shown in FIG. 5(b) or FIG. 5(c), so that the three-dimensional effect of the 3D image signal, or of a 3D object in the 3D image signal, can be further emphasized. In addition, this can serve as a factor for diversifying and maximizing the stereoscopic effect of the image sensed by the user.

The slope effect or rotation effect applied to the image is determined by the shape of the image: it can be adjusted by increasing or decreasing the difference in length between the parallel sides of the trapezoid shape 516 illustrated in FIG. 5(b), or the diagonal offset of the parallelogram shape 519 illustrated in FIG. 5(c).

In this case, even within a single 3D image or 3D object, different parallax intervals may be applied to each part, resulting in a tilting effect. In other words, in order to make the image appear tilted or rotated, a part having a larger depth and a part having a smaller depth coexist in the 3D image or 3D object. This means that the parallax interval for a pair of left eye and right eye images varies from part to part within the image.
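As a rough illustration of this part-by-part parallax, the sketch below (an assumption for illustration, not from the original text) varies the parallax interval linearly from the top row to the bottom row of an object, so one edge appears deeper than the other:

```python
def tilt_parallax(base_interval, delta, num_rows):
    # Linearly interpolate the left/right eye parallax interval from
    # base_interval at the top row to base_interval + delta at the bottom
    # row, so the object appears tilted about a horizontal axis.
    return [base_interval + delta * row / (num_rows - 1)
            for row in range(num_rows)]

intervals = tilt_parallax(10.0, 4.0, 5)
assert intervals[0] == 10.0 and intervals[-1] == 14.0
# Each row has a slightly larger interval, i.e. a different depth,
# than the previous one.
assert all(b > a for a, b in zip(intervals, intervals[1:]))
```

A trapezoid or parallelogram shape would additionally change the drawn outline of the object; the interval schedule above only models the depth side of the tilt effect.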

When one of the left eye image and the right eye image for displaying the 3D image or 3D object is generated in the shape shown in FIG. 5 by the scaler or the OSD generating unit in the controller 170, the generated left eye or right eye image may be copied to generate the remaining image, so that a pair of binocular images is obtained.

On the other hand, the scale adjustment of the 3D image signal or the 3D object may be performed by the formatter 175 of the controller 170 described above. The 3D image signal and the like in FIG. 5 may be a left eye image signal, a right eye image signal, or a signal in which a left eye image signal and a right eye image signal are combined.

The formatter 175 receives the decoded video signal, separates the 2D video signal from the 3D video signal, and divides the 3D video signal into a left eye video signal and a right eye video signal. The left eye image signal and the right eye image signal may then be scaled into one or more of the various examples shown in FIG. 5 and output in a predetermined format such as those shown in FIG. 4. The scaling may be performed before or after the output format is formed.

The formatter 175 receives the OSD signal of the OSD generating unit 174, or the OSD signal mixed with the decoded video signal, separates out the 3D video signal, and divides the 3D video signal into a plurality of viewpoint video signals. For example, the 3D image signal may be separated into a left eye image signal and a right eye image signal, and the separated left eye image signal and right eye image signal may be scaled as shown in FIG. 5 and output in a predetermined format such as those shown in FIG. 4.

Meanwhile, the OSD generation unit 174 may directly perform the image signal generation process or the scaling process described above with respect to the OSD output. When the OSD generating unit 174 directly performs scaling on the OSD, the formatter 175 does not need to perform scaling on the OSD. In this case, the OSD generating unit 174 not only generates the OSD signal but also scales the OSD signal according to the depth or inclination of the OSD, and outputs the OSD signal in a suitable format. The format of the OSD signal output from the OSD generation unit 174 may be any of the various formats of a left eye image signal, a right eye image signal, or combined left eye and right eye images, as shown in FIG. 4. The output format is the same as the output format of the formatter 175.

FIG. 6 is a view showing a variation in depth of a 3D image or a 3D object according to an embodiment of the present invention.

According to the embodiment of the present invention described above, the 3D image is composed of multi-view images, and the multi-view images can be exemplified as a left eye image and a right eye image. FIG. 6 shows how the position at which an image is recognized changes according to the interval between the left eye image and the right eye image. Referring to FIG. 6, the three-dimensional sensation or the sense of perspective of an image felt by the user according to the interval or parallax between the left eye image and the right eye image will be described.

In Fig. 6, a plurality of images or a plurality of objects having different depths are shown. These objects are referred to as a first object 615, a second object 625, a third object 635, and a fourth object 645.

The first object 615 is composed of a first left eye image based on a first left eye image signal and a first right eye image based on a first right eye image signal. That is, the video signal for displaying the first object is composed of the first left eye image signal and the first right eye image signal. FIG. 6 shows the positions at which the first left eye image based on the first left eye image signal and the first right eye image based on the first right eye image signal are displayed on the display unit 180, and the interval between the first left eye image and the first right eye image displayed on the display unit 180. The description of the first object may likewise be applied to the second to fourth objects. Hereinafter, for convenience of explanation, the left eye image and the right eye image displayed on the display unit 180 for one object, the interval set between the two images, and that object are denoted with matching serial numbers.

The first object 615 is composed of a first right eye image 613 (indicated by R1 in FIG. 6) and a first left eye image 611 (indicated by L1 in FIG. 6). The interval between the first right eye image 613 and the first left eye image 611 is set to d1. The user recognizes that an image is formed at the point where an extension line connecting the left eye 601 and the first left eye image 611 intersects an extension line connecting the right eye 603 and the first right eye image 613. Accordingly, the user recognizes the first object 615 as being located behind the display unit 180. The distance between the display unit 180 and the first object 615 recognized by the user can be expressed as a depth. In this embodiment, a 3D object recognized by the user as if positioned behind the display unit 180 has a negative depth value (-). Therefore, the depth of the first object 615 has a negative value.

The second object 625 is composed of a second right eye image 623 (represented by R2) and a second left eye image 621 (represented by L2). According to the present embodiment, the second right eye image 623 and the second left eye image 621 are displayed at the same position on the display unit 180, so the interval between them is zero. The user recognizes that an image is formed at the point where an extension line connecting the left eye 601 and the second left eye image 621 intersects an extension line connecting the right eye 603 and the second right eye image 623. Accordingly, the user recognizes the second object 625 as being displayed on the display unit 180. In this case, the second object 625 may be referred to as either a 2D object or a 3D object. The second object 625 is an object having the same depth as the display unit 180, and the depth of the second object 625 is zero.

The third object 635 and the fourth object 645 are examples for explaining 3D objects recognized as protruding from the display unit 180 toward the user. The examples of the third object 635 and the fourth object 645 also show that the degree of perspective or stereoscopic feeling perceived by the user varies with the interval between the left eye image and the right eye image.

The third object 635 is composed of a third right eye image 633 (represented by R3) and a third left eye image 631 (represented by L3). The interval between the third right eye image 633 and the third left eye image 631 is set to d3. The user recognizes that an image is formed at the intersection of an extension line connecting the left eye 601 and the third left eye image 631 and an extension line connecting the right eye 603 and the third right eye image 633. Accordingly, the user recognizes the third object 635 as being positioned ahead of the display unit 180, i.e., nearer to the user. That is, the third object 635 is recognized by the user as if projected from the display unit 180 toward the user. In this embodiment, a 3D object recognized by the user as if positioned ahead of the display unit 180 has a positive depth value (+). Thus, the depth of the third object 635 has a positive value.

The fourth object 645 is composed of a fourth right eye image 643 (represented by R4) and a fourth left eye image 641 (represented by L4). The interval between the fourth right eye image 643 and the fourth left eye image 641 is set to d4, where the inequality d3 < d4 holds between d3 and d4. The user recognizes that an image is formed at the intersection of an extension line connecting the left eye 601 and the fourth left eye image 641 and an extension line connecting the right eye 603 and the fourth right eye image 643. Therefore, the user recognizes the fourth object 645 as being located in front of the display unit 180, i.e., closer to the user, and as being positioned closer to the user than the third object 635. That is, the fourth object 645 is perceived by the user as protruding beyond both the display unit 180 and the third object 635 toward the user. The depth of the fourth object 645 has a positive value.

By adjusting the positions at which the left eye image and the right eye image are displayed on the display unit 180, the image display apparatus 100 can make the user perceive an object composed of the left eye image and the right eye image as being located behind or in front of the display unit 180. In addition, the image display apparatus 100 can control the depth sense of an object composed of the left eye image and the right eye image by adjusting the display interval between the left eye image and the right eye image displayed on the display unit 180.

That is, according to the description with reference to FIG. 6, whether the depth of an object composed of a left eye image and a right eye image has a positive value (+) or a negative value (-) is determined by the left and right display positions of the left eye image and the right eye image. As described above, an object whose depth has a positive value (+) is recognized by the user as if protruding from the display unit 180. An object whose depth has a negative value (-) is recognized by the user as if positioned behind the display unit 180.

Referring to FIG. 6, the depth of an object, that is, the distance between the point at which the 3D image is recognized by the user and the display unit 180, can vary according to the absolute value of the interval between the left eye image and the right eye image.
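The sign and magnitude conventions of FIG. 6 can be summarized in a small sketch. This is illustrative only; the linear `gain` factor is an assumption, since the actual perceived distance depends on the viewing geometry:

```python
def perceived_depth(left_x, right_x, gain=1.0):
    # Uncrossed disparity (left eye image to the left of the right eye
    # image) -> the object is perceived behind the display: negative depth.
    # Crossed disparity (left eye image to the right) -> perceived in front
    # of the display: positive depth. Zero interval -> depth 0.
    return gain * (left_x - right_x)

# First object (behind the display, like 615): negative depth.
assert perceived_depth(100, 110) < 0
# Second object (interval zero, like 625): depth 0.
assert perceived_depth(100, 100) == 0
# Third and fourth objects (crossed, like 635/645): positive depth,
# growing with the interval (d3 < d4 implies depth3 < depth4).
assert 0 < perceived_depth(110, 100) < perceived_depth(120, 100)
```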

FIG. 7 is a view illustrating a manner in which the depth of an image is controlled according to an embodiment of the present invention. Referring to FIG. 7, it can be seen that the depth of the same image or the same 3D object changes depending on the interval between the left eye image 701 and the right eye image 702 displayed on the display unit 180. In this embodiment, the depth of the display unit 180 is set to zero, and the depth of an image recognized as projecting from the display unit 180 is set to a positive value.

The interval between the left eye image 701 and the right eye image 702 shown in FIG. 7(a) is a, and the interval between the left eye image 701 and the right eye image 702 shown in FIG. 7(b) is b, where b is greater than a. That is, the interval between the left eye image 701 and the right eye image 702 in the example shown in FIG. 7(b) is wider than in the example shown in FIG. 7(a).

In this case, as described with reference to FIG. 6, the depth feeling of the 3D image or 3D object shown in FIG. 7(b) is larger than that of the 3D image or 3D object shown in FIG. 7(a). When the depth in each case is expressed numerically as a′ and b′ respectively, the relationship a′ < b′ holds according to the relation a < b. That is, when a 3D image is to appear protruding, the expressed depth feeling can be made larger or smaller by widening or narrowing the interval between the left eye image 701 and the right eye image 702.
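One way to widen or narrow the interval is sketched below as an assumption: each eye image is moved by half of the change, symmetrically, which realizes the transition from interval a to interval b (and hence from depth feeling a′ to b′) without moving the object on screen:

```python
def adjust_interval(left_x, right_x, delta):
    # Move the two eye images apart (delta > 0) or together (delta < 0)
    # symmetrically, changing their interval from a to a + delta while
    # keeping the midpoint (the on-screen position) fixed.
    sign = 1 if left_x >= right_x else -1
    return left_x + sign * delta / 2, right_x - sign * delta / 2

lx, rx = adjust_interval(110, 100, 10)   # interval 10 -> 20
assert (lx, rx) == (115.0, 95.0)
assert abs(lx - rx) == 20.0
assert (lx + rx) / 2 == (110 + 100) / 2  # midpoint unchanged
```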

FIGS. 8 and 9 are views illustrating an image display apparatus and a remote control apparatus according to an embodiment of the present invention.

The image display apparatus 100 can be controlled by a signal transmitted from the remote control apparatus 200. The user can input commands such as power on/off, channel up/down, and volume up/down to the video display device 100 using the remote control device 200. The remote control apparatus 200 transmits a signal including a command corresponding to the operation of the user to the image display apparatus 100. The image display apparatus 100 can discriminate the signal received from the remote control device 200 and generate a control signal according to the signal, or perform an operation according to a command included in the signal.

The remote control apparatus 200 can transmit a signal to the image display apparatus 100 according to the IR communication standard. The remote control device 200 may also transmit a signal to the image display device 100, or receive a signal transmitted from the image display device 100, according to another type of wireless communication standard. There may also be a remote control device 200 that can detect a movement of its user and transmit a signal including a command corresponding to the motion to the image display device 100. In the present embodiment, such a remote control device 200 will be described using a spatial remote controller as an example. According to various embodiments of the present invention, in addition to a spatial remote controller, a general wired/wireless mouse, an air mouse, various pointing means, and remote controllers of various forms (ring, bracelet, thimble, and the like) can be used.

Referring to FIGS. 8 and 9, one of the remote control devices 200 that can input commands to the video display device 100 for remote control of the video display device 100 is a spatial remote controller 201, and FIGS. 8 and 9 include perspective views of the spatial remote controller 201.

In this embodiment, the spatial remote controller 201 can transmit signals to and receive signals from the image display apparatus 100 in accordance with the RF communication standard. As shown in FIG. 8, a pointer 202 corresponding to the spatial remote controller 201 may be displayed on the image display device 100.

The user can move or rotate the spatial remote controller 201 up, down, left, right, back and forth. The pointer 202 displayed on the video display device 100 corresponds to the motion of the spatial remote controller 201. FIG. 9 shows the pointer 202 displayed on the image display device 100 moving in response to movement of the spatial remote controller 201.

In the example described with reference to FIG. 9, when the user moves the spatial remote controller 201 to the left, the pointer 202 displayed on the video display device 100 also moves to the left correspondingly. For this purpose, the spatial remote controller 201 may be provided with a sensor capable of detecting movement. Information on the motion of the spatial remote controller 201 sensed through the sensor of the spatial remote controller 201 is transmitted to the image display apparatus 100. The image display apparatus 100 can calculate the coordinates of the pointer 202 from the information on the motion of the spatial remote controller 201 and display the pointer 202 at the calculated coordinates.
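The coordinate calculation described above might be organized as in the following sketch. It is hypothetical: the patent does not specify the actual mapping, so here the deltas reported by the sensor are simply accumulated and clamped to the screen:

```python
class PointerTracker:
    """Accumulates motion deltas from the spatial remote controller and
    clamps the pointer 202 to the screen of the display unit."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        self.x, self.y = width // 2, height // 2  # start at screen center

    def update(self, dx, dy):
        # dx/dy: horizontal/vertical motion reported by the sensor unit.
        self.x = max(0, min(self.width - 1, self.x + dx))
        self.y = max(0, min(self.height - 1, self.y + dy))
        return self.x, self.y

tracker = PointerTracker(1920, 1080)
assert tracker.update(-100, 50) == (860, 590)  # moving left moves the pointer left
assert tracker.update(-10000, 0) == (0, 590)   # pointer never leaves the screen
```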

As shown in FIGS. 8 and 9, the pointer 202 displayed on the image display apparatus 100 can be moved corresponding to the up, down, left, right, or rotation of the spatial remote controller 201. The moving speed and the moving direction of the pointer 202 may correspond to the moving speed and the moving direction of the spatial remote controller 201.

For the aforementioned series of operations or functions of the spatial remote controller 201, the spatial remote controller 201 may include a remote control wireless communication unit, a user input unit, a sensor unit, a remote control signal output unit, a power supply unit, a remote control information storage unit, a remote control unit, and the like. The remote control unit of the spatial remote controller processes information or signals sensed by the user input unit and/or the sensor unit to generate a remote control signal. The remote control signal can be generated based on, for example, information as to which part of the keypad or buttons corresponding to the user input unit received pressure or touch, the time during which the pressure or touch was sustained, and the coordinates or angles through which the spatial remote controller moved or rotated as sensed by the sensor unit.

The remote control signal generated through the above process is transmitted to the image display device through the remote control wireless communication unit. More specifically, the remote control signal output through the remote control wireless communication unit is input to the remote control device interface unit 140 of the video display device. The remote control wireless communication unit may also receive wired/wireless signals transmitted by the video display device.

The remote control information storage unit stores various kinds of programs and application data necessary for controlling or operating the image display apparatus or the spatial remote controller. For example, when wireless communication is performed between the video display device and the spatial remote controller over a certain frequency band, the remote control information storage unit stores that frequency band, and the remote control information related to the frequency band can be referred to in subsequent communication.

In addition, the power supply unit is a module that supplies the power necessary for driving the spatial remote controller. According to one example, the remote control unit temporarily stops the power supply, or outputs a signal instructing the power supply to restart, depending on whether the spatial remote controller sensed by the sensor unit is moving; in this way, power can be saved while the spatial remote controller is not being used or operated.

As another example, a predetermined command may be set to be input to the image display apparatus 100 in response to the movement of the spatial remote controller 201. That is, even if no pressure or touch is detected at the user input unit, a predetermined command can be input or generated by the motion of the spatial remote controller alone. For example, when the spatial remote controller 201 is moved back and forth, the size of the image displayed on the image display device 100 may be enlarged or reduced. These examples of the spatial remote controller do not limit the scope of the present invention.

FIG. 10 is a diagram illustrating an operation method of an image display apparatus according to an embodiment of the present invention.

Of course, the image display device here refers to an image display device capable of displaying an image three-dimensionally. In order to represent an image stereoscopically, the method illustrated in this embodiment of the present invention displays images of a plurality of viewpoints having a predetermined parallax interval for one 3D image or 3D object.

First, the control unit 170 generates a first OSD object and a second OSD object. Here, the first OSD object and the second OSD object are given different depth senses by setting different depth values. The depth value is 0 when the object lies in the same plane as the display unit 180, negative when the object appears to recede behind the display unit 180 away from the user, and positive when the object appears to protrude in front of the display unit 180.

There is no limitation on the depth values of the first OSD object and the second OSD object. However, in the embodiment described with reference to FIG. 10, it is assumed that the first OSD object and the second OSD object are displayed in multiple layers. Of course, in another embodiment of the present invention, the initial depth values of the first OSD object and the second OSD object may be set equal to each other, and the depth value of the first OSD object and the depth value of the second OSD object may then be changed differently.

The first OSD object and the second OSD object may be displayed in multiple layers both in the case where their initial depth values are set differently and in the case where their depth values come to differ by a user setting signal. That is, a plurality of OSD objects are displayed on the display unit 180 in multiple layers, with the first OSD object and the second OSD object located on different layers. Alternatively, the first OSD object and the second OSD object may be included in one layer but given different depth senses by changing their depth values, or the first OSD object and the second OSD object may be displayed as constituent elements of different layers.

Here, at least one of the first OSD object and the second OSD object may be composed of a plurality of viewpoint images. That is, both the first OSD object and the second OSD object may be composed of a plurality of viewpoint images, or only one of them may be. In this case, the depth value of the OSD object described above can be increased or decreased through the interval between the plurality of viewpoint images constituting each OSD object, i.e., the left eye image and the right eye image.

The display unit 180 displays the first OSD object and the second OSD object (S1010). The user input unit receives a user setting signal and transmits it to the controller 170 (S1020). The user setting signal may target one or more OSD objects. Alternatively, the user may control a plurality of OSD objects with a single user setting signal by selecting or activating one or more OSD objects when inputting the user setting signal.

The user input unit may receive a user setting signal from a remote control device connected to the video display device. The remote control device may be a remote control device that senses the motion of the remote control device and transmits a signal including a command corresponding to the motion to the image display device.

If the user inputs a user setting signal using a remote controller such as the spatial remote controller, the controller 170 can recognize the user setting signal as targeting the OSD object displayed at the point where the cursor or pointer of the spatial remote controller is located. In this case, the remote control device interface unit (140 of FIG. 2) described above can serve as the user input unit.

When the user setting signal is input, the controller 170 changes the depth sense, display position, or display state of the OSD object designated by the user setting signal according to the setting information included in the user setting signal (S1030).

For example, when the user setting signal designates the first OSD object and the setting information included in the user setting signal indicates that the depth value should be increased by 20, the controller 170 increases the depth value of the first OSD object by 20, modifying the settings of the first OSD object or regenerating the first OSD object. The depth sense means the protruding effect or the receding effect of the OSD object represented by the depth value, as described above. A user setting signal for controlling the depth of an OSD object can be referred to as an OSD depth control signal.

Here, the OSD object displayed in 3D is composed of a plurality of viewpoint images, which may mean a left eye image and a right eye image. Therefore, when controlling the depth of the OSD object, the controller 170 may increase or decrease the depth value by increasing or decreasing the interval between the left eye image and the right eye image. Accordingly, when a depth control signal for the first OSD object is input, the controller 170 adjusts the interval between the left eye image and the right eye image of the first OSD object and outputs the result to the display unit 180, while maintaining the interval between the left eye image and the right eye image of the second OSD object.
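Step S1030 for an OSD depth control signal could then look like the following sketch. It is illustrative and hypothetical: OSD objects are modeled as dicts of eye image positions, and only the designated object is touched:

```python
def apply_depth_control(osd_objects, target, depth_delta, gain=0.5):
    # Increase or decrease only the designated OSD object's left/right eye
    # image interval; the other OSD objects keep their intervals and
    # therefore their depth senses.
    obj = osd_objects[target]
    obj["left_x"] += gain * depth_delta
    obj["right_x"] -= gain * depth_delta
    return osd_objects

osds = {
    "first":  {"left_x": 105.0, "right_x": 100.0},
    "second": {"left_x": 200.0, "right_x": 200.0},
}
apply_depth_control(osds, "first", 20)
assert osds["first"]["left_x"] - osds["first"]["right_x"] == 25.0
assert osds["second"]["left_x"] == osds["second"]["right_x"]  # untouched
```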

In addition to adjusting the interval between the left eye image and the right eye image of the OSD object, the control unit 170 can obtain a slope effect for the OSD object by generating the shape of the OSD object in various ways. As described above, the control unit 170 generates the shape of the OSD object as a trapezoid or parallelogram according to the direction in which the OSD object is inclined or rotated, and a change in the spacing between the viewpoint images may accompany this. Accordingly, when the user setting signal is a tilt control signal for controlling the tilt of the first OSD object or the second OSD object, the controller 170 can give a tilt effect to the OSD object by changing the shape of the left eye image and the right eye image to a shape such as a parallelogram or a trapezoid and by changing the interval between the left eye image and the right eye image.

The display position is information indicating where the OSD object is displayed on the plane including the display unit 180, or on a plane parallel to the display unit 180, regardless of the depth value. For example, assuming the plane including the display unit 180 is an X-Y plane, the display position of the OSD object may be represented by the X-Y coordinate value of the OSD object on the X-Y plane.

Accordingly, when the user setting signal is an OSD position control signal for changing the display position of the OSD object, the controller 170 controls the display position by changing the coordinate value of at least one of the first OSD object and the second OSD object designated by the OSD position control signal. From the viewpoint of the user, the first OSD object or the second OSD object appears to move as the OSD position control signal is input.

Also, the user setting signal may be an OSD display control signal for changing the display state of the OSD object. The display state of the OSD object may mean the color temperature, transparency, brightness, and sharpness of the OSD object. Therefore, the OSD display control signal includes information on which OSD object is to be targeted and information on how to change the color temperature, transparency, brightness, or sharpness of the OSD object.

Here, the display state of the OSD object may be adjusted based on the depth value of the OSD object. That is, the color temperature, transparency, brightness, or sharpness of the OSD object may be increased or decreased according to the depth value set for the OSD object, or according to the depth sense of the OSD object. For example, when OSD objects are displayed in multiple layers, OSD objects belonging to the same layer, or OSD objects having the same depth value, may have the same color temperature, transparency, brightness, or sharpness.

For example, when the OSD objects are overlaid on a plurality of layers, the controller 170 can set the sharpness of OSD objects belonging to a layer with a larger depth value to a higher level, and the sharpness of OSD objects belonging to a layer with a smaller depth value to a lower level. Of course, the sharpness may instead be set higher as the depth value decreases. Likewise, the transparency of OSD objects belonging to the layer with the larger depth value and of OSD objects belonging to the layer with the smaller depth value can be set differently according to the depth value.
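The depth-dependent display state described above can be sketched as a pure mapping from a layer's depth value to a shared state, so that OSD objects with the same depth value automatically share the same sharpness and transparency. The linear mapping and the 0–10 depth range are illustrative assumptions, not values from the patent.

```python
def display_state_for_depth(depth, max_depth=10.0):
    """Map a layer's depth value to a shared display state.

    Layers with a larger depth value get higher sharpness; transparency
    grows for layers with a smaller depth value, matching one of the
    policies described in the text (the opposite mapping is equally valid).
    """
    t = max(0.0, min(1.0, depth / max_depth))  # normalise depth to 0..1
    return {
        "sharpness": round(0.2 + 0.8 * t, 3),     # high for large depth
        "transparency": round(0.8 * (1.0 - t), 3)  # high for small depth
    }
```

Because the function depends only on the depth value, any two OSD objects in the same layer necessarily receive identical display-state values.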

FIGS. 11 to 13 are views showing a plurality of OSD objects displayed according to an embodiment of the present invention.

FIG. 11 shows a case where the first OSD object 1110 and the second OSD object 1120 are displayed as separate layers.

In the embodiment described with reference to FIG. 11, a user setting signal for controlling the first OSD object 1110 is input. As the user setting signal is input, the controller 170 may increase the depth value of the first OSD object 1110. In addition, the controller 170 can change the display position, the display state, and the like of the first OSD object 1110 according to the user setting signal.

Referring to FIG. 12, the first OSD object 1110, the second OSD object 1120, and the third OSD object 1130 have different senses of depth. The second OSD object 1120 represents one layer, and the first OSD object 1110 and the third OSD object 1130 are components of the second OSD object 1120.

In the embodiment described with reference to FIG. 12, a user setting signal for controlling the first OSD object 1110 is input. When the user setting signal is input, the controller 170 adjusts the depth value of the first OSD object 1110 accordingly. In addition, the controller 170 may further change the display position, the display state, and the like of the first OSD object 1110 according to the user setting signal.

In addition, although the first OSD object 1110 and the third OSD object 1130 are displayed in the same layer as components included in the second OSD object 1120, because the user setting signal is input only for the first OSD object 1110, the first OSD object 1110 comes to have a depth value different from that of the other OSD objects in the same layer.

FIG. 13 illustrates a case where a first OSD object 1110, a second OSD object 1120, a third OSD object 1130, a fourth OSD object 1140, a fifth OSD object 1150, and a sixth OSD object 1160 are displayed with different senses of depth. The first OSD object 1110 and the third OSD object 1130 are components of the second OSD object 1120, and the fourth OSD object 1140 and the sixth OSD object 1160 are components of the fifth OSD object 1150. That is, the second OSD object 1120 and the fifth OSD object 1150 represent different layers, and the first OSD object 1110, the third OSD object 1130, the fourth OSD object 1140, and the sixth OSD object 1160 are OSD objects that are components included in those different layers.

In the embodiment described with reference to FIG. 13, user setting signals are input for the first OSD object 1110 and the fourth OSD object 1140. Two user setting signals may be input separately for the first OSD object 1110 and the fourth OSD object 1140, or a single user setting signal targeting the plurality of OSD objects (i.e., the first OSD object 1110 and the fourth OSD object 1140) may be input.

Accordingly, the first OSD object 1110 is set to have a depth sense, display position, display state, and the like different from those of the other OSD objects in its layer, and the fourth OSD object 1140 is likewise set to have a depth sense, display position, display state, and the like different from those of the other OSD objects in its layer.

Of course, the user may input a user setting signal for an OSD object that is a component of a layer, such as the first OSD object 1110, the third OSD object 1130, the fourth OSD object 1140, or the sixth OSD object 1160, but may also input a user setting signal for controlling the depth of a layer itself, such as the second OSD object 1120 or the fifth OSD object 1150. When a user setting signal is input for a layer, the components included in it may be controlled together with the layer, or only the layer OSD object may be controlled independently of its components. That is, when a user setting signal for increasing the depth sense is input for the second OSD object 1120, i) the depth sense of the first OSD object 1110 and the third OSD object 1130, which are its components, may be changed together with that of the second OSD object 1120, or ii) only the depth sense of the second OSD object 1120 may be changed while the depth sense of the first OSD object 1110 and the third OSD object 1130 remains unchanged.
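The two layer-control behaviours just described, (i) changing the components together with the layer, or (ii) changing only the layer itself, can be sketched with a single flag. The dict representation of a layer and its components is an assumption for illustration.

```python
def apply_depth_signal(layer, delta, include_components=True):
    """Increase (or decrease) the depth sense of a layer OSD object.

    layer: {'depth': float, 'components': [{'depth': float}, ...]}
    include_components=True  -> case (i): components move with the layer
    include_components=False -> case (ii): only the layer itself moves
    """
    layer["depth"] += delta
    if include_components:
        for comp in layer["components"]:
            comp["depth"] += delta
    return layer
```
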

The image display apparatus and the operating method thereof according to the present invention are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operating method of the image display apparatus of the present invention can be implemented as processor-readable code on a recording medium readable by a processor included in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices in which processor-readable data is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and also include implementation in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over networked computer systems so that processor-readable code can be stored and executed in a distributed manner.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed exemplary embodiments; it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

100: Video display device
110: broadcast signal receiver
120: Network interface unit
130: External device input / output unit
140: remote control device interface unit
150:
170: control unit
180: display unit
185:
200: remote control device
210: broadcasting station
220: Network server

Claims (18)

A method of operating an image display apparatus capable of stereoscopically displaying two or more OSD objects,
Displaying a first OSD object;
Displaying a second OSD object having a depth sense different from that of the first OSD object, wherein at least one of the first OSD object or the second OSD object is composed of a plurality of viewpoint images, and the depth sense varies according to the interval between the plurality of viewpoint images;
Receiving a user setting signal for at least one of the first OSD object and the second OSD object; And
Changing at least one of a depth sense, a display position, or a display state of the first OSD object or the second OSD object according to the user setting signal,
When the first OSD object and the second OSD object are displayed in multiple layers, OSD objects belonging to the same layer or OSD objects having the same depth value have the same color temperature, transparency, brightness or sharpness. A method of operating a video display device.
The method according to claim 1,
Wherein the plurality of viewpoint images include a left eye image and a right eye image,
Wherein the interval between the left eye image and the right eye image is increased or decreased according to the depth control signal when the user setting signal is a depth control signal for controlling the depth sense.
The method according to claim 1,
Wherein the plurality of viewpoint images include a left eye image and a right eye image,
Changing the shape of the left eye image and the right eye image and changing the interval between the left eye image and the right eye image according to the tilt control signal, when the user setting signal is a tilt control signal for controlling the tilt of at least one of the first OSD object or the second OSD object,
A method of operating a video display device.
The method according to claim 1,
Wherein, when the user setting signal is an OSD position control signal for changing a display position of the OSD object, changing a coordinate value at which at least one of the first OSD object or the second OSD object is displayed on the display unit according to the OSD position control signal.
The method according to claim 1,
Wherein, when the user setting signal is an OSD display control signal for changing the display state of the OSD object, changing at least one of the color temperature, transparency, brightness, or sharpness of the first OSD object or the second OSD object according to the OSD display control signal.
The method according to claim 1,
Wherein the transparency of the first OSD object is different from the transparency of the second OSD object, and the transparency is increased or decreased according to the depth sense.
The method according to claim 1,
Wherein the sharpness of the first OSD object is different from that of the second OSD object, and the sharpness is adjusted according to the depth sense.
The method according to claim 1,
Wherein the color temperature of the first OSD object is different from the color temperature of the second OSD object, and the color temperature is adjusted according to the depth.
The method of claim 1, wherein
A user setting signal is input from a remote control device connected to the video display device in a wireless manner, the remote control device detects a motion of the remote control device, and transmits a signal including a command corresponding to the motion to the video display device Wherein the remote controller is a spatial remote controller for transmitting the image data to the display device.
A video display device capable of displaying two or more OSD objects,
A controller for generating a first OSD object and generating a second OSD object having a depth sense different from that of the first OSD object, wherein at least one of the first OSD object or the second OSD object is composed of a plurality of viewpoint images, and the depth sense is changed by the interval between the plurality of viewpoint images;
A display unit for displaying the first OSD object and the second OSD object;
And a user input unit for receiving a user setting signal for at least one of the first OSD object and the second OSD object,
Wherein the controller changes at least one of a depth sense, a display position, or a display state of the first OSD object or the second OSD object according to the user setting signal, and when the first OSD object and the second OSD object are displayed in multiple layers, OSD objects belonging to the same layer or OSD objects having the same depth value have the same color temperature, transparency, brightness, or sharpness.
The image display device according to claim 10,
Wherein the plurality of viewpoint images include a left eye image and a right eye image,
Wherein the controller adjusts the interval between the left eye image and the right eye image according to the depth control signal when the user setting signal is a depth control signal for controlling the depth sense.
The image display device according to claim 10,
Wherein the plurality of viewpoint images include a left eye image and a right eye image,
When the user setting signal is a tilt control signal for controlling the tilt of at least one of the first OSD object or the second OSD object, the controller changes the shape of the left eye image and the right eye image and changes the interval between the left eye image and the right eye image according to the tilt control signal.
The image display device according to claim 10,
Wherein, when the user setting signal is an OSD position control signal for changing the display position of the OSD object, the control unit changes, according to the OSD position control signal, a coordinate value at which at least one of the first OSD object or the second OSD object is displayed on the display unit.
The image display device according to claim 10,
Wherein, when the user setting signal is an OSD display control signal for changing the display state of the OSD object, the controller changes at least one of the color temperature, transparency, brightness, or sharpness of the first OSD object or the second OSD object according to the OSD display control signal.
The image display device according to claim 10,
Wherein the controller sets the transparency of the first OSD object and the transparency of the second OSD object to be different from each other, and increases or decreases the transparency according to the depth sense.
The image display device according to claim 10,
Wherein the control unit sets the sharpness of the first OSD object and the sharpness of the second OSD object to be different from each other, and adjusts the sharpness according to the depth sense.
The image display device according to claim 10,
Wherein the controller sets the color temperature of the first OSD object and the color temperature of the second OSD object differently, and adjusts the color temperature according to the depth.
The image display device according to claim 10, wherein
The user input unit receives a user setting signal from a remote control device connected to the video display device in a wireless manner, and the remote control device detects a motion of the remote control device and outputs a signal including a command corresponding to the motion Wherein the image display device is a spatial remote controller for transmitting the image data to the image display device.
KR1020100026932A 2010-03-22 2010-03-25 Image Display Device and Operating Method for the Same KR101626310B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020100026932A KR101626310B1 (en) 2010-03-25 2010-03-25 Image Display Device and Operating Method for the Same
US12/959,706 US20110227911A1 (en) 2010-03-22 2010-12-03 Image display device and method for operating the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100026932A KR101626310B1 (en) 2010-03-25 2010-03-25 Image Display Device and Operating Method for the Same

Publications (2)

Publication Number Publication Date
KR20110107667A KR20110107667A (en) 2011-10-04
KR101626310B1 true KR101626310B1 (en) 2016-06-13

Family

ID=45025746

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100026932A KR101626310B1 (en) 2010-03-22 2010-03-25 Image Display Device and Operating Method for the Same

Country Status (1)

Country Link
KR (1) KR101626310B1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140061098A (en) * 2012-11-13 2014-05-21 엘지전자 주식회사 Image display apparatus and method for operating the same
CA2917221A1 (en) 2013-07-02 2015-01-08 Lg Electronics Inc. Method and apparatus for processing 3-dimensional image including additional object in system providing multi-view image

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100649523B1 (en) * 2005-06-30 2006-11-27 삼성에스디아이 주식회사 Stereoscopic image display device
JP2009135686A (en) * 2007-11-29 2009-06-18 Mitsubishi Electric Corp Stereoscopic video recording method, stereoscopic video recording medium, stereoscopic video reproducing method, stereoscopic video recording apparatus, and stereoscopic video reproducing apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100667514B1 (en) * 2005-02-21 2007-01-10 엘지전자 주식회사 Optical disc device and method for displaying dual sub-picture thereof

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100649523B1 (en) * 2005-06-30 2006-11-27 삼성에스디아이 주식회사 Stereoscopic image display device
JP2009135686A (en) * 2007-11-29 2009-06-18 Mitsubishi Electric Corp Stereoscopic video recording method, stereoscopic video recording medium, stereoscopic video reproducing method, stereoscopic video recording apparatus, and stereoscopic video reproducing apparatus

Also Published As

Publication number Publication date
KR20110107667A (en) 2011-10-04

Similar Documents

Publication Publication Date Title
KR101685343B1 (en) Image Display Device and Operating Method for the Same
US8896672B2 (en) Image display device capable of three-dimensionally displaying an item or user interface and a method for operating the same
US9335552B2 (en) Image display apparatus and method for operating the same
EP2337363B1 (en) Image display apparatus and method for displaying a pointer on a stereoscopic display
US20110227911A1 (en) Image display device and method for operating the same
KR101699738B1 (en) Operating Method for Image Display Device and Shutter Glass for the Image Display Device
KR101702949B1 (en) Method for operating an apparatus for displaying image
KR101661956B1 (en) Image Display Device and Operating Method for the Same
KR101626310B1 (en) Image Display Device and Operating Method for the Same
KR101655804B1 (en) Image Display Device with 3D-Thumbnail and Operation Controlling Method for the Same
EP2574068A2 (en) Image display apparatus and method for operating the same
KR101699740B1 (en) Image Display Device of 2D/3D convertible display and Operating Method for the same
KR101735608B1 (en) Image Display Device and Operating Method for the Same
KR101698787B1 (en) Image Display Device and Operating Method for the Same
KR101716171B1 (en) Image Display Device and Operating Method for the Same
KR101638541B1 (en) Image Display Device and Operating Method for the Same
KR101668245B1 (en) Image Display Device Controllable by Remote Controller and Operation Controlling Method for the Same
KR101657560B1 (en) Image Display Device and Operating Method for the Same
KR101638536B1 (en) Image Display Device and Controlling Method for the Same
KR101702968B1 (en) Operating an Image Display Device
KR20110128535A (en) Image display device and operating method for the same
KR101691801B1 (en) Multi vision system
KR101626304B1 (en) Image Display Device and Controlling Method for the Same
KR20110088952A (en) Image display device with a 3d object including a thumbmail image and operation controlling method for the same
KR20110106705A (en) Image display device and operating method for the same

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190424

Year of fee payment: 4