KR20140115786A - Image display apparatus - Google Patents

Image display apparatus

Info

Publication number
KR20140115786A
Authority
KR
South Korea
Prior art keywords
signal
image
setting
unit
input
Prior art date
Application number
KR1020130030885A
Other languages
Korean (ko)
Inventor
양덕용
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사
Priority to KR1020130030885A
Publication of KR20140115786A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/398 Synchronisation thereof; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An image display apparatus according to an embodiment of the present invention includes a display for displaying a 3D setting screen for a plurality of applications, an interface for receiving a user's 3D setting input regarding at least one of the plurality of applications, and a control unit for setting and storing a setting corresponding to the 3D setting input for each application, or for setting and storing a setting corresponding to an activity of a predetermined application. Accordingly, when 3D content is used, 3D-related settings can be made simply and conveniently, and usability for the user is improved.

Description

[0001] The present invention relates to an image display apparatus and an operation method thereof.

BACKGROUND OF THE INVENTION
1. Field of the Invention
[0002] The present invention relates to an image display apparatus and an operation method thereof, and more particularly, to an image display apparatus and an operation method thereof capable of improving usability for a user who uses 3D content.

A video display device is a device having a function of displaying an image that a user can view. The user can view the broadcast through the video display device. A video display device displays a broadcast selected by a user among broadcast signals transmitted from a broadcast station on a display. Currently, broadcasting is changing from analog broadcasting to digital broadcasting around the world.

Digital broadcasting refers to broadcasting in which digital video and audio signals are transmitted. Compared to analog broadcasting, digital broadcasting is strong against external noise and has a small data loss, is advantageous for error correction, has a high resolution, and provides a clear screen. Also, unlike analog broadcasting, digital broadcasting is capable of bidirectional service.

It is an object of the present invention to provide an image display apparatus and an operation method thereof that can improve the usability of the user.

It is another object of the present invention to provide an image display apparatus and an operation method thereof that can perform 3D related settings simply and conveniently when 3D contents are used.

According to an aspect of the present invention, there is provided an image display apparatus including a display for displaying a 3D setting screen for a plurality of applications, an interface for receiving a user's 3D setting input regarding at least one of the plurality of applications, and a controller for setting and storing a setting corresponding to the 3D setting input for each application, or for setting and storing a setting corresponding to an activity of a predetermined application.

According to the present invention, in the case of using the 3D contents, the 3D related setting can be performed simply and conveniently, and the usability of the user is improved.

FIG. 1 is an internal block diagram of an image display apparatus according to an embodiment of the present invention.
FIGS. 2A and 2B are internal block diagrams of a set-top box and a display apparatus according to an embodiment of the present invention.
FIG. 3 is an internal block diagram of the control unit of FIG. 1.
FIG. 4 is a diagram showing various formats of a 3D image.
FIG. 5 is a diagram illustrating the operation of a 3D viewing apparatus according to the formats of FIG. 4.
FIG. 6 is a diagram illustrating various scaling methods of a 3D image signal according to an embodiment of the present invention.
FIG. 7 is a view for explaining how images are formed by a left eye image and a right eye image.
FIG. 8 is a view for explaining the depth of a 3D image according to the interval between a left eye image and a right eye image.
FIGS. 9 to 11 are views referred to in explaining the image display apparatus according to an embodiment of the present invention.

Hereinafter, the present invention will be described in detail with reference to the drawings.

The suffix "module" and " part "for components used in the following description are given merely for convenience of description, and do not give special significance or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 is an internal block diagram of an image display apparatus according to an embodiment of the present invention.

Referring to FIG. 1, an image display apparatus 100 according to an exemplary embodiment of the present invention may include a tuner unit 110, a demodulation unit 120, an external device interface unit 130, a network interface unit 135, a storage unit 140, a user input interface unit 150, a sensor unit (not shown), a controller 170, a display 180, an audio output unit 185, and a 3D viewing device 195.

The tuner unit 110 selects an RF broadcast signal corresponding to a channel selected by the user or all pre-stored channels among RF (Radio Frequency) broadcast signals received through the antenna. Also, the selected RF broadcast signal is converted into an intermediate frequency signal, a baseband image, or a voice signal.

For example, if the selected RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF). If the selected RF broadcast signal is an analog broadcast signal, it is converted into an analog baseband image or voice signal (CVBS / SIF). That is, the tuner unit 110 can process a digital broadcast signal or an analog broadcast signal. The analog baseband video or audio signal (CVBS / SIF) output from the tuner unit 110 can be directly input to the controller 170.

The tuner unit 110 may receive an RF broadcast signal of a single carrier according to an Advanced Television System Committee (ATSC) scheme or an RF broadcast signal of a plurality of carriers according to a DVB (Digital Video Broadcasting) scheme.

Meanwhile, the tuner unit 110 may sequentially select the RF broadcast signals of all the broadcast channels stored through the channel memory function among the RF broadcast signals received through the antenna, and convert them into an intermediate frequency signal or a baseband video or audio signal.

On the other hand, the tuner unit 110 can include a plurality of tuners in order to receive broadcast signals of a plurality of channels. Alternatively, a single tuner that simultaneously receives broadcast signals of a plurality of channels is also possible.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

For example, when the digital IF signal output from the tuner unit 110 is of the ATSC scheme, the demodulation unit 120 performs 8-VSB (8-Vestigial Side Band) demodulation. Also, the demodulation unit 120 may perform channel decoding. To this end, the demodulator 120 may include a trellis decoder, a de-interleaver, and a Reed-Solomon decoder to perform trellis decoding, de-interleaving, and Reed-Solomon decoding.

For example, when the digital IF signal output from the tuner unit 110 is of the DVB scheme, the demodulator 120 performs COFDM (Coded Orthogonal Frequency Division Multiplexing) demodulation. Also, the demodulation unit 120 may perform channel decoding. To this end, the demodulator 120 may include a convolution decoder, a de-interleaver, and a Reed-Solomon decoder to perform convolutional decoding, de-interleaving, and Reed-Solomon decoding.

The demodulation unit 120 may perform demodulation and channel decoding, and then output a stream signal TS. At this time, the stream signal may be a signal in which a video signal, a voice signal, or a data signal is multiplexed. For example, the stream signal may be an MPEG-2 TS (Transport Stream) multiplexed with an MPEG-2 standard video signal, a Dolby AC-3 standard audio signal, or the like. Specifically, the MPEG-2 TS may include a header of 4 bytes and a payload of 184 bytes.
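For reference, the 188-byte MPEG-2 TS packet mentioned here (a 4-byte header plus a 184-byte payload) can be illustrated with a small parsing sketch. The field layout follows the MPEG-2 Systems specification; the class and method names are only illustrative and are not part of the described apparatus.

// Minimal sketch of parsing the 4-byte header of a 188-byte MPEG-2 TS packet.
// Field layout per the MPEG-2 Systems standard; names are illustrative only.
public final class TsPacketHeader {
    public static final int PACKET_SIZE = 188;   // 4-byte header + 184-byte payload
    public static final int SYNC_BYTE = 0x47;

    public final boolean transportError;     // transport_error_indicator
    public final boolean payloadUnitStart;   // payload_unit_start_indicator
    public final int pid;                    // 13-bit packet identifier
    public final int continuityCounter;      // 4-bit continuity counter

    private TsPacketHeader(boolean err, boolean pusi, int pid, int cc) {
        this.transportError = err;
        this.payloadUnitStart = pusi;
        this.pid = pid;
        this.continuityCounter = cc;
    }

    public static TsPacketHeader parse(byte[] packet) {
        if (packet.length != PACKET_SIZE || (packet[0] & 0xFF) != SYNC_BYTE) {
            throw new IllegalArgumentException("not a valid 188-byte TS packet");
        }
        int b1 = packet[1] & 0xFF;
        int b2 = packet[2] & 0xFF;
        int b3 = packet[3] & 0xFF;
        return new TsPacketHeader(
                (b1 & 0x80) != 0,            // transport_error_indicator
                (b1 & 0x40) != 0,            // payload_unit_start_indicator
                ((b1 & 0x1F) << 8) | b2,     // 13-bit PID
                b3 & 0x0F);                  // continuity_counter
    }
}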

Meanwhile, the demodulating unit 120 may be separately provided according to the ATSC scheme and the DVB scheme. That is, it can be provided as an ATSC demodulation unit and a DVB demodulation unit.

The stream signal output from the demodulation unit 120 may be input to the controller 170. The control unit 170 performs demultiplexing, video / audio signal processing, and the like, and then outputs an image to the display 180 and outputs audio to the audio output unit 185.

The external device interface unit 130 can transmit or receive data with the connected external device 190. To this end, the external device interface unit 130 may include an A/V input/output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 130 can be connected to an external device 190 such as a digital versatile disc (DVD) player, a Blu-ray player, a game device, a camera, a camcorder, or a computer (notebook computer) by wire or wirelessly. The external device interface unit 130 transmits video, audio, or data signals input from the connected external device 190 to the controller 170 of the video display device 100. Also, the control unit 170 can output the processed video, audio, or data signal to the connected external device.

The A/V input/output unit may include a USB terminal, a CVBS (Composite Video Blanking and Sync) terminal, a component terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like.

The wireless communication unit can perform short-range wireless communication with other electronic devices. The image display apparatus 100 can be networked with other electronic devices according to communication standards such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and DLNA (Digital Living Network Alliance).

Also, the external device interface unit 130 may be connected to various set-top boxes via at least one of the various terminals described above to perform input / output operations with the set-top box.

On the other hand, the external device interface unit 130 can transmit and receive data to and from the 3D viewing device 195.

The network interface unit 135 provides an interface for connecting the video display device 100 to a wired/wireless network including the Internet. For connection with a wired network, the network interface unit 135 may include an Ethernet terminal and the like, and for connection with a wireless network, it may use communication standards such as WLAN (Wireless LAN, Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access).

The network interface unit 135 can receive, via the network, content or data provided by the Internet, a content provider, or a network operator. That is, it can receive content such as movies, advertisements, games, VOD, and broadcast signals, and related information, provided from the Internet, a content provider, and the like through the network. It can also receive firmware update information and update files provided by the network operator, and it may transmit data to the Internet, a content provider, or a network operator.

For example, the network interface unit 135 may be connected to an IPTV (Internet Protocol TV) set-top box to enable bidirectional communication; it may receive the video, audio, or data signals processed by the IPTV set-top box and transmit them to the controller 170, and may transmit the signals processed by the controller 170 to the IPTV set-top box.

Meanwhile, the IPTV may include ADSL-TV, VDSL-TV, FTTH-TV, and the like depending on the type of transmission network, and may include TV over DSL, Video over DSL, Broadband TV (BTV), and the like. In addition, IPTV may also mean an Internet TV capable of accessing the Internet, or a full browsing TV.

The storage unit 140 may store a program for each signal processing and control in the control unit 170 or may store the processed video, audio, or data signals.

In addition, the storage unit 140 may perform a function for temporarily storing video, audio, or data signals input to the external device interface unit 130. In addition, the storage unit 140 may store information on a predetermined broadcast channel through a channel memory function such as a channel map.

The storage unit 140 may include at least one type of storage medium such as a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, or ROM (EEPROM, etc.). The image display apparatus 100 can reproduce files (moving picture files, still picture files, music files, document files, etc.) stored in the storage unit 140 and provide them to the user.

FIG. 1 shows an embodiment in which the storage unit 140 is provided separately from the control unit 170, but the scope of the present invention is not limited thereto. The storage unit 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the control unit 170 or a signal from the control unit 170 to the user.

For example, the user input interface unit 150 may receive user input signals such as power on/off, channel selection, and screen setting from the remote control device 200 according to various communication methods such as a radio frequency (RF) communication method and an infrared (IR) communication method, or may transmit a signal from the control unit 170 to the remote control device 200.

For example, the user input interface unit 150 may transmit a user input signal input from a local key (not shown) such as a power key, a channel key, a volume key, and a set value to the controller 170.

The sensor unit (not shown) may sense the position of the user or the gesture of the user, the touch or the position of the 3D viewing device 195. To this end, the sensor unit (not shown) may include a touch sensor, an audio sensor, a position sensor, an operation sensor, a gyro sensor, and the like.

The sensed position of the user, the gesture of the user, the touch, or the position signal of the 3D viewing device 195 may be input to the controller 170. Alternatively, unlike the drawing, these signals may be input to the control unit 170 through the user input interface unit 150.

The control unit 170 may demultiplex the stream input through the tuner unit 110, the demodulation unit 120, or the external device interface unit 130, or process the demultiplexed signals, to generate and output signals for video or audio output.

The video signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the video signal. Also, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

The audio signal processed by the control unit 170 may be output to the audio output unit 185 as an audio signal. The audio signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

Although not shown in FIG. 1, the controller 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 3.

In addition, the control unit 170 can control the overall operation of the video display device 100. For example, the control unit 170 may control the tuner unit 110 to tune to the RF broadcast corresponding to a channel selected by the user or a previously stored channel.

In addition, the controller 170 may control the image display apparatus 100 according to a user command or an internal program input through the user input interface unit 150.

For example, the control unit 170 controls the tuner unit 110 to receive the signal of a channel selected according to a predetermined channel selection command received through the user input interface unit 150, and processes the video, audio, or data signal of the selected channel. The control unit 170 may output the processed video or audio signal, together with information on the channel selected by the user, through the display 180 or the audio output unit 185.

As another example, the control unit 170 can control, according to an external device video reproducing command received through the user input interface unit 150, a video or audio signal from an external device 190, for example a camera or a camcorder, input through the external device interface unit 130 to be output through the display 180 or the audio output unit 185.

Meanwhile, the control unit 170 may control the display 180 to display an image. For example, it can control a broadcast image input through the tuner unit 110, an external input image input through the external device interface unit 130, an image input through the network interface unit 135, or an image stored in the storage unit 140 to be displayed on the display 180.

At this time, the image displayed on the display 180 may be a still image or a moving image, and may be a 2D image or a 3D image.

On the other hand, the controller 170 generates a 3D object for a predetermined object among the images displayed on the display 180, and displays the 3D object. For example, the object may be at least one of a connected web screen (newspaper, magazine, etc.), EPG (Electronic Program Guide), various menus, widgets, icons, still images, moving images, and text.

Such a 3D object may be processed to have a depth different from that of the image displayed on the display 180. Preferably, the 3D object may be processed to appear protruded relative to the image displayed on the display 180.

On the other hand, the control unit 170 recognizes the position of the user based on the image photographed from the photographing unit (not shown). For example, the distance (z-axis coordinate) between the user and the image display apparatus 100 can be grasped. In addition, the x-axis coordinate and the y-axis coordinate in the display 180 corresponding to the user position can be grasped.

On the other hand, the control unit 170 can perform signal processing so that the corresponding video can be viewed according to the viewing apparatus.

For example, when the sensor unit (not shown) or the photographing unit (not shown) detects the presence, operation, or number of viewing apparatuses 195, the controller 170 can perform pairing with the viewing apparatus 195. That is, it can control a pairing signal to be output to the viewing device 195 and a response signal to be received from the viewing device 195.

The control unit 170 may control the tuner unit 110 to receive broadcast images according to the number of the viewing devices 195 that are sensed. For example, when the number of the viewing apparatuses to be sensed is three, it is possible to control the tuner unit 110 having a plurality of tuners to receive broadcast images of different channels. Then, the control unit 170 can control each of the broadcast images to be displayed at a different time in synchronization with each viewing apparatus.

On the other hand, the control unit 170 can receive the input external input image according to the number of the viewing apparatuses to be sensed. For example, when the number of the viewing apparatuses to be sensed is three, it is possible to control to receive an external input image from an optical apparatus such as a broadcast image, a DVD, or the like, and an external input image from the PC, respectively. Then, the control unit 170 can control to display each video (broadcast video, DVD video, PC video) at different times in synchronization with each viewing device.

On the other hand, the controller 170 can control the vertical synchronization frequency (Vsync) of the displayed image to be increased each time the number of detected viewing devices increases during image display. For example, if the first image and the second image are each displayed for 1/60 second in synchronization with the first viewing device and the second viewing device, respectively, and a third viewing device is then used, the controller 170 can control the first to third images to be displayed in synchronization with the first to third viewing apparatuses, respectively. That is, the first and second images may be displayed with the vertical synchronization frequency set to 120 Hz, and the first to third images may be displayed with the vertical synchronization frequency set to 180 Hz.
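As a worked illustration of this arithmetic, the sketch below derives the required panel rate from the number of detected viewing devices and assigns output frames to devices in round-robin order; the 60 Hz per-device rate comes from the example above, and the class and method names are hypothetical.

// Sketch of the Vsync scaling described above: each additional viewing device
// adds another 60 Hz time slice, so N devices need a 60 * N Hz panel rate.
public final class MultiViewVsync {
    private static final int BASE_RATE_HZ = 60;  // per-device rate from the example

    public static int requiredVsyncHz(int viewingDeviceCount) {
        return BASE_RATE_HZ * viewingDeviceCount;
    }

    // Which device's image occupies a given output frame when frames are
    // interleaved device by device (device 0, device 1, ..., device N-1, repeat).
    public static int deviceForFrame(long frameIndex, int viewingDeviceCount) {
        return (int) (frameIndex % viewingDeviceCount);
    }

    public static void main(String[] args) {
        System.out.println(requiredVsyncHz(2));  // 120 Hz for two viewing devices
        System.out.println(requiredVsyncHz(3));  // 180 Hz for three viewing devices
    }
}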

On the other hand, the control unit 170 can set a viewable image search range, for example, a channel search range for broadcast images, for each viewing device. For example, the channel search range may be set differently by age, such as for an adult and a child, so that the search results differ when a channel search is performed. The search range may also be provided separately by taste, gender, recently viewed channel, or program rating.

On the other hand, when the first viewing apparatus and the second viewing apparatus select the same image, the control unit 170 can control to notify a message indicating duplication. This message may be displayed in object form on the display 180, or may be transmitted as a wireless signal to each viewing device.

Although not shown in the drawing, a channel browsing processing unit for generating a thumbnail image corresponding to a channel signal or an external input signal may be further provided. The channel browsing processing unit receives the stream signal TS output from the demodulation unit 120 or the stream signal output from the external device interface unit 130, extracts an image from the input stream signal, and generates a thumbnail image. The generated thumbnail image may be input to the controller 170 as it is or after being encoded. In addition, the generated thumbnail image may be encoded in a stream format and input to the controller 170. The control unit 170 may display a thumbnail list having a plurality of thumbnail images on the display 180 using the input thumbnail images. At this time, the thumbnail list may be displayed in a simple view mode, shown in a partial area while a predetermined image is displayed on the display 180, or in a full view mode, shown over most of the area of the display 180. The thumbnail images in the thumbnail list can be sequentially updated.

The display 180 converts the video signal, the data signal, and the OSD signal processed by the control unit 170, or the video signal, the data signal, and the control signal received from the external device interface unit 130, to generate drive signals.

The display 180 may be a PDP, an LCD, an OLED, a flexible display, or the like, and may also be capable of a 3D display. In order to view the three-dimensional image, the display 180 may be divided into an additional display method and a single display method.

The single display method can implement a 3D image on the display 180 alone, without a separate additional display such as glasses, and various methods such as a lenticular method and a parallax barrier method can be applied.

In addition, the additional display method can implement a 3D image by using an additional display, that is, the 3D viewing device 195, in addition to the display 180. For example, various methods such as a head mounted display (HMD) type and a glasses type can be applied.

On the other hand, the glasses type can be further divided into a passive type such as a polarizing glasses type and an active type such as a shutter glass type. Also, the head mount display type can be divided into a passive type and an active type.

On the other hand, the 3D viewing device 195 may be 3D glasses for stereoscopic viewing. The 3D glasses 195 may include passive polarizing glasses or active shutter glasses, and are described herein as a concept that also includes the head mounted type described above.

Meanwhile, the display 180 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 185 receives a signal processed by the control unit 170, for example, a stereo signal, a 3.1 channel signal, or a 5.1 channel signal, and outputs it as sound. The audio output unit 185 may be implemented by various types of speakers.

A photographing unit (not shown) photographs the user. The photographing unit (not shown) may be implemented by a single camera, but the present invention is not limited thereto, and may be implemented by a plurality of cameras. On the other hand, the photographing unit (not shown) may be disposed above the display 180. The image information photographed by the photographing unit (not shown) is input to the control unit 170.

The control unit 170 can detect a gesture of the user based on the images captured by the photographing unit (not shown) and the signals detected by the sensor unit (not shown), individually or in combination.

The remote control apparatus 200 transmits user input to the user input interface unit 150. To this end, the remote control apparatus 200 can use Bluetooth, RF (radio frequency) communication, infrared (IR) communication, UWB (Ultra Wideband), ZigBee, or the like. Also, the remote control apparatus 200 can receive the video, audio, or data signal output from the user input interface unit 150 and display it or output it as sound.

The video display apparatus 100 described above may be a fixed digital broadcasting receiver capable of receiving at least one of ATSC (8-VSB) digital broadcasting, DVB-T (COFDM) digital broadcasting, and ISDB-T (BST-OFDM) digital broadcasting. In addition, it may be a portable digital broadcasting receiver capable of receiving at least one of terrestrial DMB broadcasting, satellite DMB broadcasting, ATSC-M/H broadcasting, DVB-H (COFDM) broadcasting, and MediaFLO (Media Forward Link Only) broadcasting. It may also be a digital broadcast receiver for cable, satellite communication, or IPTV.

Meanwhile, the video display device described in the present specification may include a TV receiver, a projector, a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), and the like.

Meanwhile, the block diagram of the image display apparatus 100 shown in FIG. 1 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 actually implemented. That is, two or more constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.

Unlike FIG. 1, the video display apparatus 100 may not include the tuner unit 110 and the demodulation unit 120 shown in FIG. 1, and may instead receive and play back video content through the network interface unit 135 or the external device interface unit 130.

On the other hand, the image display apparatus 100 is an example of an image signal processing apparatus that performs signal processing on an image stored in the apparatus or on an input image. Other examples of the image signal processing apparatus include a set-top box excluding the display 180 and the audio output unit 185 shown in FIG. 1, a DVD player, a Blu-ray player, a game machine, a computer, and the like. Among these, the set-top box will be described below with reference to FIGS. 2A and 2B.

FIGS. 2A and 2B are internal block diagrams of a set-top box and a display apparatus according to an embodiment of the present invention.

Referring to FIG. 2A, the set-top box 250 and the display device 300 can transmit or receive data by wire or wirelessly. Hereinafter, the difference from FIG. 1 will be mainly described.

The set top box 250 may include a network interface unit 255, a storage unit 258, a signal processing unit 260, a user input interface unit 263, and an external device interface unit 265.

The network interface unit 255 provides an interface for connection to a wired / wireless network including the Internet network. It can also transmit or receive data to other users or other electronic devices via the connected network or other network linked to the connected network.

The storage unit 258 may store programs for signal processing and control in the signal processing unit 260, and may also perform a function of temporarily storing input video, audio, or data signals.

The signal processing unit 260 performs signal processing of an input signal. For example, it is possible to demultiplex or decode an input video signal, and perform demultiplexing or decoding of an input audio signal. To this end, a video decoder or a voice decoder may be provided. The processed video signal or audio signal can be transmitted to the display device 300 through the external device interface unit 265.

The user input interface unit 263 transfers a signal input by the user to the signal processing unit 260 or a signal from the signal processing unit 260 to the user. For example, various control signals, such as power on / off, operation input, and setting input, input through a local key (not shown) or the remote control device 200, may be received and transmitted to the signal processing unit 260.

The external device interface unit 265 provides an interface for transmitting or receiving data with an external device connected by wire or wirelessly. In particular, it provides an interface for transmitting or receiving data with the display device 300. It is also possible to provide an interface for data transmission or reception with an external device such as a game device, a camera, a camcorder, a computer (notebook computer) or the like.

The set-top box 250 may further include a media input unit (not shown) for playing a separate media. An example of such a media input unit is a Blu-ray input unit (not shown) or the like. That is, the set-top box 250 can include a Blu-ray player or the like. The input media such as a Blu-ray disc can be transmitted to the display device 300 via the external device interface 265 for display after signal processing such as demultiplexing or decoding in the signal processor 260 .

The display device 300 may include a tuner unit 270, an external device interface unit 273, a demodulation unit 275, a storage unit 278, a control unit 280, a user input interface unit 283, a display 290, and an audio output unit 295.

The tuner unit 270, the demodulation unit 275, the storage unit 278, the control unit 280, the user input interface unit 283, the display 290, and the audio output unit 295 correspond to the tuner unit 110, the demodulation unit 120, the storage unit 140, the control unit 170, the user input interface unit 150, the display 180, and the audio output unit 185 described with reference to FIG. 1, respectively, and thus a description thereof is omitted.

On the other hand, the external device interface unit 273 provides an interface for data transmission or reception with an external device connected by wire or wirelessly. In particular, it provides an interface for transmitting or receiving data with the set-top box 250.

Accordingly, the video or audio signal input through the set-top box 250 is output through the display 290 or the audio output unit 295 via the control unit 280.

Referring to FIG. 2B, the set-top box 250 and the display apparatus 300 are the same as the set-top box 250 and the display apparatus 300 of FIG. 2A, except that the tuner unit 270 and the demodulation unit 275 are located in the set-top box 250 rather than in the display device 300. Only the differences will be described below.

The signal processing unit 260 may perform signal processing of a broadcast signal received through the tuner unit 270 and the demodulation unit 275. Also, the user input interface unit 263 can receive inputs such as channel selection, channel storage, and the like.

Although the set-top box 250 shown in FIGS. 2A and 2B does not include an audio output unit corresponding to the audio output unit 185 of FIG. 1, it may also have a separate audio output unit.

FIG. 3 is an internal block diagram of the control unit of FIG. 1, FIG. 4 is a diagram illustrating various formats of a 3D image, and FIG. 5 is a diagram illustrating operations of a 3D viewing apparatus according to the format of FIG.

Referring to FIG. 3, the controller 170 according to an exemplary embodiment of the present invention may include a demultiplexer 310, an image processing unit 320, an OSD generation unit 340, a mixer 345, a frame rate converter 350, and a formatter 360. It may also include an audio processing unit (not shown) and a data processing unit (not shown).

The demultiplexer 310 demultiplexes the input stream. For example, when an MPEG-2 TS is input, it can be demultiplexed into video, audio, and data signals, respectively. The stream signal input to the demultiplexer 310 may be a stream signal output from the tuner 110 or the demodulator 120 or the external device interface 130.

The image processing unit 320 may perform image processing of the demultiplexed image signal. To this end, the image processing unit 320 may include a video decoder 225 and a scaler 235.

The video decoder 225 decodes the demultiplexed video signal and the scaler 235 performs scaling so that the resolution of the decoded video signal can be output from the display 180.

The video decoder 225 may include decoders of various standards. For example, the video decoder 225 may include at least one of an MPEG-2 decoder, an H.264 decoder, an MPEG-C decoder (MPEG-C part 3), an MVC decoder, and an FTV decoder.

On the other hand, the image signal decoded by the image processing unit 320 can be divided into a case where there is only a 2D image signal, a case where a 2D image signal and a 3D image signal are mixed, and a case where there is only a 3D image signal.

For example, an external video signal input from the external device 190 or a broadcast video signal received from the tuner unit 110 may include only a 2D video signal, may include both a 2D video signal and a 3D video signal, or may include only a 3D video signal. Accordingly, the controller 170, in particular the image processing unit 320, can process the signal so that a 2D video signal, a mixed signal of a 2D video signal and a 3D video signal, or a 3D video signal is output.

Meanwhile, the image signal decoded by the image processing unit 320 may be a 3D image signal in various formats. For example, it may be a 3D image signal composed of a color image and a depth image, or a 3D image signal composed of a plurality of viewpoint image signals. The plurality of viewpoint image signals may include, for example, a left eye image signal and a right eye image signal.

As shown in FIG. 4, the formats of the 3D video signal include a side-by-side format (FIG. 4A) in which the left eye image signal L and the right eye image signal R are arranged left and right, a top-down format (FIG. 4B) in which they are arranged up and down, a frame sequential format (FIG. 4C) in which they are arranged in a time division manner, an interlaced format (FIG. 4D) in which the left eye image signal and the right eye image signal are mixed line by line, a checker box format (FIG. 4E) in which the left eye image signal and the right eye image signal are mixed box by box, and the like.
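As an illustration of how a packed 3D frame relates to its left-eye and right-eye images, the sketch below unpacks a side-by-side frame into its two halves; the pixel-array representation and the names are assumptions made for the example, and a real formatter would additionally scale each half back to full width.

// Sketch: unpacking a side-by-side 3D frame into left-eye and right-eye images.
// Each eye occupies one horizontal half of the packed frame, i.e. half the
// horizontal resolution of the original picture.
public final class SideBySideUnpacker {

    /** Returns { leftEye, rightEye }, each (width / 2) x height, from one packed frame. */
    public static int[][][] split(int[][] packed, int width, int height) {
        int half = width / 2;
        int[][] left = new int[height][half];
        int[][] right = new int[height][half];
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < half; x++) {
                left[y][x] = packed[y][x];          // left half -> left-eye image
                right[y][x] = packed[y][x + half];  // right half -> right-eye image
            }
        }
        return new int[][][] { left, right };
    }
}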

The OSD generation unit 340 generates an OSD signal according to a user input or by itself. For example, based on a user input signal, a signal for displaying various information in a graphic or text form on the screen of the display 180 can be generated. The generated OSD signal may include various data such as a user interface screen of the video display device 100, various menu screens, a widget, and an icon. In addition, the generated OSD signal may include a 2D object or a 3D object.

The mixer 345 may mix the OSD signal generated by the OSD generator 340 and the decoded video signal processed by the image processor 320. At this time, the OSD signal and the decoded video signal may include at least one of a 2D signal and a 3D signal. The mixed video signal is supplied to a frame rate converter 350.

A frame rate converter (FRC) 350 converts the frame rate of an input image. For example, a frame rate of 60 Hz is converted to 120 Hz, 240 Hz, or 480 Hz. When converting the frame rate of 60 Hz to 120 Hz, it is possible to insert the same first frame between the first frame and the second frame, or insert the third frame predicted from the first frame and the second frame. When converting a frame rate of 60 Hz to 240 Hz, it is possible to insert three more identical frames or insert three predicted frames. In the case of converting the frame rate of 60 Hz to 480 Hz, it is possible to insert seven more frames of the same frame or to insert seven frames of the predicted frame.
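A minimal sketch of the frame-repetition variant of this up-conversion follows; motion-predicted frame insertion is omitted, and the list-of-frames representation and the names are assumptions made for the example.

import java.util.ArrayList;
import java.util.List;

// Sketch of frame rate up-conversion by frame repetition, as described above:
// 60 Hz -> 120 Hz repeats each frame once more, 60 Hz -> 240 Hz three more
// times, and so on. Predicted (motion-compensated) frame insertion is not shown.
public final class FrameRepeater {

    public static <F> List<F> repeatFrames(List<F> input, int inputHz, int outputHz) {
        if (outputHz % inputHz != 0) {
            throw new IllegalArgumentException("output rate must be a multiple of the input rate");
        }
        int copiesPerFrame = outputHz / inputHz;   // e.g. 2 for 120 Hz, 4 for 240 Hz, 8 for 480 Hz
        List<F> output = new ArrayList<>(input.size() * copiesPerFrame);
        for (F frame : input) {
            for (int i = 0; i < copiesPerFrame; i++) {
                output.add(frame);
            }
        }
        return output;
    }
}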

On the other hand, the frame rate converter 350 can output the frame rate as it is without any additional frame rate conversion. Preferably, when a 2D video signal is input, the frame rate can be output as it is. On the other hand, when a 3D video signal is input, the frame rate can be varied as described above.

The formatter 360 may arrange the left eye image frame and the right eye image frame of the frame rate-converted 3D image, and may output a synchronization signal Vsync for opening the left eye glass or the right eye glass of the 3D viewing apparatus 195.

The formatter 360 receives the mixed signal, i.e., the OSD signal and the decoded video signal, from the mixer 345, and separates the 2D video signal and the 3D video signal.

In the present specification, a 3D video signal means a video signal for implementing a 3D object. Examples of the 3D object include a picture-in-picture (PIP) image (still image or moving picture), an EPG indicating broadcast program information, various menus, widgets, icons, text, objects in an image, people, backgrounds, and web screens (newspapers, magazines, etc.).

On the other hand, the formatter 360 can change the format of the 3D video signal. For example, it can change the format to any one of the various formats exemplified in FIG. 4. Accordingly, the glasses-type 3D viewing device can operate according to the format, as shown in FIG. 5.

FIG. 5A illustrates the operation of the 3D glasses 195, particularly the shutter glasses 195, when the formatter 360 arranges and outputs frames in the frame sequential format among the formats shown in FIG. 4.

That is, when the left eye image L is displayed on the display 180, the left eye glass of the shutter glasses 195 is opened and the right eye glass is closed. When the right eye image R is displayed, the left eye glass is closed and the right eye glass is opened.

On the other hand, FIG. 5B illustrates the operation of the 3D glasses 195, particularly the polarizing glasses 195, when the formatter 360 arranges and outputs frames in the side-by-side format among the formats shown in FIG. 4. The 3D glasses 195 applied in FIG. 5B may also be shutter glasses; in this case, the shutter glasses may operate like polarizing glasses by keeping both the left eye glass and the right eye glass open.

Meanwhile, the formatter 360 may convert a 2D video signal into a 3D video signal. For example, according to a 3D image generation algorithm, an edge or a selectable object may be detected in the 2D image signal, and the object according to the detected edge or the selectable object may be separated and generated as a 3D image signal. At this time, the generated 3D image signal can be separated into a left eye image signal L and a right eye image signal R, as described above.

Although not shown in the drawing, it is also possible that a 3D processor (not shown) for 3-dimensional effect signal processing is further disposed after the formatter 360. The 3D processor (not shown) can process the brightness, tint, and color of the image signal to improve the 3D effect. For example, it is possible to perform signal processing such as making the near field clear and the far field blurred. On the other hand, the functions of such a 3D processor can be merged into the formatter 360 or merged into the image processing unit 320. This will be described later with reference to FIG. 6 and the like.

Meanwhile, the audio processing unit (not shown) in the control unit 170 can perform the audio processing of the demultiplexed audio signal. To this end, the audio processing unit (not shown) may include various decoders.

For example, if the demultiplexed speech signal is a coded speech signal, it can be decoded. Specifically, when the demultiplexed speech signal is an MPEG-2 standard encoded speech signal, it can be decoded by an MPEG-2 decoder. In addition, if the demultiplexed speech signal is a coded voice signal of the MPEG 4 BSAC (Bit Sliced Arithmetic Coding) standard according to a terrestrial DMB (Digital Multimedia Broadcasting) scheme, it can be decoded by an MPEG 4 decoder. Also, if the demultiplexed speech signal is an encoded audio signal of the AAC (Advanced Audio Codec) standard of MPEG 2 according to the satellite DMB scheme or DVB-H, it can be decoded by the AAC decoder. If the demultiplexed speech signal is a Dolby AC-3 standard encoded audio signal, it can be decoded by an AC-3 decoder.

In addition, the audio processing unit (not shown) in the control unit 170 can process bass, treble, volume control, and the like.

The data processing unit (not shown) in the control unit 170 can perform data processing of the demultiplexed data signal. For example, if the demultiplexed data signal is an encoded data signal, it can be decoded. The encoded data signal may be EPG (Electronic Program Guide) information including broadcast information such as the start time and end time of broadcast programs broadcast on each channel. For example, the EPG information may be ATSC-PSIP (ATSC Program and System Information Protocol) information in the case of the ATSC scheme, and may include DVB-SI (DVB Service Information) information in the case of the DVB scheme. The ATSC-PSIP information or the DVB-SI information may be information included in the above-mentioned stream, that is, in the MPEG-2 TS.

Meanwhile, in FIG. 3, the signals from the OSD generation unit 340 and the image processing unit 320 are mixed in the mixer 345 and then 3D-processed in the formatter 360, but the present invention is not limited thereto, and the mixer may be located after the formatter. That is, the output of the image processing unit 320 may be 3D-processed by the formatter 360, the OSD generation unit 340 may perform 3D processing together with OSD generation, and the respective processed 3D signals may then be mixed by the mixer 345.

Meanwhile, the block diagram of the controller 170 shown in FIG. 3 is a block diagram for an embodiment of the present invention. Each component of the block diagram can be integrated, added, or omitted according to the specifications of the control unit 170 actually implemented.

In particular, the frame rate converter 350 and the formatter 360 may not be provided in the controller 170 but may each be provided separately.

FIG. 6 is a diagram illustrating various scaling methods of a 3D image signal according to an exemplary embodiment of the present invention.

Referring to the drawing, in order to increase the 3-dimensional effect, the controller 170 may perform 3D effect signal processing. In particular, it is possible to perform a size adjustment or a tilt adjustment of a 3D object in a 3D image.

The 3D image signal, or the 3D object 510 in the 3D image signal, can be enlarged or reduced as a whole at a certain ratio (512), as shown in FIG. 6A, and the 3D object may also be partially enlarged or reduced into a trapezoidal shape (514, 516), as shown in FIGS. 6B and 6C. In addition, as shown in FIG. 6D, at least a portion of the 3D object may be rotated into a parallelogram shape (518). Such scaling or tilt adjustment can emphasize the three-dimensional effect of the 3D image or of the 3D object in the 3D image.

On the other hand, as the tilt becomes larger, the difference in length between the parallel sides of the trapezoidal shapes 514 and 516 of FIG. 6B or 6C becomes larger, or the rotation angle of the parallelogram shape 518 of FIG. 6D becomes larger.

The size adjustment or tilt adjustment may be performed after the 3D image signal is arranged in a predetermined format in the formatter 360, or may be performed in the scaler 235 of the image processing unit 320. Meanwhile, the OSD generation unit 340 may generate the OSD object in a shape as illustrated in FIG. 6 in order to enhance the 3D effect.

Although not shown in the drawing, it is also possible to perform signal processing for a 3D effect, such as adjustment of the brightness, tint, and color of an image signal or object, separately from or in addition to the size adjustment or tilt adjustment described above. For example, it is possible to perform signal processing such as making the near field clear and the far field blurred. The signal processing for the 3D effect may be performed in the controller 170 or may be performed through a separate 3D processor. In particular, when it is performed in the control unit 170, it may be performed in the formatter 360 or in the image processing unit 320 together with the above-described size adjustment or tilt adjustment.

FIG. 7 is a view for explaining how images are formed by a left eye image and a right eye image, and FIG. 8 is a view for explaining depths of a 3D image according to an interval between a left eye image and a right eye image.

First, referring to FIG. 7, a plurality of images or a plurality of objects 615, 625, 635, and 645 are illustrated.

First, the first object 615 includes a first left eye image 611 (L) based on a first left eye image signal and a first right eye image 613 (R) based on a first right eye image signal, and it is exemplified that the interval between the first left eye image 611 (L) and the first right eye image 613 (R) on the display 180 is d1. At this time, the user perceives an image as being formed at the intersection of an extension line connecting the left eye 601 and the first left eye image 611 and an extension line connecting the right eye 603 and the first right eye image 613. Accordingly, the user perceives the first object 615 as being positioned behind the display 180.

Next, since the second object 625 includes a second left eye image 621 (L) and a second right eye image 623 (R) that overlap each other on the display 180, the interval between them is 0. Accordingly, the user perceives the second object 625 as being located on the display 180.

Next, the third object 635 includes a third left eye image 631 (L) and a third right eye image 633 (R), the fourth object 645 includes a fourth left eye image 641 (L) and a fourth right eye image 643 (R), and their intervals are d3 and d4, respectively.

According to the method described above, the user perceives the third object 635 and the fourth object 645 at the positions where the respective images are formed, that is, in the drawing, in front of the display 180.

At this time, the fourth object 645 is perceived as being in front of the third object 635, that is, as protruding more than the third object 635; this is because the interval d4 between the fourth left eye image 641 (L) and the fourth right eye image 643 (R) is larger than the interval d3 between the third left eye image 631 (L) and the third right eye image 633 (R).

Meanwhile, in the embodiment of the present invention, the distance between the display 180 and the position of each object 615, 625, 635, 645 as perceived by the user is expressed as a depth. Accordingly, the depth of an object perceived as being positioned behind the display 180 is assumed to have a negative value (-), and the depth of an object perceived as being positioned in front of the display 180 is assumed to have a positive value (+). That is, the greater the degree of protrusion toward the user, the greater the depth.

Referring to FIG. 8, the interval a between the left eye image 701 and the right eye image 702 in FIG. 8A is smaller than the interval b between the left eye image 701 and the right eye image 702 in FIG. 8B, and accordingly the depth a' of the 3D object in FIG. 8A is smaller than the depth b' of the 3D object in FIG. 8B.

In this way, when the 3D image is exemplified as the left eye image and the right eye image, the positions recognized as images are different depending on the interval between the left eye image and the right eye image. Accordingly, by adjusting the display intervals of the left eye image and the right eye image, the depth of the 3D image or the 3D object composed of the left eye image and the right eye image can be adjusted.
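The relationship just described, in which the perceived position follows from the intersection of the two eye-to-image sight lines, can be worked out with similar triangles. The sketch below does so under the sign convention of this description; the interocular distance and viewing distance used are illustrative assumptions only.

// Sketch of perceived depth as a function of the left/right image interval
// (disparity), derived by similar triangles from the sight lines of FIG. 7.
// Sign convention as in the text: behind the display is negative, in front positive.
public final class DisparityDepth {
    static final double EYE_SEPARATION_MM = 65.0;     // assumed interocular distance
    static final double VIEWING_DISTANCE_MM = 3000.0; // assumed distance to the display

    /**
     * @param disparityMm right-eye image position minus left-eye image position
     *                    on the screen, in millimetres; positive (uncrossed)
     *                    means behind the display, negative (crossed) in front.
     */
    public static double perceivedDepthMm(double disparityMm) {
        // The sight lines meet at distance D * e / (e - d) from the eyes; the depth
        // relative to the screen is that distance minus D, negated so that positions
        // in front of the display come out positive.
        return -VIEWING_DISTANCE_MM * disparityMm / (EYE_SEPARATION_MM - disparityMm);
    }

    public static void main(String[] args) {
        System.out.println(perceivedDepthMm(0.0));    // 0: perceived on the display plane
        System.out.println(perceivedDepthMm(10.0));   // negative: behind the display
        System.out.println(perceivedDepthMm(-10.0));  // positive: in front of the display
    }
}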

FIGS. 9 to 11 are views referred to in explaining the image display apparatus according to an embodiment of the present invention.

Currently, some video display devices include a function of automatically detecting 3D content. However, the 3D auto detection function is turned on only when a 3D function such as 3D World or an HDMI input is in use, that is, only in the 3D mode. The reason the 3D auto detection function is not always turned on is its high detection error rate.

For example, if a 2D image contains a scene in which a window is arranged symmetrically, the scene can be mistaken for a 3D image and the device can be forced into the 3D mode. If a 2D image is forcibly changed to a 3D image, the user cannot see a proper image.

In recent years, a video display device has been connected to a network to freely use the Internet, and a smart function for installing and using various applications has been supported. In addition, there may be applications that support 3D among various applications.

For example, applications such as wallpapers, home launchers, and games may create side-by-side or top-bottom screens that can be viewed stereoscopically. For such applications, if the 3D auto detection function is turned on and side-by-side or top-bottom screens can be automatically converted to the 3D mode, convenience can be provided to the user.

An image display apparatus 100 according to an embodiment of the present invention includes a display 180 for displaying a 3D setting screen for a plurality of applications, a user input interface unit 150 for receiving a user's 3D setting input regarding at least one of the plurality of applications, a storage unit 140 for storing the settings, and a control unit 170 for setting and storing a setting corresponding to the 3D setting input for each application, or for setting and storing a setting corresponding to an activity of a predetermined application.

If the 3D auto detection function is turned on for all applications, an image that is not 3D may be forcibly converted to 3D, as described above.

In this case, the user is greatly inconvenienced and ends up using the image display device with the 3D auto detection function turned off.

Also, in this case, the user has the inconvenience of turning on the 3D mode every time the same 3D supporting application is executed.

However, according to the present invention, various 3D environment settings such as the 3D auto detection function can be set and stored for each application, or set and stored for each activity of a predetermined application.

The 3D setting input may be an input for setting at least one of whether to automatically switch to the 3D mode (that is, whether the 3D auto detection function is used) and a 3D format such as side-by-side or top-bottom.

In addition, the 3D setting input may be an input for setting the apparatus to switch to the 3D mode automatically when a set application is executed, or in response to a predetermined activity of the set application. In this case, the 3D setting input may further include an input for setting the format of the 3D image.

Accordingly, it is possible to provide the user with the convenience of automatic conversion to a stereoscopic 3D image from a specific screen of a predetermined application, or upon execution of the predetermined application.
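One way such per-application and per-activity 3D settings could be kept and looked up is sketched below; every class, field, and key name is hypothetical and only illustrates the kind of storage described above, with persistence omitted.

import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Sketch of storing 3D settings per application, or per activity of an application.
// All names are hypothetical; saving to non-volatile storage is omitted.
public final class ThreeDSettingStore {

    public enum Format3D { SIDE_BY_SIDE, TOP_BOTTOM, FRAME_SEQUENTIAL, INTERLACED, CHECKER_BOX }

    public static final class Setting3D {
        public final boolean autoDetectOn;   // whether the 3D auto detection function is enabled
        public final Format3D format;        // 3D image format to apply
        public Setting3D(boolean autoDetectOn, Format3D format) {
            this.autoDetectOn = autoDetectOn;
            this.format = format;
        }
    }

    private final Map<String, Setting3D> byApplication = new HashMap<>();
    private final Map<String, Setting3D> byActivity = new HashMap<>();  // key: appId + "/" + activity

    public void storeForApplication(String appId, Setting3D setting) {
        byApplication.put(appId, setting);
    }

    public void storeForActivity(String appId, String activityName, Setting3D setting) {
        byActivity.put(appId + "/" + activityName, setting);
    }

    /** Activity-level settings take precedence over application-level settings. */
    public Optional<Setting3D> lookup(String appId, String activityName) {
        Setting3D setting = byActivity.get(appId + "/" + activityName);
        if (setting == null) {
            setting = byApplication.get(appId);
        }
        return Optional.ofNullable(setting);
    }
}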

FIGS. 9 to 11 show various examples of the 3D setting screen.

The control unit 170 may control the display unit 180 to display the 3D setting screen when the user executes the setting menu or when the user switches to the 3D mode while the application is running.

According to an embodiment of the present invention, the user can manage which applications have the 3D auto detection function turned on.

Referring to the 3D setting screen 900 shown in FIG. 9, for an application that supports 3D, or that the user wishes to use in 3D, among the plurality of applications 910 installed in the video display device or available via the web, the 3D auto detection function item 930 can be set to on.

On the other hand, an application 920 that is not set to the 3D auto detection function item does not turn on the 3D auto detection function by default.

On the other hand, if the user manually switches to the 3D mode during execution of an application 920 that is not set as a 3D auto detection function item, the apparatus can ask the user whether to set that application as a 3D auto detection function item.

Meanwhile, an application set as a 3D automatic detection function item can be configured not to switch to the 3D mode unconditionally when the application is executed, but to turn on the 3D automatic detection function and switch to the 3D mode only from a specific scene.

For example, in the case of using an Android OS, each scene may be configured as an activity, and the activity may correspond to a specific scene of the predetermined application.

Meanwhile, transitions between activities can be detected at the framework level.

For example, when the user selects the GameMain activity 1043 from among a plurality of activities 1041, 1042, 1043, and 1044 on the detailed setting screen 1000 for a predetermined application 1010 as shown in FIG. 10, the control unit 170 can automatically control the apparatus to switch to the 3D mode when the GameMain activity is entered. Meanwhile, an object 1050 indicating the setting may be displayed on the selected activity 1043.

Alternatively, the user may set the 3D automatic detection and execution function to be turned on for an activity named GameStart (not shown), so that the 3D mode is turned on automatically when the GameStart activity is entered.
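
A minimal Kotlin sketch of such activity-level switching, under assumptions: Application.ActivityLifecycleCallbacks and registerActivityLifecycleCallbacks are real Android framework APIs, but the enable3DMode callback and the way the trigger activity name is obtained are hypothetical placeholders; the disclosure itself describes the detection as happening at the framework level of the display device.

    import android.app.Activity
    import android.app.Application
    import android.os.Bundle

    // Turns the 3D mode on when a specific activity (scene) is entered.
    class Auto3DSwitcher(
        private val triggerActivity: String,   // e.g. "GameMain" or "GameStart", as chosen by the user
        private val enable3DMode: () -> Unit   // placeholder for the device-side 3D-mode switch
    ) : Application.ActivityLifecycleCallbacks {

        override fun onActivityStarted(activity: Activity) {
            if (activity.javaClass.simpleName == triggerActivity) enable3DMode()
        }

        // The remaining callbacks are not needed for this sketch.
        override fun onActivityCreated(activity: Activity, savedInstanceState: Bundle?) {}
        override fun onActivityResumed(activity: Activity) {}
        override fun onActivityPaused(activity: Activity) {}
        override fun onActivityStopped(activity: Activity) {}
        override fun onActivitySaveInstanceState(activity: Activity, outState: Bundle) {}
        override fun onActivityDestroyed(activity: Activity) {}
    }

    // Registration, e.g. in Application.onCreate():
    // registerActivityLifecycleCallbacks(Auto3DSwitcher("GameMain") { /* turn the 3D mode on */ })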

Meanwhile, the detailed setting screen 1000 for the predetermined application 1010 in FIG. 10 may be a setting screen for the application 920 whose 3D automatic detection function item is not turned on.

For example, the user can select the manual setting item 1020, rather than the automatic setting item 1030, so that the apparatus switches to the 3D mode only on a specific screen instead of switching to the 3D mode immediately when the application is executed.

In addition, an embodiment of the present invention can recognize the user's usage pattern and set the point at which 3D should be turned on automatically. For example, on Android, if the user manually turns 3D on in an activity named GameStart, this point can be saved and used from then on.
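
A sketch (Kotlin, assumptions labeled) of how such a learned switch point might be persisted: SharedPreferences is a standard Android API, while the preference file name and key scheme are made up for illustration.

    import android.content.Context

    // Remember the activity in which the user manually turned 3D on, so that
    // future entries into that activity can switch to the 3D mode automatically.
    fun rememberManual3DSwitch(context: Context, packageName: String, activityName: String) {
        context.getSharedPreferences("auto_3d_points", Context.MODE_PRIVATE)
            .edit()
            .putString(packageName, activityName)   // e.g. "com.example.game" -> "GameStart"
            .apply()
    }

    // Look up the learned switch point for an application, if any.
    fun learned3DSwitchPoint(context: Context, packageName: String): String? =
        context.getSharedPreferences("auto_3d_points", Context.MODE_PRIVATE)
            .getString(packageName, null)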

Unlike the example shown in FIG. 10, even when an application is set as a 3D automatic detection function item, it can be configured to turn on the 3D automatic detection function and switch to the 3D mode only from a specific scene, instead of switching to the 3D mode unconditionally when the application is executed.

Referring to FIG. 11, the detailed setting screen 1100 for the application 1110 set as a 3D automatic detection function item may display a manual setting item 1120, an automatic setting item 1130, and a plurality of activities 1141, 1142, 1143, and 1144.

When the user selects a predetermined activity 1143, the control unit 170 can control the apparatus to switch to the 3D mode automatically when the selected activity is entered. An object 1150 indicating the setting may be displayed on the selected activity 1143.

Meanwhile, in an embodiment of the present invention, 3D formats such as side-by-side and top-bottom can also be set. For example, the user can set the 3D mode to the top-bottom format for an activity named GameStart, and the control unit 170 can automatically turn on the top-bottom 3D mode when the GameStart activity is entered.

In a conventional 3D TV, since the specifications of 3D image formats are not unified across 3D contents and the 3D image format information is not included, the viewer must select the 3D image format manually before viewing.

In addition, there is the inconvenience of having to select the 3D image format again every time the content is used.

Thus, an embodiment of the present invention may store the 3D image format setting for a given application and then automatically apply the stored 3D format setting when the same application is used again.
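
A companion read-side sketch, under the same assumptions (hypothetical preference store and enable3DMode placeholder): when the application is used again, the stored format is looked up and applied without asking the user.

    import android.content.Context

    // Apply a previously stored 3D image format for this application, if one exists.
    fun applyStored3DFormat(
        context: Context,
        packageName: String,
        enable3DMode: (format: String) -> Unit   // placeholder for a format-aware 3D-mode switch
    ) {
        val format = context.getSharedPreferences("app_3d_formats", Context.MODE_PRIVATE)
            .getString(packageName, null)        // e.g. "top_bottom"
        if (format != null) enable3DMode(format)
    }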

Meanwhile, the control unit 170 may control the setting to be stored for each logged-in user account.

An image display apparatus often has a plurality of users, such as family members, and the applications installed and the preferred settings may differ from user to user. In the home, the image display apparatus is generally shared by a plurality of users, but in some cases it may be used by a single user alone.

Therefore, by storing and managing settings per logged-in user account, it is possible to prevent one user's preferences from being changed by another user, and to avoid inconveniencing other users of the image display apparatus.
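
One plausible way to keep the stored 3D settings separate per logged-in account, sketched in Kotlin; the account-suffixed preference file name is an assumption, not the disclosed design.

    import android.content.Context
    import android.content.SharedPreferences

    // Namespace the 3D settings by account so one user's changes do not affect another's.
    fun settingsForAccount(context: Context, accountId: String): SharedPreferences =
        context.getSharedPreferences("3d_settings_$accountId", Context.MODE_PRIVATE)

    // Example: settingsForAccount(context, "user_a").edit()
    //     .putBoolean("com.example.game.autoDetect3D", true).apply()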

Meanwhile, the control unit 170 may control the stored settings to be transmitted to a server connected through a network. The control unit 170 can transmit the user's 3D setting details to an external server or an electronic device through the network interface unit 135.

Meanwhile, the 3D mode-related settings (automatic detection and execution, 3D format, 3D on, etc.) can be defined in advance for each application version and package name and managed by the server, and the image display apparatus can receive and use them. The usage pattern can also be uploaded to the server.
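
A sketch of the kind of record such a server could manage per application version and package name (the field names and JSON-like profile are assumptions; the disclosure does not specify a transport or schema).

    // Hypothetical server-managed entry: 3D-related defaults keyed by package name and version.
    data class Server3DProfile(
        val packageName: String,              // e.g. "com.example.game"
        val versionCode: Long,                // application version the profile applies to
        val autoDetect3D: Boolean,            // automatic detection and execution
        val format: String? = null,           // "side_by_side", "top_bottom", ...
        val triggerActivity: String? = null   // scene from which the 3D mode is turned on
    )

    // The display device would download matching profiles through its network interface
    // and could upload the user's own usage pattern; both flows are assumptions in this sketch.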

According to the present invention, when 3D contents, in particular applications supporting 3D, are used, 3D-related settings can be made simply and conveniently, thereby improving usability for the user.

The image display apparatus and the operating method thereof according to the present invention are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operating method of the image display apparatus of the present invention can be implemented as processor-readable code on a recording medium readable by a processor included in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices in which data that can be read by the processor is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the method may also be implemented in the form of a carrier wave such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that the processor-readable code can be stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments; it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

110: tuner unit 120: demodulation unit
130: external device interface unit 135: network interface unit
140: storage unit 150: user input interface unit
170: control unit 180: display
200: remote control device

Claims (8)

A display for displaying a 3D setting screen for a plurality of applications;
An interface for receiving a user's 3D setting input regarding at least one of the plurality of applications;
A storage unit for storing a setting corresponding to the 3D setting input; and
A controller for setting and storing the setting corresponding to the 3D setting input for each application, or for each activity of a predetermined application.
The image display apparatus according to claim 1,
Wherein the 3D setting input is an input for setting at least one of an automatic conversion into a 3D mode, a 3D format, and a conversion time into a 3D mode.
The image display apparatus according to claim 1,
Wherein the 3D setting input is an input for setting to switch to the 3D mode automatically when the set application is executed or in correspondence with a predetermined activity of the set application.
The image display apparatus according to claim 3,
Wherein the 3D setting input further comprises an input for setting a format of a 3D image.
The image display apparatus according to claim 1,
Wherein the control unit controls the setting to be stored for each logged-in user account.
The image display apparatus according to claim 1,
Wherein the control unit controls the display to display the 3D setting screen when the user executes a setting menu or when the user switches to the 3D mode while a predetermined application is being executed.
The image display apparatus according to claim 1,
Wherein the activity corresponds to a specific scene of the predetermined application.
The image display apparatus according to claim 1,
Wherein the control unit controls to transmit the stored setting to a server connected through a network.
KR1020130030885A 2013-03-22 2013-03-22 Image display apparatus KR20140115786A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130030885A KR20140115786A (en) 2013-03-22 2013-03-22 Image display apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130030885A KR20140115786A (en) 2013-03-22 2013-03-22 Image display apparatus

Publications (1)

Publication Number Publication Date
KR20140115786A true KR20140115786A (en) 2014-10-01

Family

ID=51990033

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130030885A KR20140115786A (en) 2013-03-22 2013-03-22 Image display apparatus

Country Status (1)

Country Link
KR (1) KR20140115786A (en)

Similar Documents

Publication Publication Date Title
US8803954B2 (en) Image display device, viewing device and methods for operating the same
US8896672B2 (en) Image display device capable of three-dimensionally displaying an item or user interface and a method for operating the same
KR101349276B1 (en) Video display device and operating method therefor
US8797390B2 (en) Image display device, 3D viewing device, and method for operating the same
KR20110052771A (en) Image display device and operating method for the same
KR20120011254A (en) Method for operating an apparatus for displaying image
KR20110082380A (en) Apparatus for displaying image and method for operating the same
US20130291017A1 (en) Image display apparatus and method for operating the same
KR20130010277A (en) Method for operating an apparatus for displaying image
KR101730424B1 (en) Image display apparatus and method for operating the same
KR20120062428A (en) Image display apparatus, and method for operating the same
KR101730323B1 (en) Apparatus for viewing image image display apparatus and method for operating the same
KR101716144B1 (en) Image display apparatus, and method for operating the same
KR20140115786A (en) Image display apparatus
KR101702968B1 (en) Operating an Image Display Device
KR101638536B1 (en) Image Display Device and Controlling Method for the Same
KR101176500B1 (en) Image display apparatus, and method for operating the same
KR101737367B1 (en) Image display apparatus and method for operating the same
KR20120034836A (en) Image display apparatus, and method for operating the same
KR101691801B1 (en) Multi vision system
KR20110134087A (en) Image display apparatus and method for operating the same
KR101730423B1 (en) Apparatus for displaying image and method for operating the same
KR20120054324A (en) Method for operating an apparatus for displaying image
KR102014149B1 (en) Image display apparatus, and method for operating the same
KR20110114295A (en) Apparatus for viewing 3d image and method for operating the same

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination