KR20130030601A - Image display apparatus, and method for operating the same - Google Patents

Image display apparatus, and method for operating the same Download PDF

Info

Publication number
KR20130030601A
KR20130030601A
Authority
KR
South Korea
Prior art keywords
image
display
eye image
left eye
right eye
Prior art date
Application number
KR1020110094191A
Other languages
Korean (ko)
Inventor
장효상
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020110094191A
Publication of KR20130030601A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30: Image reproducers
    • H04N 13/332: Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/341: Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
    • H04N 13/344: Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

PURPOSE: An image display device and an operating method thereof are provided to allow a plurality of 3D images to be watched on one display, by displaying a first 3D image and a second 3D image on the display at different times according to a multiple 3D image display mode.

CONSTITUTION: A control unit enters a multiple 3D image display mode in response to a user input (S1310). Under control of a formatter, the control unit arranges a first 3D image into a first left eye image and a first right eye image (S1320), and arranges a second 3D image into a second left eye image and a second right eye image (S1330). Under control of a display, the control unit displays the first 3D image and the second 3D image at different times (S1340).

[Reference numerals] (AA) Start; (BB) End; (S1310) Entering a multiple 3D image display mode; (S1320) Arranging a first 3D image into a first left eye image and a first right eye image; (S1330) Arranging a second 3D image into a second left eye image and a second right eye image; (S1340) Displaying the first and second 3D images at different times

Description

[0001] The present invention relates to an image display apparatus and a method of operating the same.

BACKGROUND OF THE INVENTION 1. Field of the Invention: The present invention relates to an image display apparatus and an operating method thereof, and more particularly, to an image display apparatus and an operating method thereof that can improve user convenience.

An image display device is a device having a function of displaying an image that a user can watch, and the user can watch broadcasts through it. The image display device displays, on a display, a broadcast selected by the user from among the broadcast signals transmitted by broadcast stations. Currently, broadcasting is shifting from analog to digital broadcasting worldwide.

Digital broadcasting refers to broadcasting that transmits digital video and audio signals. Because digital broadcasting is more resistant to external noise than analog broadcasting, it suffers less data loss, is advantageous for error correction, offers higher resolution, and provides a clearer picture. In addition, unlike analog broadcasting, digital broadcasting can provide bidirectional services.

It is an object of the present invention to provide an image display apparatus and an operating method thereof that can improve user convenience.

Another object of the present invention is to provide an image display apparatus and a method of operating the same, capable of viewing a plurality of 3D images using one display.

In accordance with an aspect of the present invention, there is provided a method of operating an image display apparatus, the method comprising: entering a multiple 3D image display mode; arranging, according to the mode, a first 3D image into a first left eye image and a first right eye image; arranging, according to the mode, a second 3D image into a second left eye image and a second right eye image; and displaying the first 3D image and the second 3D image on a display at different times.

In addition, a method of operating an image display apparatus according to an embodiment of the present invention for achieving the above object comprises: entering a multiple image display mode including a 3D image; arranging, according to the mode, the 3D image into a first left eye image and a first right eye image; arranging a 2D image according to the mode; and displaying the 3D image and the 2D image on a display at different times.

In addition, an image display apparatus according to an embodiment of the present invention for achieving the above object includes a formatter that, in a multiple 3D image display mode, arranges a first 3D image into a first left eye image and a first right eye image and arranges a second 3D image into a second left eye image and a second right eye image, and a display that displays the first 3D image and the second 3D image at different times.

In addition, an image display apparatus according to an embodiment of the present invention for achieving the above object includes a formatter that, in a multiple image display mode including a 3D image, arranges the 3D image into a first left eye image and a first right eye image and arranges a 2D image according to the mode, and a display that displays the 3D image and the 2D image at different times.

According to an embodiment of the present invention, the first 3D image and the second 3D image are displayed on the display at different times according to the multiple 3D image display mode, so that a plurality of 3D images can be watched using one display. Thus, user convenience can be increased.

In particular, a user wearing the first viewing device can watch only the first 3D image, and a user wearing the second viewing device can watch only the second 3D image, so that each user can watch the desired 3D image.

Meanwhile, the first left eye image and the first right eye image of the first 3D image may be displayed at different times in a first area of the display, and the second left eye image and the second right eye image of the second 3D image may be displayed at different times in a second area of the display, so that a plurality of 3D images can be watched simply using one display.
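The time-multiplexed arrangement described in the embodiments above can be sketched as follows. This is a hypothetical illustration only; all function and variable names are invented for the example and do not come from the patent.

```python
def arrange_3d(image_id):
    """Arrange one 3D image into its left eye and right eye images."""
    return [f"{image_id}_L", f"{image_id}_R"]

def frame_sequence(first_3d, second_3d, cycles=2):
    """Interleave two 3D images so each is shown on the single display
    at a different time, as in the multiple 3D image display mode."""
    frames = []
    for _ in range(cycles):
        frames += arrange_3d(first_3d)   # first time period: first 3D image
        frames += arrange_3d(second_3d)  # second time period: second 3D image
    return frames

print(frame_sequence("img1", "img2", cycles=1))
# ['img1_L', 'img1_R', 'img2_L', 'img2_R']
```

A viewing device synchronized to the first time period would then pass only the `img1` frames, and one synchronized to the second period only the `img2` frames.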

In addition, according to another embodiment of the present invention, a 3D image and a 2D image are displayed on the display at different times according to a multiple image display mode including a 3D image, so that a plurality of images can be watched using a single display. Thus, user convenience can be increased.

In particular, a user wearing the first viewing device can watch only the 3D image, and a user wearing the second viewing device can watch only the 2D image, so that each user can watch a desired image.

FIG. 1 is a block diagram illustrating an image display apparatus according to an exemplary embodiment of the present invention.
FIGS. 2A and 2B are internal block diagrams of a set-top box and a display device according to an embodiment of the present invention.
FIG. 3 is an internal block diagram of the controller of FIG. 1.
FIG. 4 is a diagram illustrating various formats of a 3D image.
FIG. 5 is a diagram illustrating an operation of a viewing apparatus according to the formats of FIG. 4.
FIG. 6 is a diagram illustrating various scaling methods of a 3D video signal according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating how an image is formed by a left eye image and a right eye image.
FIG. 8 is a diagram illustrating the depth of a 3D image according to the distance between a left eye image and a right eye image.
FIG. 9 is a diagram illustrating a control method of the remote controller of FIG. 1.
FIG. 10 is an internal block diagram of the remote control device of FIG. 1.
FIG. 11 is a diagram illustrating a 3D viewing apparatus and an image display apparatus according to an exemplary embodiment of the present invention.
FIG. 12 is an internal block diagram of the 3D viewing apparatus and the image display apparatus of FIG. 11.
FIG. 13 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention.
FIGS. 14 to 20 are diagrams referred to for describing various examples of the operating method of the image display apparatus of FIG. 13.
FIG. 21 is a flowchart illustrating an operating method of an image display apparatus according to another embodiment of the present invention.
FIGS. 22 to 28 are diagrams referred to for describing various examples of the operating method of the image display apparatus of FIG. 21.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffixes "module" and "part" for components used in the following description are given merely for convenience of description and do not carry any special meaning or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 is a block diagram illustrating an image display apparatus according to an exemplary embodiment of the present invention.

Referring to FIG. 1, the image display apparatus 100 according to an exemplary embodiment of the present invention may include a broadcast receiver 105, an external device interface unit 130, a storage unit 140, a user input interface unit 150, a viewing device interface unit 160, a sensor unit (not shown), a controller 170, a display 180, an audio output unit 185, and a 3D viewing device 195.

The broadcast receiver 105 may include a tuner unit 110, a demodulator 120, and a network interface unit 135. Of course, as necessary, the broadcast receiver 105 may be designed to include the tuner unit 110 and the demodulator 120 without the network interface unit 135, or conversely to include the network interface unit 135 without the tuner unit 110 and the demodulator 120.

The tuner unit 110 selects an RF broadcast signal corresponding to a channel selected by the user, or to all pre-stored channels, from among the RF (Radio Frequency) broadcast signals received through an antenna, and converts the selected RF broadcast signal into an intermediate frequency signal or a baseband video or audio signal.

For example, if the selected RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF). If the selected RF broadcast signal is an analog broadcast signal, it is converted into an analog baseband image or voice signal (CVBS / SIF). That is, the tuner 110 may process a digital broadcast signal or an analog broadcast signal. The analog baseband video or audio signal CVBS / SIF output from the tuner 110 may be directly input to the controller 170.
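The tuner's two conversion paths can be summarized with a small hypothetical sketch (the function name and signal labels are illustrative, not from the patent):

```python
def tuner_convert(rf_signal_kind):
    """Model the tuner unit 110: a digital RF broadcast signal becomes a
    digital IF signal (DIF) for the demodulator, while an analog RF
    broadcast signal becomes an analog baseband video/audio signal
    (CVBS/SIF) that can be input directly to the controller."""
    if rf_signal_kind == "digital":
        return "DIF"        # routed to the demodulator 120
    elif rf_signal_kind == "analog":
        return "CVBS/SIF"   # routed directly to the controller 170
    raise ValueError("unknown RF broadcast signal kind")

print(tuner_convert("digital"))  # DIF
print(tuner_convert("analog"))   # CVBS/SIF
```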

In addition, the tuner unit 110 may receive a single-carrier RF broadcast signal according to the ATSC (Advanced Television System Committee) scheme or a multi-carrier RF broadcast signal according to the DVB (Digital Video Broadcasting) scheme.

Meanwhile, the tuner unit 110 may sequentially select the RF broadcast signals of all broadcast channels stored through a channel memory function from among the RF broadcast signals received through the antenna, and convert them into intermediate frequency signals or baseband video or audio signals.

On the other hand, the tuner unit 110 may be provided with a plurality of tuners in order to receive broadcast signals of a plurality of channels. Alternatively, a single tuner may be used to receive broadcast signals of multiple channels simultaneously.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

The demodulation unit 120 may perform demodulation and channel decoding, and then output a stream signal TS. In this case, the stream signal may be a signal multiplexed with a video signal, an audio signal, or a data signal.

The stream signal output from the demodulator 120 may be input to the controller 170. After performing demultiplexing, image / audio signal processing, and the like, the controller 170 outputs an image to the display 180 and outputs audio to the audio output unit 185.

The external device interface unit 130 can transmit or receive data to or from a connected external device 190. To this end, the external device interface unit 130 may include an A/V input/output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 130 may be connected by wire or wirelessly to an external device such as a DVD (Digital Versatile Disk) player, a Blu-ray player, a game device, a camera, a camcorder, a computer (laptop), or a set-top box, and may perform input/output operations with that external device.

The A / V input / output unit may receive a video and audio signal of an external device. The wireless communication unit may perform short range wireless communication with another electronic device.

The network interface unit 135 provides an interface for connecting the image display apparatus 100 to a wired / wireless network including an internet network. For example, the network interface unit 135 may receive content or data provided by the Internet or a content provider or a network operator through a network.

The storage 140 may store a program for processing and controlling each signal in the controller 170, or may store a signal-processed video, audio, or data signal.

In addition, the storage unit 140 may perform a function for temporarily storing an image, audio, or data signal input to the external device interface unit 130. In addition, the storage 140 may store information on a predetermined broadcast channel through a channel storage function such as a channel map.

Although the storage unit 140 of FIG. 1 is provided separately from the control unit 170, the scope of the present invention is not limited thereto. The storage 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the control unit 170 or a signal from the control unit 170 to the user.

For example, the user input interface unit 150 may transmit and receive user input signals such as power on/off, channel selection, and screen setting to and from the remote controller 200; may transfer user input signals entered through local keys (not shown) such as a power key, a channel key, a volume key, and a setting key to the controller 170; may transfer a user input signal from a sensor unit (not shown) that senses the user's gesture to the controller 170; or may transmit a signal from the controller 170 to the sensor unit (not shown).

The viewing apparatus interface unit 160 may transmit or receive a data signal to or from the 3D viewing apparatus 195. For example, when the 3D viewing apparatus 195 is turned on, the viewing apparatus interface unit 160 may transmit a pairing signal to the 3D viewing apparatus 195 and receive a pairing response signal from the 3D viewing apparatus 195.

In addition, the viewing apparatus interface unit 160 may transmit a synchronization signal related to turn-on timing of the left eye glass and the right eye glass of the 3D viewing apparatus 195 to the 3D viewing apparatus 195.

The controller 170 may demultiplex a stream input through the tuner unit 110, the demodulator 120, or the external device interface unit 130, and may process the demultiplexed signals to generate and output signals for video or audio output.

The image signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the image signal. In addition, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

The voice signal processed by the controller 170 may be sound output to the audio output unit 185. In addition, the voice signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

Although not shown in FIG. 1, the controller 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 3.

In addition, the controller 170 may control overall operations of the image display apparatus 100. For example, the controller 170 may control the tuner unit 110 to select an RF broadcast corresponding to a channel selected by the user or a pre-stored channel.

In addition, the controller 170 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

The controller 170 may control the display 180 to display an image. In this case, the image displayed on the display 180 may be a still image or a video, and may be a 2D image or a 3D image.

Meanwhile, the controller 170 may generate a 3D object for a predetermined 2D object among the images displayed on the display 180, and display the 3D object. For example, the object may be at least one of a connected web screen (newspaper, magazine, etc.), an EPG (Electronic Program Guide), various menus, widgets, icons, still images, videos, and text.

Such a 3D object may be processed to have a different depth from the image displayed on the display 180. Preferably, the 3D object may be processed to appear to protrude from the image displayed on the display 180.

The controller 170 may recognize a location of a user based on an image photographed by a photographing unit (not shown). For example, the distance (z-axis coordinate) between the user and the image display apparatus 100 may be determined. In addition, the x-axis coordinates and the y-axis coordinates in the display 180 corresponding to the user position may be determined.

On the other hand, although not shown in the figure, a channel browsing processor for generating a thumbnail image corresponding to a channel signal or an external input signal may be further provided. The channel browsing processor may receive a stream signal TS output from the demodulator 120 or a stream signal output from the external device interface unit 130, extract an image from the input stream signal, and generate a thumbnail image. The generated thumbnail image may be input to the controller 170. The controller 170 may display a thumbnail list including a plurality of thumbnail images on the display 180 using the input thumbnail images.

At this time, the thumbnail list may be displayed in a simple view mode displayed on a partial area in a state where a predetermined image is displayed on the display 180, or in a full viewing mode displayed in most areas of the display 180. The thumbnail images in the thumbnail list can be sequentially updated.

The display 180 converts the video signal, data signal, and OSD signal processed by the controller 170, or the video signal, data signal, and control signal received from the external device interface unit 130, to generate a driving signal.

The display 180 may be a PDP, an LCD, an OLED, a flexible display, or a 3D display.

For 3D image viewing, the display 180 may use an additional display method or an independent display method.

The independent display method implements 3D image viewing on the display 180 alone, without a separate additional display such as glasses; for example, various methods such as a lenticular method and a parallax barrier method may be applied.

Meanwhile, the additional display method implements 3D image viewing by using an additional display, namely the 3D viewing device 195, in addition to the display 180; for example, various methods such as a head mounted display (HMD) type and a glasses type may be applied.

On the other hand, the glasses type may be divided into a passive type such as polarized glasses and an active type such as shutter glasses. The head mounted display type may likewise be divided into passive and active types.

The 3D viewing apparatus 195 may be 3D glasses capable of viewing stereoscopic images. The 3D glasses 195 may include passive polarized glasses or active shutter glasses, and the aforementioned head-mounted type is understood to be included as well.

For example, when the 3D viewing apparatus 195 is a polarized glass, the left eye glass may be implemented as a left eye polarized glass, and the right eye glass may be implemented as a right eye polarized glass.

As another example, when the 3D viewing apparatus 195 is a shutter glass, the left eye glass and the right eye glass may be opened and closed alternately with each other.

Meanwhile, the 3D viewing apparatus 195 may enable each user to view a different image.

For example, when the 3D viewing device 195 is of the polarized glass type, each viewing device may be implemented with a single polarization for both eyes. That is, both the left eye glass and the right eye glass of the first viewing device 195a may be implemented with the left eye polarization, and both the left eye glass and the right eye glass of the second viewing device 195b may be implemented with the right eye polarization.

As another example, when the 3D viewing apparatus 195 is of the shutter glass type, both glasses of a device may be opened and closed at the same time. That is, both the left eye glass and the right eye glass of the first viewing device 195a may be opened during a first time period and closed during a second time period, while both the left eye glass and the right eye glass of the second viewing device 195b may be closed during the first time period and opened during the second time period.
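The per-device shutter timing just described can be modeled with a short hypothetical sketch (the function and device labels are illustrative, not from the patent): both glasses of a viewing device open together while that device's image is on screen, and stay closed otherwise.

```python
def shutter_state(device, period):
    """Return (left_open, right_open) for a viewing device during a
    display time period: period 0 shows the first 3D image, period 1
    shows the second 3D image."""
    open_now = (device == "first" and period % 2 == 0) or \
               (device == "second" and period % 2 == 1)
    return (open_now, open_now)  # both eye glasses switch together

print(shutter_state("first", 0))   # (True, True): first device sees image 1
print(shutter_state("second", 0))  # (False, False): second device blocked
```

Contrast this with ordinary single-viewer shutter operation, where the left and right glasses of one device alternate with each other instead of switching together.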

 The display 180 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 185 receives a signal processed by the controller 170 and outputs the audio signal.

A photographing unit (not shown) photographs the user. The photographing unit may be implemented by a single camera, but the present invention is not limited thereto, and it may be implemented by a plurality of cameras. The photographing unit may be embedded in the image display apparatus 100 above the display 180, or disposed separately. Image information captured by the photographing unit may be input to the controller 170.

The controller 170 may detect a user's gesture based on each of the images captured by the photographing unit (not shown) or the sensed signal from the sensor unit (not shown) or a combination thereof.

The remote control apparatus 200 transmits the user input to the user input interface unit 150. To this end, the remote control apparatus 200 can use Bluetooth, RF (radio frequency) communication, infrared (IR) communication, UWB (Ultra Wideband), ZigBee, or the like. In addition, the remote control apparatus 200 may receive an image, an audio or a data signal output from the user input interface unit 150, and display or output the audio from the remote control apparatus 200.

Meanwhile, the above-described image display apparatus 100 may be a digital broadcast receiver capable of receiving fixed or mobile digital broadcasting.

Meanwhile, the video display device described in this specification can be applied to a TV set, a monitor, a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (personal digital assistant), a portable multimedia player (PMP), and the like.

Meanwhile, a block diagram of the image display apparatus 100 shown in FIG. 1 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 that is actually implemented. That is, two or more constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.

On the other hand, unlike the illustration in FIG. 1, the image display apparatus 100 may not include the tuner 110 and the demodulator 120, and may instead receive and play back image content through the network interface unit 135 or the external device interface unit 130.

The image display apparatus 100 is an example of an image signal processing apparatus that performs signal processing on a stored or input image. Other examples of such an image signal processing apparatus include a set-top box in which the display 180 and the audio output unit 185 of FIG. 1 are excluded, the above-described DVD player, a Blu-ray player, a game device, a computer, and the like. Among these, the set-top box will be described with reference to FIGS. 2A and 2B below.

2A to 2B are internal block diagrams of a set-top box and a display device according to an embodiment of the present invention.

First, referring to FIG. 2A, the set-top box 250 and the display apparatus 300 may transmit or receive data by wire or wirelessly. Hereinafter, a description will be given focusing on differences from FIG. 1.

The set top box 250 may include a network interface unit 255, a storage unit 258, a signal processor 260, a user input interface unit 263, and an external device interface unit 265.

The network interface unit 255 provides an interface for connecting to a wired / wireless network including an internet network. It is also possible to transmit or receive data with other users or other electronic devices via the connected network or another network linked to the connected network.

The storage unit 258 may store a program for processing and controlling signals in the signal processing unit 260, and may include an image, audio, or data input from the external device interface unit 265 or the network interface unit 255. It may also serve as a temporary storage of the signal.

The signal processor 260 performs signal processing on the input signal. For example, demultiplexing or decoding of an input video signal may be performed, and demultiplexing or decoding of an input audio signal may be performed. To this end, a video decoder or an audio decoder may be provided. The signal processed video signal or audio signal may be transmitted to the display apparatus 300 through the external device interface unit 265.

The user input interface unit 263 transmits a signal input by the user to the signal processor 260 or transmits a signal from the signal processor 260 to the user. For example, various control signals, such as power on / off, operation input, setting input, etc., which are input through a local key (not shown) or the remote control apparatus 200, may be received and transmitted to the signal processor 260.

The external device interface unit 265 provides an interface for data transmission or reception with an external device connected by wire or wirelessly. In particular, an interface for transmitting or receiving data with the display apparatus 300 is provided. It is also possible to provide an interface for data transmission or reception with an external device such as a game device, a camera, a camcorder, a computer (notebook computer) or the like.

The set-top box 250 may further include a media input unit (not shown) for playing separate media. An example of such a media input unit is a Blu-ray input unit (not shown); that is, the set-top box 250 may be provided with a Blu-ray player or the like. Input media such as a Blu-ray disc may, after signal processing such as demultiplexing or decoding in the signal processor 260, be transmitted to the display apparatus 300 through the external device interface unit 265 for display.

The display apparatus 300 includes a broadcast receiver 272, an external device interface unit 273, a storage unit 278, a controller 280, a user input interface unit 283, a display 290, and an audio output unit 295.

The broadcast receiving unit 272 may include a tuner unit 270 and a demodulation unit 275.

The tuner 270, the demodulator 275, the storage 278, the controller 280, the user input interface 283, the display 290, and the audio output unit 295 correspond to the tuner 110, the demodulator 120, the storage 140, the controller 170, the user input interface 150, the display 180, and the audio output unit 185 described with reference to FIG. 1, and thus a repeated description is omitted.

The external device interface unit 273 provides an interface for data transmission or reception with an external device connected by wire or wirelessly. In particular, it provides an interface for transmitting or receiving data with the set-top box 250.

Accordingly, the video signal or audio signal input through the set-top box 250 is output through the display 290 or the audio output unit 295 via the controller 280.

Next, referring to FIG. 2B, the set-top box 250 and the display apparatus 300 are the same as those of FIG. 2A, except that the broadcast receiving unit 272 is located in the set-top box 250 rather than in the display apparatus 300. The broadcast receiving unit 272 also differs in that it further includes the network interface unit 255. Only these differences are described below.

The signal processor 260 may perform signal processing of a broadcast signal received through the tuner 270 and the demodulator 275. In addition, the user input interface unit 263 may receive an input such as channel selection and channel storage.

Meanwhile, although an audio output unit such as the audio output unit 185 of FIG. 1 is not illustrated in the set-top box 250 of FIGS. 2A and 2B, a separate audio output unit may also be provided.

FIG. 3 is an internal block diagram of the controller of FIG. 1, FIG. 4 is a diagram illustrating various formats of a 3D image, and FIG. 5 is a diagram illustrating an operation of a viewing apparatus according to the formats of FIG. 4.

Referring to the drawings, the controller 170 according to an embodiment of the present invention may include a demultiplexer 310, an image processor 320, a processor 330, an OSD generator 340, a mixer 345, a frame rate converter 350, and a formatter 360. It may further include an audio processor (not shown) and a data processor (not shown).

The demultiplexer 310 demultiplexes an input stream. For example, when an MPEG-2 TS is input, it may be demultiplexed and separated into video, audio, and data signals, respectively. The stream signal input to the demultiplexer 310 may be a stream signal output from the tuner 110 or the demodulator 120 or the external device interface 130.
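As a rough, hypothetical sketch of what the demultiplexer 310 does (the packet model and function name are invented for illustration), an input MPEG-2 TS multiplexes video, audio, and data signals, and demultiplexing separates them into their respective streams:

```python
def demultiplex(ts_packets):
    """Separate (kind, payload) transport packets into video, audio,
    and data streams, as the demultiplexer 310 does for an MPEG-2 TS."""
    streams = {"video": [], "audio": [], "data": []}
    for kind, payload in ts_packets:
        streams[kind].append(payload)
    return streams

ts = [("video", "V0"), ("audio", "A0"), ("video", "V1"), ("data", "D0")]
print(demultiplex(ts))
# {'video': ['V0', 'V1'], 'audio': ['A0'], 'data': ['D0']}
```

A real transport stream demultiplexer would of course key on PIDs and table sections rather than pre-labeled tuples; this only illustrates the separation step.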

The image processor 320 may perform image processing of the demultiplexed image signal. To this end, the image processor 320 may include a video decoder 225 and a scaler 235.

The image decoder 225 decodes the demultiplexed image signal, and the scaler 235 performs scaling to output the resolution of the decoded image signal on the display 180.

The video decoder 225 may include a decoder of various standards.

On the other hand, the image signal decoded by the image processing unit 320 can be divided into a case where there is only a 2D image signal, a case where a 2D image signal and a 3D image signal are mixed, and a case where there is only a 3D image signal.

For example, an external video signal input from the external device 190 or a broadcast video signal received from the tuner unit 110 may include only a 2D video signal, may include a mixture of a 2D video signal and a 3D video signal, or may include only a 3D video signal. Accordingly, the controller 170, particularly the image processor 320, may process these signals so that a 2D video signal, a mixed signal of a 2D video signal and a 3D video signal, or a 3D video signal is output.

The image signal decoded by the image processor 320 may be a 3D image signal having various formats. For example, the image may be a 3D image signal including a color image and a depth image, or may be a 3D image signal including a plurality of view image signals. The plurality of viewpoint image signals may include, for example, a left eye image signal and a right eye image signal.

Here, as shown in FIG. 4, the format of the 3D video signal may be a side-by-side format (FIG. 4A) in which the left eye video signal L and the right eye video signal R are arranged left and right, a top/down format (FIG. 4B) in which they are arranged up and down, a frame sequential format (FIG. 4C) in which they are arranged by time division, an interlaced format (FIG. 4D) in which the left eye video signal and the right eye video signal are mixed line by line, or a checker box format (FIG. 4E) in which the left eye video signal and the right eye video signal are mixed box by box.
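Two of the packing arrangements above can be illustrated with a short sketch, using small nested lists as stand-in frames. This is a toy under the assumption that both eye images have the same dimensions; a real formatter would also rescale each eye's image, which is omitted here.

```python
# Illustrative sketch of the side-by-side and top/down 3D frame-packing
# formats: the left eye image and right eye image are combined into a
# single frame, horizontally or vertically.

def pack_side_by_side(left, right):
    """Left eye image in the left half, right eye image in the right half."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def pack_top_down(left, right):
    """Left eye image in the top half, right eye image in the bottom half."""
    return left + right

L = [["L", "L"], ["L", "L"]]   # 2x2 left eye "frame"
R = [["R", "R"], ["R", "R"]]   # 2x2 right eye "frame"

sbs = pack_side_by_side(L, R)  # 2 rows x 4 columns
td = pack_top_down(L, R)       # 4 rows x 2 columns
```

The frame sequential format, by contrast, keeps the two images as separate full frames displayed alternately in time.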

The processor 330 may control overall operations in the image display apparatus 100 or the controller 170. For example, the processor 330 may control the tuner 110 to control tuning of an RF broadcast corresponding to a channel selected by a user or a previously stored channel.

In addition, the processor 330 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

In addition, the processor 330 may perform data transmission control with the network interface unit 135 or the external device interface unit 130.

The processor 330 may control operations of the demultiplexing unit 310, the image processing unit 320, the OSD generating unit 340, and the like in the controller 170.

The OSD generator 340 generates an OSD signal according to a user input or on its own. For example, based on a user input signal, it may generate a signal for displaying various types of information as graphics or text on the screen of the display 180. The generated OSD signal may include various data such as a user interface screen of the image display apparatus 100, various menu screens, widgets, and icons. In addition, the generated OSD signal may include a 2D object or a 3D object.

In addition, the OSD generator 340 may generate a pointer that can be displayed on the display based on a pointing signal input from the remote controller 200. In particular, such a pointer may be generated by a pointing signal processor (not shown), which may be included in the OSD generator 340. Of course, the pointing signal processor (not shown) may also be provided separately rather than within the OSD generator 340.

The mixer 345 may mix the OSD signal generated by the OSD generator 340 and the decoded image signal processed by the image processor 320. In this case, the OSD signal and the decoded video signal may each include at least one of a 2D signal and a 3D signal. The mixed video signal is provided to the frame rate converter 350.

A frame rate converter (FRC) 350 can convert the frame rate of an input image. On the other hand, the frame rate converter 350 can output the data as it is without additional frame rate conversion.
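One simple conversion strategy an FRC block might use, frame repetition, can be sketched as follows. This is a hedged illustration: real converters typically synthesize interpolated frames rather than repeating, and the function name is invented.

```python
# Minimal sketch of frame rate conversion by frame repetition, e.g.
# 60 Hz -> 120 Hz. Only integer upconversion ratios are handled.

def convert_frame_rate(frames, in_fps, out_fps):
    """Repeat each input frame out_fps/in_fps times (assumes an integer ratio)."""
    assert out_fps % in_fps == 0, "sketch only handles integer upconversion"
    repeat = out_fps // in_fps
    return [frame for frame in frames for _ in range(repeat)]

frames_60 = ["f0", "f1", "f2"]
frames_120 = convert_frame_rate(frames_60, 60, 120)   # each frame doubled
```

Passing the input through unchanged, as the paragraph above notes, corresponds to the ratio 1 case.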

The formatter 360 may arrange the left eye image frame and the right eye image frame of the frame rate-converted 3D image. In addition, the synchronization signal Vsync for opening the left eye glass and the right eye glass of the 3D viewing apparatus 195 may be output.

The formatter 360 receives the mixed signal, i.e., the OSD signal and the decoded video signal, from the mixer 345, and separates the 2D video signal and the 3D video signal.

Meanwhile, in the present specification, a 3D video signal is meant to include a 3D object. Examples of the object include a picture-in-picture (PIP) image (still image or video), an EPG indicating broadcast program information, various menus, widgets, icons, text, objects in an image, persons, backgrounds, and web screens (newspaper, magazine, etc.).

The formatter 360 may change the format of the 3D video signal. For example, it may be changed to any one of various formats illustrated in FIG. 4. Accordingly, according to the format, as shown in FIG. 5, the operation of the viewing apparatus of the glasses type may be performed.

First, FIG. 5A illustrates the operation of the 3D glass 195, in particular the shutter glass 195, when the formatter 360 arranges and outputs the frame sequential format among the formats of FIG. 4.

That is, when the left eye image L is displayed on the display 180, the left eye glass of the shutter glass 195 is opened and the right eye glass is closed; when the right eye image R is displayed, the left eye glass is closed and the right eye glass is opened.

FIG. 5B illustrates the operation of the 3D glass 195, in particular the polarization glass 195, when the formatter 360 arranges and outputs the side-by-side format among the formats of FIG. 4. Meanwhile, the 3D glass 195 applied in FIG. 5B may be a shutter glass, which can operate as a polarization glass by keeping both the left eye glass and the right eye glass open.

Meanwhile, the formatter 360 may convert a 2D video signal into a 3D video signal. For example, according to a 3D image generation algorithm, an edge or a selectable object may be detected within the 2D image signal, and the object according to the detected edge or the selectable object may be separated out to generate a 3D image signal. At this time, the generated 3D image signal may be separated into a left eye image signal L and a right eye image signal R, as described above.
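The core idea of synthesizing a left eye and a right eye view from a single 2D image can be sketched as below. This is a toy, not the document's algorithm: the per-pixel depth map is invented for illustration, whereas a real 2D-to-3D converter would estimate depth from edges, occlusion, motion, and similar cues.

```python
# Hedged sketch of simple 2D-to-3D view synthesis on a single image row:
# each pixel is shifted horizontally by its disparity to form the left
# and right eye rows, and holes left by the shift are filled from the
# nearest previous pixel.

def synthesize_views(row, depth):
    """Shift each pixel by +/- its disparity to form left and right eye rows."""
    width = len(row)
    left = [None] * width
    right = [None] * width
    for x, (pixel, d) in enumerate(zip(row, depth)):
        lx, rx = x + d, x - d          # nearer pixels (larger d) separate more
        if 0 <= lx < width:
            left[lx] = pixel
        if 0 <= rx < width:
            right[rx] = pixel
    # fill holes left by the shift with the nearest previous pixel
    for view in (left, right):
        for x in range(width):
            if view[x] is None:
                view[x] = view[x - 1] if x > 0 else row[x]
    return left, right

row = [10, 20, 30, 40]
depth = [0, 0, 1, 1]                    # last two pixels "closer" to the viewer
L, R = synthesize_views(row, depth)
```

Pixels with zero disparity land at the same position in both views and so appear at screen depth, matching the discussion of FIGS. 7 and 8 below.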

Although not shown in the figure, a 3D processor (not shown) for processing a 3D effect signal may be further disposed after the formatter 360. The 3D processor (not shown) may process brightness, tint, and color adjustment of an image signal to improve 3D effects. For example, signal processing may be performed to sharpen the near distance and blur the far distance. Meanwhile, the functions of the 3D processor may be merged into the formatter 360 or merged into the image processor 320. This will be described later with reference to FIG. 6 and the like.

Meanwhile, the audio processing unit (not shown) in the control unit 170 can perform the audio processing of the demultiplexed audio signal. To this end, the audio processing unit (not shown) may include various decoders.

Also, the audio processor (not shown) in the controller 170 may process bass, treble, volume control, and the like.

The data processor (not shown) in the controller 170 may perform data processing of the demultiplexed data signal. For example, when the demultiplexed data signal is an encoded data signal, it may be decoded. The encoded data signal may be EPG (Electronic Program Guide) information including broadcast information such as a start time and an end time of a broadcast program broadcast on each channel.

In FIG. 3, the signals from the OSD generator 340 and the image processor 320 are mixed in the mixer 345 and then 3D-processed in the formatter 360, but the present invention is not limited thereto; the mixer may be located after the formatter. That is, the output of the image processor 320 may be 3D-processed by the formatter 360, the OSD generator 340 may perform 3D processing together with OSD generation, and the mixer 345 may then mix each processed 3D signal.

Meanwhile, a block diagram of the controller 170 shown in FIG. 3 is a block diagram for one embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specification of the controller 170 that is actually implemented.

In particular, the frame rate converter 350 and the formatter 360 may be provided separately rather than within the controller 170.

FIG. 6 is a diagram illustrating various scaling methods of a 3D video signal according to an embodiment of the present invention.

Referring to the drawings, in order to increase the 3D effect, the controller 170 may perform 3D effect signal processing. Among them, in particular, the size or tilt of the 3D object in the 3D image may be adjusted.

As shown in FIG. 6A, a 3D image signal or a 3D object 510 in the 3D image signal may be enlarged or reduced as a whole at a predetermined ratio (512), and as shown in FIGS. 6B and 6C, the 3D object may be partially enlarged or reduced (trapezoidal shapes 514 and 516). In addition, as illustrated in FIG. 6D, at least a part of the 3D object may be rotated (parallelogram shape 518). Through such scaling or tilting, the stereoscopic effect, that is, the 3D effect, of a 3D image or a 3D object in the 3D image may be emphasized.

Meanwhile, as the slope becomes larger, the length difference between the parallel sides of the trapezoidal shapes 514 and 516 increases, as shown in FIG. 6B or 6C, or the rotation angle becomes larger, as shown in FIG. 6D.

Meanwhile, the size adjustment or the tilt adjustment may be performed after the 3D video signal is arranged in a predetermined format by the formatter 360. Alternatively, it may be performed by the scaler 235 in the image processor 320. Meanwhile, the OSD generator 340 may also create objects in the shapes shown in FIG. 6 when generating the OSD, in order to emphasize the 3D effect.

Meanwhile, although not shown in the figure, as signal processing for the 3D effect, signal processing such as brightness, tint, and color adjustment may also be performed in addition to the size adjustment or tilt adjustment illustrated in FIG. 6. For example, signal processing may be performed to sharpen near objects and blur distant ones. Meanwhile, the signal processing for the 3D effect may be performed in the controller 170 or through a separate 3D processor. In particular, when performed in the controller 170, it may be performed in the formatter 360 together with the above-described size adjustment or tilt adjustment, or in the image processor 320.

FIG. 7 is a diagram illustrating an image formed by a left eye image and a right eye image, and FIG. 8 is a diagram illustrating a depth of a 3D image according to an interval between a left eye image and a right eye image.

First, referring to FIG. 7, a plurality of images or a plurality of objects 615, 625, 635, and 645 are illustrated.

First, the first object 615 includes a first left eye image 611 (L) based on a first left eye image signal and a first right eye image 613 (R) based on a first right eye image signal. The interval between the first left eye image 611 (L) and the first right eye image 613 (R) on the display 180 is illustrated as d1. At this time, the user recognizes that an image is formed at the intersection of an extension line connecting the left eye 601 and the first left eye image 611 and an extension line connecting the right eye 603 and the first right eye image 613. Accordingly, the user recognizes that the first object 615 is located behind the display 180.

Next, since the second object 625 includes a second left eye image 621 (L) and a second right eye image 623 (R) that overlap each other, the second object 625 is displayed on the display 180 itself. Accordingly, the user recognizes that the second object 625 is located on the display 180.

Next, the third object 635 and the fourth object 645 include a third left eye image 631 (L) and a third right eye image 633 (R), and a fourth left eye image 641 (L) and a fourth right eye image 643 (R), respectively, with intervals of d3 and d4.

According to the above-described method, the user recognizes the third object 635 and the fourth object 645 at the positions where the respective images are formed; in the drawing, each is located in front of the display 180.

At this time, the fourth object 645 is recognized as being in front of the third object 635, that is, as protruding more than the third object 635. This is because the interval d4 between the fourth left eye image 641 (L) and the fourth right eye image 643 (R) is larger than the interval d3 between the third left eye image 631 (L) and the third right eye image 633 (R).

Meanwhile, in the exemplary embodiment of the present invention, the distance between the display 180 and the objects 615, 625, 635, and 645 as recognized by the user is expressed as a depth. Accordingly, the depth when an object is recognized as being located behind the display 180 is assumed to have a negative value (-), and the depth when an object is recognized as being located in front of the display 180 is assumed to have a positive value (+). That is, the greater the degree of protrusion toward the user, the greater the magnitude of the depth.

Referring to FIG. 8, since the interval a between the left eye image 701 and the right eye image 702 of FIG. 8A is smaller than the interval b between the left eye image 701 and the right eye image 702 of FIG. 8B, the depth a' of the 3D object of FIG. 8A is smaller than the depth b' of the 3D object of FIG. 8B.

As such, when a 3D image is formed from a left eye image and a right eye image, the position at which the image is recognized as being formed from the user's point of view varies depending on the interval between the left eye image and the right eye image. Therefore, by adjusting the display interval of the left eye image and the right eye image, the depth of a 3D image or 3D object composed of the left eye image and the right eye image can be adjusted.
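The geometry described in FIGS. 7 and 8 can be worked through with a short sketch. The formula follows from similar triangles on the two sight lines; the eye separation and viewing distance values are illustrative assumptions, not taken from the document.

```python
# Sketch of perceived depth from on-screen disparity: with the eyes
# `eye_sep` apart and the display `screen_dist` away, the sight lines
# through the left eye image and the right eye image intersect at
# z = eye_sep * screen_dist / (eye_sep + disparity).
# Crossed disparity (left eye image to the right of the right eye image)
# is positive here and yields a point in front of the screen.

def perceived_distance(disparity, eye_sep=6.5, screen_dist=300.0):
    """Distance (same units as inputs) from the eyes to the perceived image."""
    return eye_sep * screen_dist / (eye_sep + disparity)

on_screen = perceived_distance(0.0)    # zero separation: image on the screen
in_front = perceived_distance(2.0)     # crossed: image in front of the screen
behind = perceived_distance(-2.0)      # uncrossed: image behind the screen
```

Increasing the (crossed) disparity brings the intersection closer to the viewer, which is exactly the relation between intervals a, b and depths a', b' in FIG. 8.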

FIG. 9 is a diagram illustrating a control method of the remote controller of FIG. 1.

As illustrated in FIG. 9A, a pointer 205 corresponding to the remote controller 200 is displayed on the display 180.

The user can move or rotate the remote control device 200 up and down, left and right (FIG. 9B), and back and forth (FIG. 9C). The pointer 205 displayed on the display 180 of the image display device corresponds to the movement of the remote controller 200. Because the pointer 205 is moved and displayed according to movement in 3D space, as shown in the figure, the remote control apparatus 200 may be referred to as a spatial remote controller.

FIG. 9B illustrates that when the user moves the remote control apparatus 200 to the left side, the pointer 205 displayed on the display 180 of the image display apparatus also moves to the left side correspondingly.

Information about the movement of the remote control device 200 detected through the sensor of the remote control device 200 is transmitted to the image display device. The image display device may calculate the coordinates of the pointer 205 from the information about the movement of the remote controller 200. The image display device may display the pointer 205 to correspond to the calculated coordinates.
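The coordinate calculation described above can be sketched as follows. This is a hypothetical illustration: the gain, the screen resolution, and the function name are invented, and a real implementation would integrate gyro and acceleration data rather than raw deltas.

```python
# Sketch of pointer coordinate calculation on the image display device:
# motion deltas reported by the remote control's sensors are scaled and
# accumulated into an on-screen position, clamped to the display bounds.

WIDTH, HEIGHT = 1920, 1080             # assumed display resolution

def update_pointer(pos, dx, dy, gain=10):
    """Move the pointer by sensed deltas, keeping it inside the screen."""
    x = min(max(pos[0] + dx * gain, 0), WIDTH - 1)
    y = min(max(pos[1] + dy * gain, 0), HEIGHT - 1)
    return (x, y)

pos = (960, 540)                       # start at screen center
pos = update_pointer(pos, -5, 0)       # remote moved left -> pointer moves left
pos = update_pointer(pos, 0, 200)      # large downward move is clamped
```

The clamping step is what keeps the displayed pointer on the display 180 no matter how far the remote is swung.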

FIG. 9C illustrates a case in which the user moves the remote control apparatus 200 away from the display 180 while pressing a specific button in the remote control apparatus 200. As a result, the selection area in the display 180 corresponding to the pointer 205 may be zoomed in and enlarged. On the contrary, when the user moves the remote controller 200 to be closer to the display 180, the selection area in the display 180 corresponding to the pointer 205 may be zoomed out and reduced. On the other hand, when the remote control device 200 moves away from the display 180, the selection area is zoomed out, and when the remote control device 200 approaches the display 180, the selection area may be zoomed in.

Meanwhile, while a specific button in the remote control device 200 is pressed, up, down, left, and right movements may not be recognized. That is, when the remote control device 200 moves away from or toward the display 180, up, down, left, and right movements are not recognized, and only back and forth movements are recognized. In a state where the specific button in the remote controller 200 is not pressed, only the pointer 205 moves according to the up, down, left, and right movements of the remote controller 200.

Meanwhile, the moving speed or the moving direction of the pointer 205 may correspond to the moving speed or the moving direction of the remote control apparatus 200.

FIG. 10 is an internal block diagram of the remote control device of FIG. 1.

Referring to the drawings, the remote control apparatus 200 may include a wireless communication unit 825, a user input unit 835, a sensor unit 840, an output unit 850, a power supply unit 860, a storage unit 870, and a controller 880.

The wireless communication unit 825 transmits and receives a signal with any one of the image display apparatus according to the embodiments of the present invention described above. Among the image display apparatuses according to the exemplary embodiments of the present invention, one image display apparatus 100 will be described as an example.

In the present embodiment, the remote control apparatus 200 may include an RF module 821 capable of transmitting and receiving signals with the image display apparatus 100 according to the RF communication standard. In addition, the remote control apparatus 200 may include an IR module 823 capable of transmitting and receiving a signal with the image display apparatus 100 according to the IR communication standard.

In the present embodiment, the remote control apparatus 200 transmits a signal containing information on the movement of the remote control apparatus 200 to the image display apparatus 100 through the RF module 821.

In addition, the remote control apparatus 200 may receive a signal transmitted from the image display apparatus 100 through the RF module 821. In addition, the remote control apparatus 200 may transmit a command regarding power on / off, channel change, volume change, etc. to the image display apparatus 100 through the IR module 823 as necessary.

The user input unit 835 may be configured as a keypad, a button, a touch pad, or a touch screen. The user may input a command related to the image display apparatus 100 to the remote control apparatus 200 by manipulating the user input unit 835. When the user input unit 835 includes a hard key button, the user may input a command related to the image display apparatus 100 to the remote control apparatus 200 by pushing a hard key button. When the user input unit 835 includes a touch screen, the user may input a command related to the image display apparatus 100 to the remote controller 200 by touching a soft key of the touch screen. In addition, the user input unit 835 may include various kinds of input means that the user can operate, such as a scroll key or a jog key, and the present embodiment does not limit the scope of the present invention.

The sensor unit 840 may include a gyro sensor 841 or an acceleration sensor 843. The gyro sensor 841 may sense information about the movement of the remote controller 200.

For example, the gyro sensor 841 may sense information about an operation of the remote controller 200 based on the x, y, and z axes. The acceleration sensor 843 may sense information about a moving speed of the remote controller 200. Meanwhile, a distance measuring sensor may be further provided, whereby the distance with the display 180 may be sensed.

The output unit 850 may output a video or audio signal corresponding to a manipulation of the user input unit 835 or corresponding to a signal transmitted from the image display apparatus 100. The user may recognize whether the user input unit 835 is manipulated or whether the image display apparatus 100 is controlled through the output unit 850.

For example, the output unit 850 may include an LED module 851 that lights when the user input unit 835 is manipulated or a signal is transmitted to or received from the image display device 100 through the wireless communication unit 825, a vibration module 853 generating vibration, a sound output module 855 outputting sound, or a display module 857 outputting an image.

The power supply unit 860 supplies power to the remote control device 200. The power supply unit 860 may reduce power waste by stopping the power supply when the remote controller 200 does not move for a predetermined time. The power supply unit 860 may resume power supply when a predetermined key provided in the remote control apparatus 200 is operated.

The storage unit 870 may store various types of programs, application data, and the like required for controlling or operating the remote control apparatus 200. When the remote control apparatus 200 wirelessly transmits and receives signals with the image display apparatus 100 through the RF module 821, the remote control apparatus 200 and the image display apparatus 100 transmit and receive signals through a predetermined frequency band. The controller 880 of the remote control device 200 may store information on the frequency band for wireless transmission and reception with the paired image display device 100 in the storage unit 870 and refer to it.

The controller 880 controls various items related to the control of the remote controller 200. The controller 880 transmits a signal corresponding to a predetermined key manipulation of the user input unit 835 or a signal corresponding to the movement of the remote controller 200 sensed by the sensor unit 840 through the wireless communication unit 825. 100 can be sent.

FIG. 11 is a diagram illustrating a 3D viewing apparatus and an image display apparatus according to an exemplary embodiment of the present invention, and FIG. 12 is an internal block diagram of the 3D viewing apparatus and the image display apparatus of FIG. 11.

Referring to FIGS. 11 and 12, the 3D viewing apparatus 195 according to an embodiment of the present invention may include a power supply unit 910, a switch 918, a controller 920, a wireless communication unit 930, a left eye glass 940, and a right eye glass 960.

The 3D viewing apparatus 195 of FIG. 11 may be a shutter glass or a polarizing glass.

For example, when the switch 918 of the 3D viewing apparatus 195 is turned on, operating power from the power supply unit 910 may be supplied to the controller 920 and the wireless communication unit 930.

According to the synchronization signal received from the image display apparatus 100, power from the power supply unit 910 may be alternately supplied to the left eye glass 940 and the right eye glass 960.

For example, the driving voltage VthL may be applied to the left eye glass 940 in the first period, and the driving voltage VthR may be applied to the right eye glass 960 in the second period. Accordingly, the left eye glass 940 and the right eye glass 960 may be alternately opened.

The controller 920 may control the left eye glass 940 and the right eye glass 960 of the 3D viewing apparatus 195 to open and close in synchronization with the left eye image frame and the right eye image frame displayed on the display 180 of the image display apparatus 100. In this case, the left eye glass 940 and the right eye glass 960 may be opened and closed in synchronization with the synchronization signal Vsync received from the wireless communication unit 198.

Meanwhile, when the image displayed on the image display apparatus 100 is a 3D image, the left eye glass 940 and the right eye glass 960 may be alternately opened and closed in synchronization with the corresponding synchronization signal Vsync.

On the other hand, if the image displayed on the image display apparatus 100 is a 2D image, the controller 920 opens or closes the left eye glass 940 and the right eye glass 960 together in synchronization with the corresponding synchronization signal Vsync. Can be controlled.
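The shutter policy described in the two paragraphs above can be sketched as a per-frame rule. The function and its even/odd convention are illustrative assumptions; in the actual apparatus the phase would be fixed by the Vsync signal from the wireless communication unit.

```python
# Sketch of the controller 920's shutter policy: for a 3D image the left
# and right eye glasses open alternately per displayed frame; for a 2D
# image both open together on every frame.

def shutter_state(frame_index, is_3d):
    """Return (left_open, right_open) for the given displayed frame."""
    if not is_3d:
        return (True, True)            # 2D: both eyes see every frame
    if frame_index % 2 == 0:
        return (True, False)           # even frames carry the left eye image
    return (False, True)               # odd frames carry the right eye image

states_3d = [shutter_state(i, True) for i in range(4)]
states_2d = [shutter_state(i, False) for i in range(2)]
```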

In addition, the controller 920 may control operations of the power supply unit 910 and the wireless communication unit 930. When the switch 918 is turned on, the controller 920 may control the power supply 910 to operate to supply power to each component.

Meanwhile, the controller 920 may receive a pairing signal from the image display apparatus 100 through the wireless communication unit 930 for pairing with the image display apparatus 100. In addition, the controller 920 may control the pairing response signal to be transmitted to the image display apparatus 100.

The wireless communication unit 930 may receive a pairing signal from the video display device 100 and transmit a pairing response signal to the video display device 100. In addition, the synchronization signal Vsync may be received from the image display apparatus 100.

As described above, the image display apparatus 100 may include a wireless communication unit 198, a controller 170, a display 180, and the like. Hereinafter, the operation with the 3D viewing apparatus 195 will be described.

In this case, the wireless communication unit 198 may be provided in the viewing device interface unit 160, the external device interface unit 130, or the network interface unit 135.

Meanwhile, the wireless communication unit 198 may transmit a pairing signal to the 3D viewing apparatus 195 and may receive a pairing response signal from the 3D viewing apparatus 195. The wireless communication unit 198 may transmit the synchronization signal Vsync to the 3D viewing apparatus 195 after pairing is completed.

Meanwhile, in the wireless communication between the image display apparatus 100 and the 3D viewing apparatus 195, various methods such as infrared communication, RF communication, and Bluetooth communication may be used.

FIG. 13 is a flowchart illustrating a method of operating an image display apparatus according to an exemplary embodiment of the present invention, and FIGS. 14 to 20 are diagrams for explaining various examples of the method of operating the image display apparatus of FIG. 13.

Referring to the drawings, a plurality of 3D image display modes are entered (S1310).

The plurality of 3D image display modes may be entered manually by a user's input. For example, when the remote controller 200 or a local key (not shown) is provided with a hot key for displaying a plurality of 3D images, the hot key may be used to enter the plurality of 3D image display modes. Alternatively, as another example, an object for the plurality of 3D image display modes may be selected to enter the modes while a menu is displayed on the display 180.

In this case, the plurality of 3D image display modes are modes in which each of a plurality of users views a different image. For example, a first user wearing the first viewing device 195a can view only a first 3D image among the plurality of images, and a second user wearing the second viewing device 195b can view only a second 3D image among the plurality of images.

Meanwhile, the plurality of 3D image display modes may also be entered automatically. For example, when a plurality of users each wear the 3D viewing apparatuses 195a and 195b, the image display apparatus 100 may detect this and automatically enter the plurality of 3D image display modes. The wearing of the plurality of viewing apparatuses may be detected through a photographing unit (not shown).

Next, the first 3D image is arranged into a first left eye image and a first right eye image (S1320), and the second 3D image is arranged into a second left eye image and a second right eye image (S1330). The first 3D image and the second 3D image are displayed on the display at different times (S1340).

When entering the plurality of 3D image display modes, the controller 170 may receive a plurality of 3D images. Alternatively, when the controller 170 enters a plurality of 3D image display modes, the controller 170 may receive a plurality of 2D images and convert the plurality of 2D images into a plurality of 3D images. The operation of converting the 2D image into the 3D image may be performed by the formatter 360 as described above.

The formatter 360 arranges a first 3D image among the plurality of input 3D images into a first left eye image and a first right eye image, and arranges a second 3D image among the plurality of input 3D images into a second left eye image and a second right eye image.

As illustrated in FIG. 14, using the first left eye image 1405 of the first 3D image 1410, the formatter 360 arranges a first left eye image 1442 disposed only in the left region according to the side-by-side format, and using the first right eye image 1415 of the first 3D image 1410, arranges a first right eye image 1446 disposed only in the left region according to the side-by-side format.

In addition, as shown in FIG. 14, using the second left eye image 1425 of the second 3D image 1430, the formatter 360 arranges a second left eye image 1444 disposed only in the right region according to the side-by-side format, and using the second right eye image 1435 of the second 3D image 1430, arranges a second right eye image 1484 disposed only in the right region according to the side-by-side format.

As illustrated in FIG. 14, the formatter 360 may arrange the images in the order of the first left eye image 1442, the second left eye image 1444, the first right eye image 1446, and the second right eye image 1484.

Accordingly, the display 180 displays the first left eye image 1442 at a first time t1, the second left eye image 1444 at a second time t2, the first right eye image 1446 at a third time t3, and the second right eye image 1484 at a fourth time t4.

That is, the first 3D image 1410 and the second 3D image 1430 are sequentially displayed at different times.
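The time-multiplexed schedule above can be sketched compactly. The slot labels and function name are illustrative; the point is that each viewing device is turned on only during its own user's slots, as described for t1 through t4.

```python
# Sketch of the FIG. 14 schedule: four display slots t1..t4 carry the
# first left, second left, first right, and second right eye images, and
# the first viewing device opens on t1/t3 while the second opens on t2/t4.

SCHEDULE = ["L1", "L2", "R1", "R2"]    # images shown at t1, t2, t3, t4

def device_on(slot_index, device):
    """First device opens on even slots (t1, t3); second on odd slots (t2, t4)."""
    return slot_index % 2 == (0 if device == "first" else 1)

seen_by_first = [img for i, img in enumerate(SCHEDULE) if device_on(i, "first")]
seen_by_second = [img for i, img in enumerate(SCHEDULE) if device_on(i, "second")]
```

Each user thus receives exactly one left eye and one right eye image, which combine into that user's own 3D image.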

Meanwhile, on the display 180, a film having a left eye polarization pattern and a right eye polarization pattern corresponding to the side by side format may be disposed. That is, the left eye polarization pattern may be disposed in the left region of the display 180, and the right eye polarization pattern may be disposed in the right region.

Meanwhile, the first viewing device 195a is turned on at the first time t1 and the third time t3, and the second viewing device 195b is turned on at the second time t2 and the fourth time t4.

Accordingly, a user wearing the first viewing device 195a watches the first left eye image 1442 and the first right eye image 1446, and through their combination can view the first 3D image 1410.

On the other hand, a user wearing the second viewing device 195b may watch the second left eye image 1444 and the second right eye image 1448, and, by the combination thereof, may watch the second 3D image 1430.

Meanwhile, since the turn-on periods of the viewing devices are alternated, the user wearing the first viewing device 195a may watch only the first 3D image 1410, and the user wearing the second viewing device 195b may watch only the second 3D image 1430.

Meanwhile, in the first viewing apparatus 195a, the left eye polarization pattern may be formed or activated in both the left eye glass and the right eye glass so that only the left region of the display 180 can be viewed. When the left eye polarization pattern is formed, the synchronization signal from the image display apparatus 100 is not required, but when the left eye polarization pattern is activated, the synchronization signal from the image display apparatus 100 is required. By the synchronization signal, the left eye glass and the right eye glass of the first viewing device 195a may be turned on to activate the left eye polarization pattern.

On the other hand, in the second viewing apparatus 195b, the right eye polarization pattern may be formed or activated in both the left eye glass and the right eye glass so that only the right region of the display 180 can be viewed. When the right eye polarization pattern is formed, the synchronization signal from the image display apparatus 100 is not required, but when the right eye polarization pattern is activated, the synchronization signal from the image display apparatus 100 is required. By the synchronization signal, the left eye glass and the right eye glass of the second viewing device 195b may be turned on to activate the right eye polarization pattern.

FIG. 15 is similar to FIG. 14 except that the order of images arranged by the formatter 360 is different.

That is, as shown in FIG. 15, the formatter 360 may arrange the images in the order of the first left eye image 1442, the first right eye image 1446, the second left eye image 1444, and the second right eye image 1448.

Accordingly, the display 180 displays the first left eye image 1442 at the first time t1, displays the first right eye image 1446 at the second time t2, displays the second left eye image 1444 at the third time t3, and displays the second right eye image 1448 at the fourth time t4.

That is, the first 3D image 1410 and the second 3D image 1430 are sequentially displayed at different times.

Accordingly, the first viewing device 195a is turned on at the first time t1 and the second time t2, and the second viewing device 195b is turned on at the third time t3 and the fourth time t4.

Accordingly, a user wearing the first viewing device 195a may watch the first left eye image 1442 and the first right eye image 1446, and, by the combination thereof, may watch the first 3D image 1410.

On the other hand, a user wearing the second viewing device 195b may watch the second left eye image 1444 and the second right eye image 1448, and, by the combination thereof, may watch the second 3D image 1430.

Meanwhile, since the turn-on periods of the viewing devices are alternated, the user wearing the first viewing device 195a may watch only the first 3D image 1410, and the user wearing the second viewing device 195b may watch only the second 3D image 1430.

FIG. 16 illustrates that a first 3D image 1610 including a first 3D object 1615, which can be viewed only by a user wearing the first viewing device 195a, is displayed on the display at a first time, and that a second 3D image 1620 including a second 3D object 1625, which can be viewed only by a user wearing the second viewing device 195b, is displayed on the display at a second time.

Accordingly, a plurality of 3D images can be viewed using one display 180, so that user convenience can be increased.

Next, FIGS. 17 to 18 illustrate another example of the arrangement of the first 3D image and the second 3D image.

First, referring to FIG. 17, the formatter 360 may arrange, using the first left eye image 1705 of the first 3D image 1710, the first left eye image 1742 so as to be disposed only on odd lines according to an interlace format, and may arrange, using the first right eye image 1715 of the first 3D image 1710, the first right eye image 1746 so as to be disposed only on odd lines according to the interlace format.

In addition, as shown in FIG. 17, the formatter 360 may arrange, using the second left eye image 1725 of the second 3D image 1730, the second left eye image 1744 so as to be disposed only on even lines according to the interlace format, and may arrange, using the second right eye image 1735 of the second 3D image 1730, the second right eye image 1748 so as to be disposed only on even lines according to the interlace format.

As illustrated in FIG. 17, the formatter 360 may arrange the images in the order of the first left eye image 1742, the second left eye image 1744, the first right eye image 1746, and the second right eye image 1748.

Accordingly, the display 180 may display the first left eye image 1742 at the first time t1, display the second left eye image 1744 at the second time t2, display the first right eye image 1746 at the third time t3, and display the second right eye image 1748 at the fourth time t4.

That is, the first 3D image 1710 and the second 3D image 1730 are sequentially displayed at different times.

Meanwhile, on the display 180, a film having a left eye polarization pattern and a right eye polarization pattern corresponding to the interlace format may be disposed. That is, the left eye polarization pattern may be disposed on the odd line of the display 180, and the right eye polarization pattern may be disposed on the even line.
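A minimal sketch of this interlaced arrangement may help: each image occupies only the odd or the even lines of the frame, with the remaining lines left blank. The helper name and row labels below are hypothetical, and the list indices are 0-based while the description counts display lines from 1.

```python
# Hypothetical sketch: arrange an image so it occupies only odd (or even)
# display lines, as in the interlace format of FIG. 17. 1-based odd lines
# correspond to 0-based indices 0, 2, 4, ...
def arrange_interlaced(img_rows, height, lines="odd"):
    """Return a frame of `height` rows with the image on odd or even lines
    and blank rows elsewhere."""
    frame = ["blank"] * height
    start = 0 if lines == "odd" else 1
    frame[start::2] = img_rows  # slice assignment fills every other line
    return frame

# First 3D image -> odd lines; second 3D image -> even lines.
print(arrange_interlaced(["1L-a", "1L-b"], 4, "odd"))   # ['1L-a', 'blank', '1L-b', 'blank']
print(arrange_interlaced(["2L-a", "2L-b"], 4, "even"))  # ['blank', '2L-a', 'blank', '2L-b']
```

The polarization film then separates the odd-line content toward the first viewing device and the even-line content toward the second viewing device.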

Meanwhile, the first viewing device 195a is turned on at the first time t1 and the third time t3, and the second viewing device 195b is turned on at the second time t2 and the fourth time t4.

Accordingly, a user wearing the first viewing device 195a may watch the first left eye image 1742 and the first right eye image 1746, and, by the combination thereof, may watch the first 3D image 1710.

Meanwhile, a user wearing the second viewing device 195b may watch the second left eye image 1744 and the second right eye image 1748, and, by the combination thereof, may watch the second 3D image 1730.

Meanwhile, in the first viewing apparatus 195a, the left eye polarization pattern may be formed or activated in both the left eye glass and the right eye glass so that only the odd lines of the display 180 can be viewed.

On the other hand, in the second viewing apparatus 195b, the right eye polarization pattern may be formed or activated in both the left eye glass and the right eye glass so that only the even lines of the display 180 can be viewed.

FIG. 18 is similar to FIG. 17 except that the order of images arranged by the formatter 360 is different.

That is, as shown in FIG. 18, the formatter 360 may arrange the images in the order of the first left eye image 1742, the first right eye image 1746, the second left eye image 1744, and the second right eye image 1748.

Accordingly, the display 180 displays the first left eye image 1742 at the first time t1, displays the first right eye image 1746 at the second time t2, displays the second left eye image 1744 at the third time t3, and displays the second right eye image 1748 at the fourth time t4.

The first viewing device 195a is turned on at the first time t1 and the second time t2, and the second viewing device 195b is turned on at the third time t3 and the fourth time t4.

Next, FIGS. 19 to 20 illustrate another example of the arrangement of the first 3D image and the second 3D image.

First, referring to FIG. 19, the formatter 360 may alternately arrange the first left eye image 1905 and the first right eye image 1915 of the first 3D image 1910 according to the frame sequential format.

In addition, as illustrated in FIG. 19, the formatter 360 may alternately arrange the second left eye image 1925 and the second right eye image 1935 of the second 3D image 1930 according to the frame sequential format.

As illustrated in FIG. 19, the formatter 360 may arrange the images in the order of the first left eye image 1942, the second left eye image 1944, the first right eye image 1946, and the second right eye image 1948.

Accordingly, the display 180 may display the first left eye image 1942 at the first time t1, display the second left eye image 1944 at the second time t2, display the first right eye image 1946 at the third time t3, and display the second right eye image 1948 at the fourth time t4.

On the other hand, in the first viewing device 195a, the left eye glass is turned on at the first time t1, and the right eye glass is turned on at the third time t3. On the other hand, in the second viewing device 195b, the left eye glass is turned on at the second time t2, and the right eye glass is turned on at the fourth time t4.
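The per-eye shutter timing described above can be sketched as a small lookup table. The identifiers (`FRAMES`, `OPEN`, the device labels) are hypothetical; the table merely restates the timing given in the text.

```python
# Hypothetical sketch of the FIG. 19 frame sequential timing: each glass of
# each viewing device opens individually during one of the four time slots.
FRAMES = {1: "1L", 2: "2L", 3: "1R", 4: "2R"}  # frame shown at t1..t4
OPEN = {  # (device, eye) -> time slot at which that glass is open
    ("195a", "left"): 1, ("195a", "right"): 3,
    ("195b", "left"): 2, ("195b", "right"): 4,
}

def image_for(device, eye):
    """Return the frame that reaches the given eye of the given device."""
    return FRAMES[OPEN[(device, eye)]]

# Device 195a combines 1L + 1R into the first 3D image;
# device 195b combines 2L + 2R into the second 3D image.
print(image_for("195a", "left"), image_for("195a", "right"))  # 1L 1R
print(image_for("195b", "left"), image_for("195b", "right"))  # 2L 2R
```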

Accordingly, a user wearing the first viewing device 195a may watch the first left eye image 1942 and the first right eye image 1946, and, by the combination thereof, may watch the first 3D image 1910.

On the other hand, a user wearing the second viewing device 195b may watch the second left eye image 1944 and the second right eye image 1948, and, by the combination thereof, may watch the second 3D image 1930.

FIG. 20 is similar to FIG. 19 except that the order of images arranged by the formatter 360 is different.

That is, as shown in FIG. 20, the formatter 360 may arrange the images in the order of the first left eye image 1942, the first right eye image 1946, the second left eye image 1944, and the second right eye image 1948.

Accordingly, the display 180 displays the first left eye image 1942 at the first time t1, displays the first right eye image 1946 at the second time t2, displays the second left eye image 1944 at the third time t3, and displays the second right eye image 1948 at the fourth time t4.

On the other hand, in the first viewing device 195a, the left eye glass is turned on at the first time t1, and the right eye glass is turned on at the second time t2. On the other hand, in the second viewing device 195b, the left eye glass is turned on at the third time t3, and the right eye glass is turned on at the fourth time t4.

FIG. 21 is a flowchart illustrating a method of operating an image display apparatus according to another exemplary embodiment, and FIGS. 22 to 28 are views referred to for describing various examples of the method of operating the image display apparatus of FIG. 21.

According to another embodiment of the present invention, in a multiple image display mode including a 3D image, a 3D image and a 2D image may be respectively displayed.

Referring to the drawings, the multiple image display mode is entered (S2110). Entering the multiple image display mode may be performed by selecting a multiple image display mode object through a hot key of a remote control device or the like, or from a menu displayed on the display 180.

The multiple image display mode is a mode in which a different image is viewed by each of a plurality of users. For example, a first user wearing the first viewing device 195a may view only the 3D image among the plurality of images, and a second user wearing the second viewing device 195b may view only the 2D image among the plurality of images.

Next, the 3D image is arranged into a first left eye image and a first right eye image (S2120), and the 2D image is arranged according to the mode (S2130). Then, the 3D image and the 2D image are displayed on the display at different times (S2140).

That is, the formatter 360 arranges the 3D image into a left eye image and a right eye image, and also arranges the 2D image.

As illustrated in FIG. 22, the formatter 360 may arrange, using the left eye image 2105 of the 3D image 2110, the left eye image 2142 so as to be disposed only in the left region according to the side by side format, and may arrange, using the right eye image 2115 of the 3D image 2110, the right eye image 2144 so as to be disposed only in the left region according to the side by side format.

In this case, as shown in FIG. 22, the 2D image 2130 may be arranged between the left eye image 2142 and the right eye image 2144.

Accordingly, the display 180 may display the left eye image 2142 at the first time tx, display the 2D image 2130 at the second time ty, and display the right eye image 2144 at the third time tz.

On the other hand, the first viewing device 195a is turned on at the first time tx and the third time tz, and the second viewing device 195b is turned on at the second time ty.

Accordingly, the user wearing the first viewing device 195a can watch the left eye image 2142 and the right eye image 2144, and the combination thereof enables viewing of the 3D image 2110.

Meanwhile, the user wearing the second viewing device 195b may watch the 2D image 2130.

Meanwhile, since the turn-on periods of the viewing devices are alternated, the user wearing the first viewing device 195a may watch only the 3D image 2110, and the user wearing the second viewing device 195b may watch only the 2D image 2130.
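The three-slot schedule above can be sketched as follows; the slot labels and shutter sets are hypothetical illustrations of the timing described for FIG. 22, not identifiers from the patent.

```python
# Hypothetical sketch: a 3D image (left eye, right eye) and a 2D image
# share one display in time. Device 195a opens for the 3D slots, device
# 195b only for the 2D slot.
SLOTS = ["3D-left", "2D", "3D-right"]    # shown at tx, ty, tz
SHUTTER = {"195a": {0, 2}, "195b": {1}}  # open slots per viewing device

def visible(device):
    """Return the slots visible through the given viewing device."""
    return [s for i, s in enumerate(SLOTS) if i in SHUTTER[device]]

print(visible("195a"))  # ['3D-left', '3D-right'] -> perceived as the 3D image
print(visible("195b"))  # ['2D'] -> only the 2D image
```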

Meanwhile, referring to FIG. 22, the area of the region in which the 3D image 2110, specifically the left eye image 2142 and the right eye image 2144, is displayed on the display 180 may be smaller than the area of the region in which the 2D image 2130 is displayed on the display. That is, the display area of the 2D image 2130 may be twice the display area of the 3D image 2110. When the 3D image 2110 is displayed by the polarization method, the 3D image 2110 is displayed on only a portion of the display 180, which enables viewing without crosstalk or the like when viewing the 3D image.
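The 2:1 area relationship follows directly from the side by side layout, since the 3D image is confined to one half of the panel. A minimal check, using hypothetical panel dimensions:

```python
# Hypothetical panel dimensions; only the 2:1 ratio matters.
panel_w, panel_h = 1920, 1080

area_2d = panel_w * panel_h          # the 2D image uses the full panel
area_3d = (panel_w // 2) * panel_h   # the 3D image is confined to the left region

assert area_2d == 2 * area_3d
print(area_2d // area_3d)  # 2
```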

FIG. 23 is similar to FIG. 22 except that the order of images arranged by the formatter 360 is different. The difference is that the 2D image 2130 is arranged and displayed after the left eye image 2142 and the right eye image 2144. As a result, the first viewing device 195a is turned on at the first time tx and the second time ty, and the second viewing device 195b is turned on at the third time tz.

FIG. 24 illustrates that a 3D image 2410 including a 3D object 2417, which can be viewed only by a user wearing the first viewing device 195a, is displayed on the display at a first time, and that a 2D image 2420, which can be viewed only by a user wearing the second viewing device 195b, is displayed on the display 180 at a second time.

Accordingly, 3D video and 2D video can be viewed using one display 180, so that user convenience can be increased.

Next, FIG. 25 illustrates that the left eye image 2505 and the right eye image 2515 of the 3D image 2510 are arranged in the interlace format, and that the 2D image 2530 is arranged between the left eye image 2542 and the right eye image 2544.

As a result, the first viewing device 195a is turned on at the first time tx and the third time tz, and the second viewing device 195b is turned on at the second time ty.

Next, FIG. 26 is similar to FIG. 25 except that the order of images arranged by the formatter 360 is different. The difference is that the 2D image 2530 is arranged and displayed after the left eye image 2542 and the right eye image 2544. As a result, the first viewing device 195a is turned on at the first time tx and the second time ty, and the second viewing device 195b is turned on at the third time tz.

Next, FIG. 27 illustrates that the left eye image 2705 and the right eye image 2715 of the 3D image 2710 are arranged in the frame sequential format, and that the 2D image 2730 is arranged between the left eye image 2742 and the right eye image 2744.

Accordingly, in the first viewing device 195a, the left eye glass is turned on at the first time tx and the right eye glass is turned on at the third time tz, while in the second viewing device 195b, both the left eye glass and the right eye glass are turned on at the second time ty.

Meanwhile, referring to FIG. 27, the area of the region in which the 3D image 2710, specifically the left eye image 2705 and the right eye image 2715, is displayed on the display 180 may be equal to the area of the region in which the 2D image 2730 is displayed. In the case of displaying the 3D image 2710 by the frame sequential method, the left eye image 2705 and the right eye image 2715 of the 3D image 2710, and the 2D image 2730, are displayed at different times, which enables viewing without crosstalk or the like when viewing the 3D image.

Next, FIG. 28 is similar to FIG. 27 except that the order of images arranged by the formatter 360 is different. The difference is that the 2D image 2730 is arranged and displayed after the left eye image 2705 and the right eye image 2715. As a result, in the first viewing device 195a, the left eye glass is turned on at the first time tx and the right eye glass is turned on at the second time ty, while in the second viewing device 195b, both the left eye glass and the right eye glass are turned on at the third time tz.

The image display apparatus and the operation method thereof according to the present invention are not limited to the configuration and method of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operation method of the image display apparatus of the present invention can be implemented as processor-readable code on a recording medium readable by a processor included in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices in which data that can be read by the processor is stored. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and the method may also be implemented in the form of a carrier wave such as transmission over the Internet. The processor-readable recording medium can also be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments; on the contrary, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention.

Claims (20)

Entering a multiple 3D image display mode;
Arranging a first 3D image into a first left eye image and a first right eye image according to the mode;
Arranging a second 3D image into a second left eye image and a second right eye image according to the mode; And
And displaying the first 3D image and the second 3D image on a display at different times.
The method of claim 1,
Wherein a user wearing a first viewing device can watch only the first 3D image, and a user wearing a second viewing device can watch only the second 3D image.
The method of claim 1,
The displaying of the first 3D image and the second 3D image may include:
And displaying the first left eye image, the second left eye image, the first right eye image, and the second right eye image in that order.
The method of claim 1,
The displaying of the first 3D image and the second 3D image may include:
And displaying the first left eye image, the first right eye image, the second left eye image, and the second right eye image in that order.
The method of claim 1,
The first left eye image and the first right eye image of the first 3D image are displayed in a first area of the display at different times,
And the second left eye image and the second right eye image of the second 3D image are displayed at different times in a second region different from the first region of the display.
The method of claim 1,
The first left eye image and the first right eye image of the first 3D image are displayed on a first area of the display,
And the second left eye image and the second right eye image of the second 3D image are displayed on the first area of the display.
Entering a multiple image display mode including a 3D image;
Arranging the 3D image into a first left eye image and a first right eye image according to the mode;
Arranging a 2D image according to the mode; And
And displaying the 3D image and the 2D image on a display at different times.
The method of claim 7, wherein
Wherein a user wearing a first viewing device can watch only the 3D image, and a user wearing a second viewing device can watch only the 2D image.
The method of claim 7, wherein
The 3D image and 2D image display step,
And displaying the first left eye image, the 2D image, and the first right eye image in that order.
The method of claim 7, wherein
The 3D image and 2D image display step,
And displaying the first left eye image, the first right eye image, and the 2D image in that order.
The method of claim 7, wherein
Wherein an area of a region where the 3D image is displayed on the display is smaller than an area of a region where the 2D image is displayed on the display.
The method of claim 7, wherein
Wherein an area of a region where the 3D image is displayed on the display is equal to an area of a region where the 2D image is displayed on the display.
A formatter for arranging a first 3D image into a first left eye image and a first right eye image and arranging a second 3D image into a second left eye image and a second right eye image in a plurality of 3D image display modes; And
And a display configured to display the first 3D image and the second 3D image at different times.
The image display apparatus of claim 13,
A first viewing device capable of viewing only the first 3D image; And
And a second viewing device capable of viewing only the second 3D image.
The image display apparatus of claim 13,
The display,
Displays the first left eye image and the first right eye image of the first 3D image in a first region at different times;
And the second left eye image and the second right eye image of the second 3D image are displayed at different times in the second region different from the first region.
The image display apparatus of claim 13,
The display,
Displays the first left eye image and the first right eye image of the first 3D image in a first region at different times;
And displaying the second left eye image and the second right eye image of the second 3D image in the first region at different times.
A formatter for arranging a 3D image into a first left eye image and a first right eye image and arranging a 2D image in a multiple image display mode including a 3D image; And
And a display for displaying the 3D image and the 2D image at different times.
The image display apparatus of claim 17,
A first viewing device capable of viewing only the 3D image; And
And a second viewing device capable of viewing only the 2D image.
The image display apparatus of claim 17,
Wherein an area of a region where the 3D image is displayed on the display is smaller than an area of a region where the 2D image is displayed on the display.
The image display apparatus of claim 17,
Wherein an area of a region where the 3D image is displayed on the display is equal to an area of a region where the 2D image is displayed on the display.
KR1020110094191A 2011-09-19 2011-09-19 Image display apparatus, and method for operating the same KR20130030601A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110094191A KR20130030601A (en) 2011-09-19 2011-09-19 Image display apparatus, and method for operating the same


Publications (1)

Publication Number Publication Date
KR20130030601A true KR20130030601A (en) 2013-03-27

Family

ID=48180125

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110094191A KR20130030601A (en) 2011-09-19 2011-09-19 Image display apparatus, and method for operating the same

Country Status (1)

Country Link
KR (1) KR20130030601A (en)


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination