KR20110121397A - Apparatus for displaying image and method for operating the same

Apparatus for displaying image and method for operating the same

Info

Publication number
KR20110121397A
Authority
KR
South Korea
Prior art keywords
image frame
eye image
right eye
left eye
signal
Prior art date
Application number
KR1020100040977A
Other languages
Korean (ko)
Inventor
박상백
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020100040977A
Publication of KR20110121397A

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/22Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources
    • G09G3/28Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using luminous gas-discharge panels, e.g. plasma panels
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/0209Crosstalk reduction, i.e. to reduce direct or indirect influences of signals directed to a certain pixel of the displayed image on other pixels of said image, inclusive of influences affecting pixels in different frames or fields or sub-images which constitute a same image, e.g. left and right images of a stereoscopic display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00Details of stereoscopic systems
    • H04N2213/008Aspects relating to glasses for viewing stereoscopic images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Plasma & Fusion (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

PURPOSE: A video display apparatus and an operating method thereof are provided to enable a user to watch 3D video effectively by preventing the crosstalk phenomenon. CONSTITUTION: A unit frame of a display module is divided into a left-eye image frame and a right-eye image frame (S710). The shutter glasses include a left-eye glass corresponding to the left-eye image frame and a right-eye glass corresponding to the right-eye image frame. The opening time of the left-eye glass differs from that of the right-eye glass, and at least one of them is opened a predetermined time after the start time of the corresponding left-eye or right-eye image frame.

Description

Apparatus for displaying image and method for operating the same

The present invention relates to a method of operating an image display device, and more particularly, to an image display device and an operation method thereof capable of reducing crosstalk in displaying a 3D image.

An image display device is a device having a function of displaying an image that a user can watch. The user can watch broadcasts through the image display device. The image display device displays, on a display, a broadcast selected by the user from among the broadcast signals transmitted from broadcast stations. Currently, broadcasting is shifting from analog broadcasting to digital broadcasting worldwide.

Digital broadcasting refers to broadcasting for transmitting digital video and audio signals. Digital broadcasting is more resistant to external noise than analog broadcasting, so it has less data loss, is advantageous for error correction, has a higher resolution, and provides a clearer picture. In addition, unlike analog broadcasting, digital broadcasting is capable of bidirectional services.

Recently, various studies on stereoscopic images have been conducted, and stereoscopic imaging techniques are becoming more and more common and practical in computer graphics as well as in various other environments and technologies.

SUMMARY OF THE INVENTION An object of the present invention is to provide an image display apparatus and an operating method thereof capable of reducing crosstalk in 3D image display.

According to an aspect of the present invention, an image display apparatus includes a display module that is driven by dividing a unit frame into a left eye image frame and a right eye image frame, and a shutter glass having a left eye glass that is opened corresponding to the left eye image frame and a right eye glass that is opened corresponding to the right eye image frame, wherein at least one of the left eye glass and the right eye glass is opened a predetermined time later than the start time of the corresponding image frame.

According to another aspect of the present invention, a method of operating an image display apparatus includes rearranging an input image to generate a left eye image frame and a right eye image frame, driving the display module according to the generated frames, and opening at least one of a left eye glass and a right eye glass of a shutter glass a predetermined time after the start time of the corresponding left eye image frame or right eye image frame.

According to the present invention, the crosstalk phenomenon can be prevented, and 3D video can be viewed more accurately and easily.

FIG. 1 is an internal block diagram of an image display apparatus according to an embodiment of the present invention.
FIG. 2 is an internal block diagram of the controller of FIG. 1.
FIG. 3 is a diagram illustrating an example of a 3D video signal format capable of implementing 3D video.
FIG. 4 is a diagram illustrating an operation of a shutter glass according to a frame sequential format.
FIG. 5 is an internal block diagram of a shutter glass according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating an example of a signal for a 3D image viewing apparatus.
FIG. 7 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention.
FIG. 8 is a perspective view illustrating an embodiment of a structure of a plasma display panel.
FIG. 9 is a diagram illustrating an embodiment of an electrode arrangement of a plasma display panel.
FIG. 10 is a timing diagram illustrating an embodiment of a method of time-divisionally driving a plasma display panel by dividing one frame into a plurality of subfields.
FIG. 11 is a timing diagram illustrating an example of waveforms of driving signals for driving a plasma display panel.
FIGS. 12 to 16 are diagrams for describing an operating method of an image display apparatus according to the exemplary embodiment of the present invention of FIG. 7.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffixes "module" and "unit" for components used in the following description are merely given in consideration of ease of preparation of the present specification, and do not impart any particular meaning or role by themselves. Therefore, the "module" and "unit" may be used interchangeably.

FIGS. 1 and 2 are internal block diagrams of an image display apparatus according to an embodiment of the present invention.

Referring to FIG. 1, the image display apparatus 100 according to an exemplary embodiment of the present invention may include a tuner 110, a demodulator 120, an external device interface unit 130, a network interface unit 135, a storage unit 140, a user input interface unit 150, a controller 170, a display module 180, an audio output unit 185, a power supply unit 190, and a 3D image viewing apparatus, for example, a shutter glass 195.

The tuner 110 selects the RF broadcast signal corresponding to a channel selected by the user, or the RF broadcast signals of all pre-stored channels, from among the RF (Radio Frequency) broadcast signals received through an antenna. The tuner 110 also converts the selected RF broadcast signal into an intermediate frequency signal or a baseband video or audio signal.

For example, if the selected RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF). If the selected RF broadcast signal is an analog broadcast signal, it is converted into an analog baseband image or voice signal (CVBS / SIF). That is, the tuner 110 may process a digital broadcast signal or an analog broadcast signal. The analog baseband video or audio signal CVBS / SIF output from the tuner 110 may be directly input to the controller 170.

Also, the tuner 110 can receive single-carrier RF broadcast signals according to the Advanced Television Systems Committee (ATSC) scheme or multi-carrier RF broadcast signals according to the Digital Video Broadcasting (DVB) scheme.

Meanwhile, the tuner 110 may sequentially select the RF broadcast signals of all broadcast channels stored through a channel memory function from among the RF broadcast signals received through the antenna, and convert them into intermediate frequency signals or baseband video or audio signals.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

For example, when the digital IF signal output from the tuner 110 follows the ATSC scheme, the demodulator 120 performs 8-VSB (8-Vestigial Side Band) demodulation. The demodulator 120 may also perform channel decoding. To this end, the demodulator 120 may include a trellis decoder, a de-interleaver, and a Reed-Solomon decoder to perform trellis decoding, de-interleaving, and Reed-Solomon decoding.

For example, when the digital IF signal output from the tuner 110 follows the DVB scheme, the demodulator 120 performs COFDM (Coded Orthogonal Frequency Division Multiplexing) demodulation. The demodulator 120 may also perform channel decoding. To this end, the demodulator 120 may include a convolutional decoder, a de-interleaver, a Reed-Solomon decoder, and the like to perform convolutional decoding, de-interleaving, and Reed-Solomon decoding.

The demodulator 120 may perform demodulation and channel decoding and then output a stream signal TS. In this case, the stream signal may be a signal in which a video signal, an audio signal, or a data signal is multiplexed. For example, the stream signal may be an MPEG-2 TS (Transport Stream) in which an MPEG-2 standard video signal, a Dolby AC-3 standard audio signal, and the like are multiplexed. Specifically, an MPEG-2 TS packet may include a 4-byte header and a 184-byte payload.
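
As a concrete illustration of the packet structure just described, the following sketch (not taken from the patent; the function name is an assumption) parses the 4-byte header of a 188-byte MPEG-2 TS packet and returns the 13-bit PID together with the 184-byte payload.

```python
# Illustrative sketch: parsing the 4-byte header of a 188-byte MPEG-2 TS packet
# (4-byte header + 184-byte payload) to recover the PID and payload.
def parse_ts_header(packet: bytes) -> dict:
    assert len(packet) == 188, "an MPEG-2 TS packet is 188 bytes (4 header + 184 payload)"
    assert packet[0] == 0x47, "every TS packet starts with the sync byte 0x47"
    b1, b2, b3 = packet[1], packet[2], packet[3]
    return {
        "transport_error": bool(b1 & 0x80),
        "payload_unit_start": bool(b1 & 0x40),
        "pid": ((b1 & 0x1F) << 8) | b2,          # 13-bit packet identifier
        "scrambling_control": (b3 >> 6) & 0x03,
        "continuity_counter": b3 & 0x0F,
        "payload": packet[4:],                    # 184-byte payload
    }

# Example: a null packet (PID 0x1FFF) with an empty payload.
null_packet = bytes([0x47, 0x1F, 0xFF, 0x10]) + bytes(184)
print(parse_ts_header(null_packet)["pid"])  # 8191
```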

On the other hand, the demodulator 120 described above can be provided separately according to the ATSC system and the DVB system. That is, it can be provided as an ATSC demodulation unit and a DVB demodulation unit.

The stream signal output from the demodulator 120 may be input to the controller 170. After performing demultiplexing, image / audio signal processing, and the like, the controller 170 outputs an image to the display module 180 and outputs an audio to the audio output unit 185.

The external device interface unit 130 may connect the external device to the image display device 100. To this end, the external device interface unit 130 may include an A / V input / output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 130 may be connected to an external device such as a digital versatile disk (DVD) player, a Blu-ray player, a game device, a camera, a camcorder, or a computer (laptop) by wire or wirelessly. The external device interface unit 130 transmits a video, audio, or data signal input from the outside through the connected external device to the controller 170 of the image display device 100. In addition, the external device interface unit 130 may output a video, audio, or data signal processed by the controller 170 to the connected external device. To this end, the external device interface unit 130 may include an A/V input/output unit (not shown) or a wireless communication unit (not shown).

The A/V input/output unit may include a USB terminal, a CVBS (Composite Video Blanking Sync) terminal, a component terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like, so that video and audio signals of an external device can be input to the image display device 100.

The wireless communication unit can perform short-range wireless communication with other electronic devices. The image display device 100 may be networked with other electronic devices according to communication standards such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), and ZigBee.

In addition, the external device interface unit 130 may be connected to various set-top boxes through at least one of the terminals described above to perform input/output operations with the set-top box.

The external device interface unit 130 may transmit / receive data and signals with the 3D image viewing apparatus 195 according to various communication methods such as a radio frequency (RF) communication method and an infrared (IR) communication method.

The network interface unit 135 provides an interface for connecting the image display apparatus 100 to a wired/wireless network including the Internet. The network interface unit 135 may include an Ethernet terminal for connection with a wired network, and for connection with a wireless network, communication standards such as WLAN (Wireless LAN, Wi-Fi), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access) may be used.

The network interface unit 135 may receive content or data provided by the Internet or a content provider or a network operator through a network. That is, content such as a movie, an advertisement, a game, a VOD, a broadcast signal, and related information provided from a content provider may be received through a network. In addition, the update information and the update file of the firmware provided by the network operator can be received. It may also transmit data to the Internet or content provider or network operator.

In addition, the network interface unit 135 may be connected to, for example, an Internet Protocol TV (IPTV), receive video, audio, or data signals processed by a set-top box for IPTV, and transmit signals processed by the controller 170 to the set-top box for IPTV, thereby enabling bidirectional communication.

Meanwhile, the above-described IPTV may mean ADSL-TV, VDSL-TV, FTTH-TV, or the like, depending on the type of transmission network, and may include TV over DSL, Video over DSL, TV over IP (TVIP), Broadband TV (BTV), and the like. In addition, IPTV may also mean an Internet TV capable of accessing the Internet, or a full browsing TV.

The storage 140 may store a program for processing and controlling each signal in the controller 170, or may store a signal-processed video, audio, or data signal.

In addition, the storage unit 140 may perform a function for temporarily storing an image, audio, or data signal input to the external device interface unit 130. In addition, the storage 140 may store information on a predetermined broadcast channel through a channel storage function such as a channel map.

The storage unit 140 may include at least one type of storage medium, such as a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, or ROM (EEPROM, etc.). The image display apparatus 100 may reproduce a file (a video file, a still image file, a music file, a document file, etc.) stored in the storage unit 140 and provide it to the user.

FIG. 1 illustrates an embodiment in which the storage unit 140 is provided separately from the controller 170, but the scope of the present invention is not limited thereto. The storage unit 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the control unit 170 or a signal from the control unit 170 to the user.

For example, the user input interface unit 150 may receive a user input signal, such as power on/off, channel selection, or screen setting, from the remote controller 200 according to various communication methods such as a radio frequency (RF) communication method or an infrared (IR) communication method, or may transmit a signal from the controller 170 to the remote controller 200.

In addition, for example, the user input interface unit 150 may transmit a user input signal input from a local key (not shown) such as a power key, a channel key, a volume key, and a set value to the controller 170.

In addition, for example, the user input interface unit 150 may transmit a user input signal input from a sensing unit (not shown) that senses a user's gesture to the controller 170, or may transmit a signal from the controller 170 to the sensing unit (not shown). Here, the sensing unit (not shown) may include a touch sensor, an audio sensor, a position sensor, a motion sensor, and the like.

The controller 170 may demultiplex the stream input through the tuner 110, the demodulator 120, or the external device interface unit 130, process the demultiplexed signals, and generate and output signals for video or audio output.

The image signal processed by the controller 170 may be input to the display module 180 and displayed as an image corresponding to the image signal. In addition, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

The audio signal processed by the controller 170 may be output as sound through the audio output unit 185. In addition, the audio signal processed by the controller 170 may be output to an external output device through the external device interface unit 130.

Although not shown in FIG. 1, the controller 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 2.

In addition, the controller 170 may control overall operations of the image display apparatus 100. For example, the controller 170 may control the tuner 110 to select the RF broadcast corresponding to a channel selected by the user or to a pre-stored channel.

In addition, the controller 170 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

For example, the controller 170 controls the tuner 110 to input a signal of a selected channel according to a predetermined channel selection command received through the user input interface unit 150. Then, video, audio, or data signals of the selected channel are processed. The controller 170 may output the channel information selected by the user together with the processed video or audio signal through the display module 180 or the audio output unit 185.

As another example, the controller 170 may, according to an external device image playback command received through the user input interface unit 150, output a video or audio signal input from an external device, for example, a camera or a camcorder, through the external device interface unit 130, to the display module 180 or the audio output unit 185.

The controller 170 may control the display module 180 to display an image. For example, the controller 170 may control a broadcast image input through the tuner 110, an external input image input through the external device interface unit 130, an image input through the network interface unit 135, or an image stored in the storage unit 140 to be displayed on the display module 180.

In this case, the image displayed on the display module 180 may be a still image or a video, and may be a 2D image or a 3D image.

Meanwhile, the controller 170 may generate and display a 3D object with respect to a predetermined object in the image displayed on the display module 180. For example, the object may be at least one of a connected web screen (newspaper, magazine, etc.), an EPG (Electronic Program Guide), various menus, widgets, icons, still images, videos, and text.

Such a 3D object may be processed to have a depth different from that of the image displayed on the display module 180. Preferably, the 3D object may be processed to protrude compared to an image displayed on the display module 180.

The controller 170 may recognize a user's position based on an image photographed by a photographing unit (not shown). For example, the distance (z-axis coordinate) between the user and the image display apparatus 100 may be determined, and the x-axis and y-axis coordinates on the image display apparatus 100 corresponding to the user's position may also be determined.

Meanwhile, although not shown in the figure, a channel browsing processor for generating a thumbnail image corresponding to a channel signal or an external input signal may be further provided. The channel browsing processor may receive the stream signal TS output from the demodulator 120 or the stream signal output from the external device interface unit 130, extract an image from the input stream signal, and generate a thumbnail image. The generated thumbnail image may be input to the controller 170 as it is, or may be encoded before being input. The generated thumbnail image may also be encoded in a stream form and input to the controller 170. The controller 170 may display a thumbnail list including a plurality of thumbnail images on the display module 180 using the input thumbnail images. In this case, the thumbnail list may be displayed in a simple viewing manner, in which it occupies a partial region while a predetermined image is displayed on the display module 180, or in a full viewing manner, in which it occupies most of the display module 180.

The display module 180 converts the video signal, data signal, OSD signal, and control signal processed by the controller 170, or the video signal, data signal, control signal, and the like received from the external device interface unit 130, to generate drive signals.

The display module 180 may be a PDP, an LCD, an OLED, a flexible display, and the like, and in particular, it is preferable that a 3D display is possible according to an embodiment of the present invention.

For 3D image viewing, the display module 180 may use an additional display method or an independent display method.

The independent display method implements a 3D image with the display module 180 alone, without a separate additional display such as glasses; for example, various methods such as a lenticular method and a parallax barrier method may be applied.

Meanwhile, the additional display method implements a 3D image by using an additional display, that is, a 3D image viewing apparatus, in addition to the display module 180; for example, various methods such as a head mounted display (HMD) type and a glasses type may be applied. The glasses type may be further divided into a passive method such as a polarized glasses type and an active method such as a shutter glass type. The head mounted display type may likewise be divided into passive and active methods.

In one embodiment of the present invention, the shutter glasses 195 may be provided for viewing 3D images.

The display module 180 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 185 receives a signal processed by the controller 170, for example, a stereo signal, a 3.1 channel signal, or a 5.1 channel signal, and outputs a voice signal. The audio output unit 185 may be implemented by various types of speakers.

Meanwhile, in order to detect a gesture of the user, as described above, a sensing unit (not shown) including at least one of a touch sensor, a voice sensor, a position sensor, and a motion sensor may be further provided in the image display apparatus 100. The signal detected by the sensing unit (not shown) is transmitted to the controller 170 through the user input interface unit 150.

The controller 170 may detect a gesture of the user based on the image photographed by the photographing unit (not shown), the signal detected by the sensing unit (not shown), or a combination thereof.

The power supply unit 190 supplies power throughout the image display apparatus 100. In particular, power may be supplied to the controller 170, which may be implemented in the form of a System On Chip (SOC), to the display module 180 for displaying an image, and to the audio output unit 185 for audio output. In addition, depending on the exemplary embodiment, power may be supplied to a heat generating unit including a heating wire.

The remote control apparatus 200 transmits user input to the user input interface unit 150. To this end, the remote control apparatus 200 may use infrared (IR) communication, RF (Radio Frequency) communication, Bluetooth, UWB (Ultra Wideband), ZigBee, or the like. In addition, the remote control apparatus 200 may receive a video, audio, or data signal output from the user input interface unit 150, and may display it, or output it as audio, on the remote control apparatus 200.

The image display device 100 described above may be a fixed digital broadcast receiver capable of receiving at least one of ATSC (8-VSB) digital broadcasting, DVB-T (COFDM) digital broadcasting, and ISDB-T (BST-OFDM) digital broadcasting. As a mobile type, it may be a digital broadcast receiver capable of receiving at least one of terrestrial DMB digital broadcasting, satellite DMB digital broadcasting, ATSC-M/H digital broadcasting, DVB-H (COFDM) digital broadcasting, and Media Forward Link Only digital broadcasting. It may also be a digital broadcast receiver for cable, satellite communication, or IPTV.

Meanwhile, the image display device described in the present specification may include a TV receiver, a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), and the like.

Meanwhile, a block diagram of the image display apparatus 100 shown in FIG. 1 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 that is actually implemented. That is, two or more constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.

FIG. 2 is an internal block diagram of the controller of FIG. 1, FIG. 3 is a diagram illustrating an example of a 3D video signal format capable of implementing 3D video, and FIG. 4 is a diagram illustrating an operation of a shutter glass according to a frame sequential format.

Referring to the drawings, the controller 170 according to an embodiment of the present invention may include a demultiplexer 210, an image processor 220, an OSD generator 240, a mixer 245, a frame rate converter 250, and a formatter 260. In addition, it may further include a voice processor (not shown) and a data processor (not shown).

The demultiplexer 210 demultiplexes an input stream. For example, when an MPEG-2 TS is input, it may be demultiplexed and separated into video, audio, and data signals, respectively. Here, the stream signal input to the demultiplexer 210 may be a stream signal output from the tuner 110, the demodulator 120, or the external device interface unit 130.

The image processor 220 may perform image processing of the demultiplexed image signal. To this end, the image processor 220 may include an image decoder 225 and a scaler 235.

The image decoder 225 decodes the demultiplexed image signal, and the scaler 235 performs scaling to output the resolution of the decoded image signal on the display 180.

The video decoder 225 may include decoders of various standards.

For example, when the demultiplexed video signal is an encoded 2D video signal of MPEG-2 standard, it may be decoded by an MPEG-2 decoder.

Also, for example, when the demultiplexed video signal is a video signal of the H.264 standard according to the digital multimedia broadcasting (DMB) scheme or DVB-H, a depth video of MPEG-C part 3, a multi-view video according to Multi-view Video Coding (MVC), or a free-viewpoint video according to Free-viewpoint TV (FTV), it may be decoded by an H.264 decoder, an MPEG-C decoder, an MVC decoder, or an FTV decoder, respectively.

Meanwhile, the image signal decoded by the image processor 220 may be classified into a case in which only a 2D image signal is present, a case in which a 2D image signal and a 3D image signal are mixed, and a case in which only a 3D image signal is present.

The image processor 220 may detect whether the demultiplexed video signal is a 2D video signal or a 3D video signal. Whether the signal is a 3D video signal may be detected based on a broadcast signal received from the tuner 110, an external input signal from an external device, or an external input signal received through a network. In particular, whether the signal is a 3D video signal may be determined by referring to a 3D video flag, 3D video metadata, or 3D video format information in the header of the stream.
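
A minimal sketch of this detection step, assuming hypothetical metadata field names rather than the actual stream syntax, might look like the following; it simply checks for a 3D flag or 3D format information attached to the decoded stream.

```python
# Illustrative sketch (field names are assumptions, not actual stream syntax):
# deciding whether a decoded signal is 3D and, if so, in which packing format,
# based on metadata carried alongside the stream header.
def detect_3d(stream_metadata: dict):
    if stream_metadata.get("3d_flag"):
        return True, stream_metadata.get("3d_format", "unknown")
    if "3d_format" in stream_metadata:            # format info implies a 3D signal
        return True, stream_metadata["3d_format"]
    return False, None

print(detect_3d({"3d_flag": True, "3d_format": "side_by_side"}))  # (True, 'side_by_side')
print(detect_3d({}))                                              # (False, None)
```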

The image signal decoded by the image processor 220 may be a 3D image signal having various formats. For example, the image may be a 3D image signal including a color image and a depth image, or may be a 3D image signal including a plurality of view image signals. The plurality of viewpoint image signals may include, for example, a left eye image signal and a right eye image signal.

The 3D video signal format may be determined according to a method of disposing a left eye image and a right eye image generated to implement a 3D image.

The 3D image may be composed of a multi-view image. The user may view the multi-view image through the left eye and the right eye. The user may feel a three-dimensional effect of the 3D image through the difference of the image detected through the left eye and the right eye. According to an embodiment of the present invention, a multi-view image for implementing a 3D image includes a left eye image that can be recognized by the user through the left eye and a right eye image that can be recognized through the right eye.

As shown in FIG. 3A, a method in which the left eye image and the right eye image are arranged left and right is referred to as a side by side format. As shown in FIG. 3B, a method of disposing the left eye image and the right eye image up and down is referred to as a top / down format. As shown in FIG. 3C, a method of time-divisionally arranging a left eye image and a right eye image is called a frame sequential format. As shown in FIG. 3D, a method of mixing the left eye image and the right eye image for each line is called an interlaced format. As shown in FIG. 3E, a method of mixing the left eye image and the right eye image for each box is called a checker box format.
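
To make the packed formats concrete, the following sketch (an illustration under assumed helper names, not the patent's implementation) extracts the left eye and right eye images from a frame stored in one of the formats above, treating the frame as a simple H x W nested list.

```python
# Illustrative sketch: splitting a packed 3D frame into left-eye and right-eye
# images for the side by side, top/down, and interlaced formats described above.
def split_3d_frame(frame, fmt):
    h, w = len(frame), len(frame[0])
    if fmt == "side_by_side":          # left half -> left eye, right half -> right eye
        left = [row[: w // 2] for row in frame]
        right = [row[w // 2:] for row in frame]
    elif fmt == "top_down":            # top half -> left eye, bottom half -> right eye
        left, right = frame[: h // 2], frame[h // 2:]
    elif fmt == "interlaced":          # alternating lines
        left, right = frame[0::2], frame[1::2]
    else:
        raise ValueError("unsupported format: " + fmt)
    return left, right

# A frame sequential stream is then produced by alternating the extracted
# frames in time: [L0, R0, L1, R1, ...].
```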

The image signal included in the signal input to the image display apparatus 100 from the outside may be a 3D image signal capable of realizing a 3D image. In addition, the graphic user interface image signal generated to display the image display apparatus 100 related information or to input a command related to the image display apparatus 100 may be a 3D image signal. The formatter 260 may mix the 3D image signal and the graphic user interface 3D image signal included in the signal input to the image display apparatus 100 from the outside and output the mixed image to the display 180.

The OSD generator 240 generates an OSD signal according to a user input or itself. For example, a signal for displaying various types of information on a screen of the display 180 as a graphic or text may be generated based on a user input signal. The generated OSD signal may include various data such as a user interface screen, various menu screens, widgets, and icons of the image display apparatus 100. In addition, the generated OSD signal may include a 2D object or a 3D object.

The mixer 245 may mix the OSD signal generated by the OSD generator 240 and the decoded image signal processed by the image processor 220. In this case, the OSD signal and the decoded video signal may each include at least one of a 2D signal and a 3D signal. The mixed video signal is provided to the frame rate converter 250.

The frame rate converter 250 converts the frame rate of the input video. For example, a 60Hz frame rate is converted to 120Hz or 240Hz. When converting a frame rate of 60 Hz to 120 Hz, it is possible to insert the same first frame or insert a third frame predicted from the first frame and the second frame between the first frame and the second frame. When converting a frame rate of 60 Hz to 240 Hz, it is possible to insert three more identical frames or three predicted frames.
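
As an illustration of this conversion, the sketch below (helper names are assumptions) doubles a 60 Hz sequence to 120 Hz either by repeating each frame or by inserting a frame predicted from neighboring frames; applying it twice, or inserting three frames per input frame, gives 240 Hz.

```python
# Illustrative sketch: doubling a frame sequence either by repetition or by
# inserting a predicted frame between consecutive frames.
def double_frame_rate(frames, predict=None):
    out = []
    for i, f in enumerate(frames):
        out.append(f)
        if predict is not None and i + 1 < len(frames):
            out.append(predict(f, frames[i + 1]))   # e.g. a motion-compensated frame
        else:
            out.append(f)                           # simple repetition
    return out

frames_120 = double_frame_rate(["F0", "F1", "F2"])
print(frames_120)  # ['F0', 'F0', 'F1', 'F1', 'F2', 'F2']
```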

The formatter 260 may receive a mixed signal from the mixer 245, that is, an OSD signal and a decoded video signal, and separate the 2D video signal and the 3D video signal.

Meanwhile, in the present specification, a 3D video signal means a 3D object, and examples of such an object include a picture in picture (PIP) image (still image or a video), an EPG indicating broadcast program information, various menus, widgets, There may be an icon, text, an object in the image, a person, a background, a web screen (newspaper, magazine, etc.).

The formatter 260 may change the format of the 3D video signal. For example, it may be changed to any one of the various formats illustrated in FIG. 3. In particular, according to an embodiment of the present invention, changing from the format of the 3D video signal to the frame sequential format is taken as an example. That is, the left eye image signal L and the right eye image signal R are alternately arranged in sequence. Accordingly, the 3D image viewing apparatus described with reference to FIG. 1 is preferably a shutter glass 195.

FIG. 4 illustrates the operational relationship between the shutter glass 195 and the frame sequential format. FIG. 4A illustrates that when the left eye image L is displayed on the display 180, the left eye glass of the shutter glass 195 is opened and the right eye glass is closed, and FIG. 4B illustrates that when the right eye image R is displayed, the left eye glass of the shutter glass 195 is closed and the right eye glass is opened. That is, the left and right eye glasses of the shutter glass 195 are opened and closed in synchronization with the image displayed on the screen.
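
The basic synchronization rule of FIG. 4 can be summarized in a few lines; the sketch below is only illustrative and opens exactly the glass that matches the frame currently on the screen.

```python
# Illustrative sketch: open only the glass that matches the displayed frame.
def shutter_state(current_frame: str) -> dict:
    # current_frame is "L" while a left-eye image is displayed, "R" otherwise
    return {
        "left_glass_open": current_frame == "L",
        "right_glass_open": current_frame == "R",
    }

for frame in ["L", "R", "L", "R"]:          # frame sequential display order
    print(frame, shutter_state(frame))
```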

Meanwhile, the external device interface unit 130 may transmit / receive various data and signals with the shutter glass 195 according to various communication methods such as an RF (Radio Frequency) communication method and an infrared (IR) communication method.

The formatter 260 may convert a 2D video signal into a 3D video signal. For example, according to a 3D image generation algorithm, an edge or a selectable object may be detected within the 2D image signal, and the object defined by the detected edge or the selectable object may be separated and generated as a 3D image signal. In this case, the generated 3D image signal may be separated into a left eye image signal L and a right eye image signal R, as described above.

The voice processing unit (not shown) in the controller 170 may perform voice processing of the demultiplexed voice signal. To this end, the voice processing unit (not shown) may include various decoders.

For example, if the demultiplexed audio signal is an encoded audio signal, it can be decoded. Specifically, when the demultiplexed audio signal is an encoded audio signal of the MPEG-2 standard, it may be decoded by an MPEG-2 decoder. In addition, when the demultiplexed audio signal is an encoded audio signal of the MPEG-4 Bit Sliced Arithmetic Coding (BSAC) standard according to the terrestrial digital multimedia broadcasting (DMB) scheme, it may be decoded by an MPEG-4 decoder. In addition, when the demultiplexed audio signal is an encoded audio signal of the MPEG-2 AAC (Advanced Audio Coding) standard according to the satellite DMB scheme or DVB-H, it may be decoded by an AAC decoder. In addition, when the demultiplexed audio signal is an encoded audio signal of the Dolby AC-3 standard, it may be decoded by an AC-3 decoder.

Also, the voice processor (not shown) in the controller 170 may adjust bass, treble, volume, and the like.

The data processor (not shown) in the controller 170 may perform data processing of the demultiplexed data signal. For example, when the demultiplexed data signal is an encoded data signal, it may be decoded. The encoded data signal may be EPG (Electronic Program Guide) information including broadcast information such as the start time and end time of a broadcast program broadcast on each channel. For example, the EPG information may be ATSC-PSIP (ATSC-Program and System Information Protocol) information in the ATSC scheme and DVB-SI (DVB-Service Information) in the DVB scheme. The ATSC-PSIP information or the DVB-SI information may be information included in the aforementioned stream, that is, in the header (4 bytes) of the MPEG-2 TS.

Meanwhile, although FIG. 2 illustrates that the signals from the OSD generator 240 and the image processor 220 are mixed in the mixer 245 and then 3D-processed in the formatter 260, the present invention is not limited thereto, and the mixer may be located after the formatter. That is, the output of the image processor 220 may be 3D-processed by the formatter 260, the OSD generator 240 may perform 3D processing together with OSD generation, and the mixer 245 may then mix the processed 3D signals.

Meanwhile, a block diagram of the controller 170 shown in FIG. 2 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specification of the controller 170 that is actually implemented.

In particular, according to an exemplary embodiment, the frame rate converter 250 and the formatter 260 may not be provided in the controller 170 but may be provided separately. In some embodiments, the frame rate converter 250 may be included in the formatter 260.

FIG. 5 is an internal block diagram of a shutter glass according to an embodiment of the present invention.

Referring to the drawings, the shutter glass, which is the 3D image viewing apparatus according to an embodiment of the present invention, includes left and right eye glasses 530, a receiving unit 510 for receiving at least one of a synchronization signal and a driving signal from the image display device, and a controller 520 that generates a driving signal based on the synchronization signal or controls the opening/closing operation of the left and right eye glasses 530 based on the received driving signal.

The receiver 510 may receive the signal transmitted from the image display apparatus, and the controller 520 may convert the received signal into a driving signal for controlling the left eye glass 531 and the right eye glass 532. Meanwhile, although not shown, depending on the embodiment the shutter glass may further include a mux for signal processing and a frequency converter that can vary the period or frequency of various signals. According to the driving signal, the left and right eye glasses 530 are finally turned on and off, that is, opening and closing operations are performed.

The image display device may alternately arrange the left eye image signal L and the right eye image signal R in sequence. Accordingly, when the left eye image L is displayed on the display module 180, the left eye glass 531 is opened and the right eye glass 532 is closed, and when the right eye image R is displayed, the left eye glass 531 is closed and the right eye glass 532 is opened. That is, the opening and closing operations of the left and right eye glasses 530 may be synchronized with the image displayed by the image display device.

The synchronization signal may be a signal synchronized with the image displayed on the image display device. It may be a vertical synchronization signal (Vsync), a signal synchronized to the left and right view images displayed on the image display device, or a signal modified to be more efficient for transmission.

FIG. 6 is a diagram illustrating an example of a signal for a shutter glass.

Referring to FIG. 6, a signal for a 3D image viewing apparatus, for example, a shutter glass, may be generated based on the vertical frequency V_sync of an image displayed on the image display apparatus. For example, the left and right eye images may be displayed by dividing the unit frame into a left eye image frame and a right eye image frame in half, and the left and right eye glasses may be opened and closed in synchronization with the left and right eye frames.
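
As a numerical illustration (parameter names are assumptions), the sketch below derives the nominal left eye and right eye windows from the vertical frequency, splitting each unit frame in half between the left eye and right eye image frames.

```python
# Illustrative sketch: nominal left/right eye windows derived from the vertical
# frequency V_sync, with each unit frame split in half between the eyes.
def eye_windows(v_sync_hz: float):
    unit_frame = 1.0 / v_sync_hz          # duration of one unit frame, in seconds
    half = unit_frame / 2.0
    left_window = (0.0, half)             # left-eye image frame and glass window
    right_window = (half, unit_frame)     # right-eye image frame and glass window
    return left_window, right_window

print(eye_windows(60.0))   # approximately ((0.0, 0.00833), (0.00833, 0.01667))
```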

Meanwhile, the vertical frequency of the image may be set variously, such as 24 Hz, 50 Hz, 60 Hz.

However, due to the response characteristic of the display module 180, the display module 180 may still be displaying the left eye image frame when the right eye glass of the shutter glass is opened. In addition, the opening (on) and closing (off) operations of the left and right eye glasses of the shutter glass have rising and falling sections due to the response speed of the shutter. In view of this, if the right eye glass is opened too early while the display module 180 is still displaying the left eye image frame, crosstalk may occur.

In addition, when interference or noise is introduced into the signal for the shutter glass, the shutter glass may determine that the introduced noise component is also a valid signal, thereby generating an incorrect driving signal. Therefore, the user may not be able to smoothly watch the 3D image due to the opening of the right eye glass in the section in which the left eye image is displayed on the image display apparatus, or due to the frequent opening and closing operation.

An image display device capable of reducing such a crosstalk phenomenon is described below.

FIG. 7 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention.

In a method of operating an image display apparatus according to an exemplary embodiment of the present invention, first, an input image is rearranged to generate a left eye image frame and a right eye image frame (S710).

As described above with reference to FIG. 1, the image may be a 3D image, and the image display device according to the present invention may include a tuner for receiving a broadcast signal corresponding to a selected broadcast channel or a previously stored broadcast channel, and an external device interface unit for receiving an external signal from an external device. The 3D image may be a broadcast image from the broadcast signal or an external input image from the external signal.

The display module 180 may be driven by dividing the unit frame into a left eye image frame and a right eye image frame, and alternately display left and right eye frames. The display module 180 is driven according to the generated left eye image frame and right eye image frame to display a 3D image.

Next, at least one of the left eye glass and the right eye glass of the shutter glass is opened a predetermined time after the start time of the corresponding left eye image frame or right eye image frame (S730).

That is, the left eye glass and the right eye glass of the shutter glass are not opened exactly at the start time of the left eye image frame or the start time of the right eye image frame, but are opened after a predetermined time has elapsed. Therefore, it is possible to prevent the user from seeing the left eye image frame that is still being displayed when the right eye glass is opened, due to the response characteristic of the display module 180 or the response speed of the shutter glass.

In addition, during the predetermined time, it is possible to determine whether interference or noise has been introduced into the signal for the shutter glass and to secure time for filtering it out.
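
A minimal sketch of this delayed opening, with purely hypothetical timing values, is shown below: each glass is opened a predetermined delay after the start time of its image frame and closed at the end of the frame.

```python
# Illustrative sketch (all timing values are assumptions): open each glass a
# predetermined delay after the corresponding frame's start time (S730), so
# that residual light from the previous frame and the shutter's rise/fall time
# do not cause crosstalk, and noisy sync pulses can be filtered out first.
def glass_open_times(frame_start_s: float, frame_len_s: float, delay_s: float):
    open_t = frame_start_s + delay_s          # delayed open, not at the frame start
    close_t = frame_start_s + frame_len_s     # close at the end of the frame
    return open_t, close_t

# Example: a 1/120 s eye frame with a hypothetical 1.5 ms open delay.
print(glass_open_times(0.0, 1 / 120, 0.0015))   # approximately (0.0015, 0.00833)
```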

In some embodiments, the display module may be a plasma display panel (PDP), and the present invention is more effective when applied to the plasma display panel (PDP).

Hereinafter, a structure and a driving method of a plasma display panel (PDP) that can be used as the display module 180 will be described in detail with reference to FIGS. 8 to 11.

FIG. 8 is a perspective view illustrating an embodiment of a structure of a plasma display panel, and FIG. 9 is a diagram illustrating an embodiment of an electrode arrangement of a plasma display panel.

FIG. 10 is a timing diagram illustrating an embodiment of a method of time-divisionally driving a plasma display panel by dividing one frame into a plurality of subfields, and FIG. 11 is a timing diagram illustrating an embodiment of waveforms of driving signals for driving the plasma display panel.

As shown in FIG. 8, the plasma display panel includes a sustain electrode pair, consisting of a scan electrode 11 and a sustain electrode 12, formed on the upper substrate 10, and an address electrode 22 formed on the lower substrate 20.

The sustain electrode pair 11 and 12 generally includes transparent electrodes 11a and 12a, formed of indium tin oxide (ITO), and bus electrodes 11b and 12b. The bus electrodes 11b and 12b are formed on the transparent electrodes 11a and 12a and serve to reduce the voltage drop caused by the high resistance of the transparent electrodes 11a and 12a.

Meanwhile, depending on the exemplary embodiment, only the bus electrodes 11b and 12b may be formed, without the transparent electrodes 11a and 12a. Since this structure does not use the transparent electrodes 11a and 12a, it has the advantage of lowering the cost of manufacturing the panel.

A black matrix (BM) 15 is arranged between the transparent electrodes 11a and 12a and the bus electrodes 11b and 12b of the scan electrode 11 and the sustain electrode 12. The black matrix 15 absorbs external light incident from outside the upper substrate 10 to reduce reflection, blocks light between the scan electrode 11 and the sustain electrode 12, and improves the purity and contrast of the upper substrate 10. The black matrix 15 may be arranged at different positions depending on the embodiment.

An upper dielectric layer 13 and a passivation layer 14 are stacked on the upper substrate 10 on which the scan electrode 11 and the sustain electrode 12 are formed side by side. Charged particles generated by the discharge accumulate on the upper dielectric layer 13, which may protect the sustain electrode pair 11 and 12. The passivation layer 14 protects the upper dielectric layer 13 from sputtering by charged particles generated during gas discharge and increases the emission efficiency of secondary electrons.

In addition, the address electrode 22 is formed in a direction crossing the scan electrode 11 and the sustain electrode 12. In addition, a lower dielectric layer 24 and a partition wall 21 are formed on the lower substrate 20 on which the address electrode 22 is formed.

In addition, the phosphor layer 23 is formed on the surfaces of the lower dielectric layer 24 and the partition wall 21. The partition wall 21 has a vertical partition wall 21a and a horizontal partition wall 21b formed in a closed shape, and physically distinguishes discharge cells, and prevents ultraviolet rays and visible light generated by the discharge from leaking into adjacent discharge cells.

In an embodiment of the present invention, not only the structure of the partition wall 21 illustrated in FIG. 8 but also partition walls 21 of various other shapes are possible. Meanwhile, although one embodiment of the present invention shows and describes the R, G, and B discharge cells as being arranged on the same line, they may be arranged differently.

In addition, the phosphor layer 23 emits light by ultraviolet rays generated during gas discharge to generate visible light of any one of red (R), green (G), and blue (B). Here, an inert mixed gas such as He + Xe, Ne + Xe and He + Ne + Xe for discharging is injected into the discharge space provided between the upper / lower substrates 10 and 20 and the partition wall 21.

FIG. 9 illustrates an embodiment of an electrode arrangement of a plasma display panel, and a plurality of discharge cells constituting the plasma display panel are preferably arranged in a matrix form as shown in FIG. 9. The plurality of discharge cells are provided at the intersections of the scan electrode lines Y1 to Ym, the sustain electrode lines Z1 to Zm, and the address electrode lines X1 to Xn, respectively. The scan electrode lines Y1 to Ym may be driven sequentially or simultaneously, and the sustain electrode lines Z1 to Zm may be driven simultaneously. The address electrode lines X1 to Xn may be driven by being divided into odd-numbered lines and even-numbered lines, or sequentially driven.

FIG. 10 is a timing diagram illustrating an embodiment of a time division driving method by dividing a frame into a plurality of subfields. The unit frame may be divided into a predetermined number, for example, eight subfields SF1, ..., SF8 to realize time division gray scale display. Each subfield SF1, ... SF8 is divided into a reset section (not shown), an address section A1, ..., A8 and a sustain section S1, ..., S8.

Here, according to an embodiment of the present invention, the reset period may be omitted in at least one of the plurality of subfields. For example, the reset period may exist only in the first subfield, or only in a subfield located approximately in the middle of all the subfields.

In each address section A1, ..., A8, a display data signal is applied to the address electrode X, and scan pulses corresponding to each scan electrode Y are sequentially applied.

In each of the sustain periods S1, ..., S8, a sustain pulse is alternately applied to the scan electrode Y and the sustain electrode Z, so that a sustain discharge occurs in the discharge cells in which wall charges were formed during the address periods A1, ..., A8.

The luminance of the plasma display panel is proportional to the number of sustain discharge pulses in the sustain periods S1, ..., S8 within the unit frame. When one frame forming one image is represented by eight subfields and 256 gray levels, each subfield may in turn be assigned a different number of sustain pulses in the ratio 1, 2, 4, 8, 16, 32, 64, and 128. To obtain a luminance of 133 gray levels, cells may be addressed and sustained during the subfield 1, subfield 3, and subfield 8 sections.
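
The binary-weighted gray scale scheme described above can be illustrated with the following sketch, which decomposes a target gray level into the subfields in which a cell must be addressed and sustained; for gray level 133 it selects subfields 1, 3, and 8 (weights 1 + 4 + 128).

```python
# Illustrative sketch: decompose a gray level into binary-weighted subfields
# (weights 1, 2, 4, ..., 128 for eight subfields).
def subfields_for_gray(level: int, num_subfields: int = 8):
    weights = [1 << i for i in range(num_subfields)]   # 1, 2, 4, ..., 128
    selected = [i + 1 for i, w in enumerate(weights) if level & w]
    return selected

print(subfields_for_gray(133))   # [1, 3, 8] -> weights 1 + 4 + 128 = 133
```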

The number of sustain discharges allocated to each subfield may be variably determined according to the weights of the subfields in accordance with the APC (Automatic Power Control) step. That is, although FIG. 10 describes the case in which one frame is divided into eight subfields as an example, the present invention is not limited thereto, and the number of subfields forming one frame may be variously modified according to design specifications. For example, a plasma display panel may be driven by dividing one frame into more than eight subfields, such as 12 or 16 subfields.

The number of sustain discharges allocated to each subfield can be variously modified in consideration of gamma characteristics and panel characteristics. For example, the gray level assigned to subfield 4 may be lowered from 8 to 6, and the gray level assigned to subfield 6 may be increased from 32 to 34.

FIG. 11 is a timing diagram illustrating an embodiment of driving signals for driving a plasma display panel.

The subfield may include a reset section for initializing the discharge cells of the screen, an address section for selecting the discharge cells, and a sustain section for maintaining the discharge of the selected discharge cells.

Depending on the embodiment, a pre-reset period for forming positive wall charges on the scan electrodes Y and negative wall charges on the sustain electrodes Z may further be included.

The reset period includes a setup period and a set-down period. In the setup period, a rising ramp waveform (Ramp-up) is simultaneously applied to all scan electrodes, generating a fine discharge in all discharge cells and thereby forming wall charges. In the set-down period, a falling ramp waveform (Ramp-down), which falls from a positive voltage lower than the peak voltage of the rising ramp waveform, is simultaneously applied to all scan electrodes Y, generating an erase discharge in all discharge cells and thereby eliminating unnecessary charges among the wall charges and space charges generated by the setup discharge.
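
As a rough illustration of this reset waveform (all voltage levels and durations below are hypothetical, not taken from the patent), the scan-electrode voltage can be modeled as a rising ramp during setup followed by a falling ramp during set-down.

```python
# Illustrative sketch (voltages and durations are hypothetical): scan-electrode
# voltage during the reset period, modeled as a rising ramp (setup) followed by
# a falling ramp (set-down) that starts from a lower positive voltage.
def reset_waveform(t_us: float,
                   setup_us: float = 100.0, setdown_us: float = 100.0,
                   v_peak: float = 400.0, v_setdown_start: float = 200.0,
                   v_end: float = -100.0) -> float:
    if t_us < setup_us:                                   # setup: ramp up to the peak voltage
        return v_peak * (t_us / setup_us)
    t = t_us - setup_us
    if t < setdown_us:                                    # set-down: ramp down from a lower positive voltage
        return v_setdown_start + (v_end - v_setdown_start) * (t / setdown_us)
    return v_end

for t in (0, 50, 100, 150, 200):
    print(t, reset_waveform(t))
```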

In the address period, a scan signal having a negative scan voltage Vsc is sequentially applied to the scan electrode, and at the same time, a positive data signal is applied to the address electrode X. The address discharge is generated by the voltage difference between the scan signal and the data signal and the wall voltage generated during the reset period, thereby selecting the cell. On the other hand, in order to increase the efficiency of the address discharge, a sustain bias voltage Vzb is applied to the sustain electrode during the address period.

During the address period, the plurality of scan electrodes Y may be divided into two or more groups, and scan signals may be sequentially supplied to each group.

In addition, the plurality of scan electrodes Y may be divided into a first group located on even-numbered lines and a second group located on odd-numbered lines, according to their positions on the panel. Alternatively, they may be divided into a first group located above and a second group located below, based on the center of the panel.

The scan electrodes belonging to the first group divided in this way may be further divided into a first subgroup located on even-numbered lines and a second subgroup located on odd-numbered lines, or into a first subgroup positioned above and a second subgroup positioned below, based on the center of the first group.

In the sustain period, a sustain pulse having a sustain voltage Vs is alternately applied to the scan electrode and the sustain electrode to generate sustain discharge in the form of surface discharge between the scan electrode and the sustain electrode.
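
The alternating application of sustain pulses can be sketched as a simple loop; pulse() is an assumed stub, and the pulse count would come from the subfield weight discussed earlier:

```c
#include <stdio.h>

/* Stub for a single sustain pulse on the named electrode, for illustration only. */
static void pulse(const char *electrode) { printf("sustain pulse on %s\n", electrode); }

/* Alternate sustain pulses between the scan (Y) and sustain (Z) electrodes,
 * producing surface discharges between them; num_pulses follows the subfield weight. */
static void sustain_period(int num_pulses)
{
    for (int i = 0; i < num_pulses; i++)
        pulse((i % 2 == 0) ? "Y" : "Z");
}

int main(void)
{
    sustain_period(4);   /* e.g. a low-weight subfield */
    return 0;
}
```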

The driving waveforms shown in FIG. 11 are examples of signals for driving a plasma display panel that can be used in the present invention, and the present invention is not limited to the waveforms shown in FIG. 11.

FIGS. 12 to 16 are diagrams for describing an operating method of the image display apparatus of FIG. 9 according to an exemplary embodiment of the present invention.

FIG. 12 illustrates a signal synchronized with the vertical frequency together with the left and right eye image frames when the plasma display panel (PDP) displays a 3D image, a subfield arrangement (b) according to an embodiment of the present invention, and a shutter glass driving signal (c) according to an embodiment of the present invention.

As described above with reference to FIGS. 8 through 11, the plasma display panel (PDP) may include at least one subfield having a reset period for generating and preparing wall charges, an address period for selecting on-cells or off-cells, and a sustain period for generating the light that displays the image.

The left eye image frame and the right eye image frame may include a plurality of subfields, and at least one of the plurality of subfields may include a reset period, an address period, and a sustain period.

The first subfield among the plurality of subfields included in each of the left eye image frame and the right eye image frame may have a reset period longer than those of the remaining subfields, or the maximum reset-period voltage of the first subfield may be greater than those of the remaining subfields.

In this case, from the second subfield onward, the wall charge state produced by the sustain discharge of the previous subfield can be changed to a wall charge state suitable for the address period and the sustain period in a short time and with a low voltage, which is advantageous in terms of power consumption and driving margin.

The controller 170 may include a subfield mapping unit that arranges a plurality of subfields in the left eye image frame and the right eye image frame and allocates a reset period, an address period, and a sustain period to at least one of the subfields. Alternatively, the subfield mapping unit may be included in the image display apparatus as a separate block.

Meanwhile, afterglow from the sustain discharge of the last subfield of the right eye image frame may remain even after the left eye image frame starts. However, by controlling at least one of the opening time of the left eye glass and the opening time of the right eye glass to be delayed by a predetermined time from the start time of the corresponding left eye image frame or right eye image frame, crosstalk due to afterglow can be prevented.

In addition, the predetermined time makes it possible to determine whether interference or noise has been introduced into the signal for the shutter glass and to secure time for filtering it.

Meanwhile, since no light, or only insufficient light, is generated during the reset period and the address period (R + A), the user cannot see the left eye image frame during that interval. Therefore, the present invention can utilize the reset period and the address period (R + A) as a blocking time during which the right eye glass is not opened.

Accordingly, it is possible to prevent the user from seeing the left eye image frame being displayed when the right eye glass is opened due to the response characteristic of the display module 180 or the response speed of the shutter glass.

Meanwhile, the longer the predetermined time B, the more the crosstalk phenomenon is reduced, but the shorter the time A during which the shutter glass is open becomes, which may cause a decrease in luminance. Accordingly, the blocking time may be controlled only within a certain range, the user may be informed that luminance and crosstalk prevention are in a trade-off relationship, or the approximate degree of the trade-off may be presented in an intuitive menu.

In addition, when the blocking time is greater than the sum of the reset period and the address period of the first subfield of the left eye image frame or the right eye image frame, the following problem may occur. Light from the sustain discharge of the sustain period is already being generated and the left eye image frame is being displayed on the screen while the left eye glass is still closed, so the user may feel uncomfortable watching the 3D image, or, because the light produced by some sustain discharges is not used, the luminance may be reduced or perceived as reduced.

Therefore, the predetermined blocking time may be smaller than the sum of the reset period and the address period of the first subfield of the left eye image frame and the right eye image frame.
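
The constraint in the preceding paragraphs can be expressed as a simple check. The sketch below, with assumed microsecond values, verifies that the blocking time stays below the sum of the reset and address periods of the first subfield:

```c
#include <stdbool.h>
#include <stdio.h>

/* Check the constraint discussed above: the blocking (delay) time B should be
 * smaller than the sum of the reset and address periods of the first subfield,
 * so that no sustain light is lost while the glass is still closed.
 * All durations are in microseconds; the values below are assumptions. */
static bool blocking_time_valid(int block_us, int reset_us, int address_us)
{
    return block_us < reset_us + address_us;
}

int main(void)
{
    const int reset_us = 300, address_us = 1200;   /* assumed first-subfield durations */
    printf("B = 1000 us valid: %d\n", blocking_time_valid(1000, reset_us, address_us));
    printf("B = 2000 us valid: %d\n", blocking_time_valid(2000, reset_us, address_us));
    return 0;
}
```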

Meanwhile, although FIG. 12 illustrates the right eye image frame being displayed first and the left eye image frame being displayed later, the order may be set differently. In addition, the present invention can be applied not only to the first subfield of the later of the left and right eye image frames within a unit frame, but also to the first subfield of the earlier image frame.

The shutter glass driving signal may include a rising section, a holding section, and a falling section. In FIG. 12, the rising section and the falling section are indicated by dotted lines. The rising section and the falling section may be varied by a setting or according to the response characteristics of the shutter glass.

According to an embodiment, as shown in FIG. 12, at least one of the closing time of the left eye glass and the closing time of the right eye glass may be earlier, by a predetermined time C, than the end time of the corresponding left eye image frame or the end time of the right eye image frame. This may help prevent crosstalk from afterglow that remains, owing to the discharge of the sustain period, into the next image frame.

Alternatively, as illustrated in FIG. 13, at least one of the closing time of the left eye glass and the closing time of the right eye glass may be later than the end time of the corresponding left eye image frame or the end time of the right eye image frame.

FIG. 13A illustrates a conventional shutter glass driving signal, and FIG. 13B illustrates an embodiment in which both the opening time and the closing time of the shutter glass driving signal are shifted by a predetermined delay. That is, since the crosstalk phenomenon is more likely to occur when the opening time points of the shutter glass are not properly aligned with the left and right image frames, a modified driving signal may be generated simply by delaying the driving signal.
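
The timing relationships of FIGS. 12 and 13 can be summarized in a small sketch that derives the shutter-glass open and close instants from the frame timing; the parameter names, the 60 Hz figures, and the structure are assumptions for illustration only:

```c
#include <stdio.h>

/* Derive illustrative shutter-glass open/close instants from the frame timing:
 * the opening is delayed by block_us (B in FIG. 12) after the frame start, the
 * closing is advanced by early_close_us (C in FIG. 12) before the frame end,
 * and a common shift_us delay may be added to both edges as in FIG. 13. */
typedef struct {
    int open_us;
    int close_us;
} glass_timing;

static glass_timing shutter_timing(int frame_start_us, int frame_len_us,
                                   int block_us, int early_close_us, int shift_us)
{
    glass_timing t;
    t.open_us  = frame_start_us + block_us + shift_us;
    t.close_us = frame_start_us + frame_len_us - early_close_us + shift_us;
    return t;
}

int main(void)
{
    /* Assuming a 60 Hz unit frame, each eye's image frame lasts about 8333 us;
     * here the left eye image frame follows the right eye image frame. */
    glass_timing left = shutter_timing(8333, 8333, 1000, 300, 0);
    printf("left eye glass: open at %d us, close at %d us\n",
           left.open_us, left.close_us);
    return 0;
}
```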

Meanwhile, as shown in FIGS. 14 and 15, the method of operating an image display apparatus according to an embodiment of the present invention may further include displaying a menu 1410 or 1510 for setting the predetermined time and, when an input corresponding to the menu is received, varying the predetermined time.

As shown in FIG. 14, the setting tab may be moved linearly using the pointer 1420 displayed in response to the operation of the remote control apparatus 200, thereby changing the predetermined blocking time.

Alternatively, as illustrated in FIG. 15, a setting value may be chosen by selecting one of the check boxes of the setting menu window 1510. In this case, one of the check boxes may be selected using the pointer 1420 displayed in response to the operation of the remote control apparatus 200. Although a menu in the form of check boxes is shown in the drawing as an example, the present invention is not limited thereto, and another type of graphical user interface (GUI) may be used.

On the other hand, according to an embodiment, without displaying a menu, the shutter glass 195 may include a switch unit 1610 for setting the predetermined time, and when an input for changing the predetermined time is received through the switch unit, the predetermined time may be varied.

In this case, the shutter glass can transmit a setting change signal to the controller.

Alternatively, the control unit of the shutter glass may directly generate the shutter glass driving signal such that at least one of the opening time of the left eye glass and the opening time of the right eye glass is later than the start time of the corresponding left eye image frame or right eye image frame.

The image display apparatus and the operating method thereof according to the present invention are not limited to the configurations and methods of the embodiments described above; rather, all or some of the above embodiments may be selectively combined so that various modifications are possible.

Meanwhile, the operating method of the image display apparatus of the present invention can be implemented as processor-readable code on a recording medium readable by a processor included in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices that store data readable by a processor. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet. The processor-readable recording medium can also be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

In addition, while the preferred embodiments of the present invention have been shown and described above, the present invention is not limited to the specific embodiments described; various modifications can of course be made by those skilled in the art to which the invention pertains without departing from the gist of the invention claimed in the claims, and such modifications should not be understood separately from the technical spirit or prospect of the present invention.

100: image display apparatus
170: controller
180: display module
195: shutter glass
200: remote control device

Claims (20)

A display module driven by dividing a unit frame into a left eye image frame and a right eye image frame; and
a shutter glass having a left eye glass opened in correspondence with the left eye image frame and a right eye glass opened in correspondence with the right eye image frame,
wherein at least one of an opening time point of the left eye glass and an opening time point of the right eye glass is later, by a predetermined time, than a start time point of the corresponding left eye image frame or a start time point of the right eye image frame.
The image display apparatus of claim 1,
wherein the display module is a plasma display panel (PDP).
The image display apparatus of claim 1,
wherein the left eye image frame and the right eye image frame each include a plurality of subfields, at least one of the plurality of subfields includes a reset period, an address period, and a sustain period,
and the predetermined time is smaller than the sum of the reset period and the address period of the first subfield of the left eye image frame and the right eye image frame.
The image display apparatus of claim 3,
further comprising a subfield mapping unit configured to arrange the plurality of subfields in the left eye image frame and the right eye image frame.
The image display apparatus of claim 3,
wherein the first subfield of the plurality of subfields included in each of the left eye image frame and the right eye image frame has a reset period longer than those of the remaining subfields.
The image display apparatus of claim 1,
wherein at least one of a closing time of the left eye glass and a closing time of the right eye glass is earlier than an end time of the corresponding left eye image frame or an end time of the right eye image frame.
The image display apparatus of claim 1,
further comprising a formatter configured to generate a shutter glass driving signal such that at least one of the opening time of the left eye glass and the opening time of the right eye glass is later than a start time of the corresponding left eye image frame or a start time of the right eye image frame.
The image display apparatus of claim 7,
wherein the shutter glass driving signal includes a rising section, a holding section, and a falling section.
The image display apparatus of claim 1,
wherein the display module displays a menu for setting the predetermined time.
The image display apparatus of claim 1,
wherein the shutter glass includes a controller configured to generate a shutter glass driving signal such that at least one of the opening time of the left eye glass and the opening time of the right eye glass is later than a start time of the corresponding left eye image frame or a start time of the right eye image frame.
The image display apparatus of claim 1,
wherein the shutter glass comprises a switch configured to set the predetermined time.
The image display apparatus of claim 1,
further comprising a formatter for alternately arranging a left eye image and a right eye image of a 3D image into the left eye image frame and the right eye image frame of the unit frame.
The image display apparatus of claim 12, further comprising:
a tuner for receiving a broadcast signal corresponding to a selected broadcast channel or a pre-stored broadcast channel; and
an external device interface unit configured to receive an external signal from an external device,
wherein the 3D image is a broadcast image from the broadcast signal or an external input image from the external signal.
Rearranging an input image to generate a left eye image frame and a right eye image frame;
displaying a 3D image by driving a display module according to the generated image frames; and
opening at least one of a left eye glass and a right eye glass of a shutter glass a predetermined time after a start time of the corresponding left eye image frame or a start time of the right eye image frame.
The method of claim 14,
wherein the left eye image frame and the right eye image frame each include a plurality of subfields, at least one of the plurality of subfields includes a reset period, an address period, and a sustain period,
and the predetermined time is smaller than the sum of the reset period and the address period of the first subfield of the left eye image frame and the right eye image frame.
The method of claim 14,
further comprising generating a shutter glass driving signal such that at least one of the opening time of the left eye glass and the opening time of the right eye glass is later than a start time of the corresponding left eye image frame or a start time of the right eye image frame.
The method of claim 14,
further comprising closing at least one of the left eye glass and the right eye glass in correspondence with the end time of the left eye image frame or the end time of the right eye image frame.
The method of claim 14,
further comprising, when an input for changing the predetermined time is received, varying the predetermined time.
The method of claim 14,
further comprising: displaying a menu for setting the predetermined time; and
when an input corresponding to the menu is received, varying the predetermined time.
The method of claim 14,
further comprising: receiving a 3D image; and
when an input corresponding to the menu is received, varying the predetermined time.
KR1020100040977A 2010-04-30 2010-04-30 Apparatus for displaying image and method for operating the same KR20110121397A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100040977A KR20110121397A (en) 2010-04-30 2010-04-30 Apparatus for displaying image and method for operating the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100040977A KR20110121397A (en) 2010-04-30 2010-04-30 Apparatus for displaying image and method for operating the same

Publications (1)

Publication Number Publication Date
KR20110121397A true KR20110121397A (en) 2011-11-07

Family

ID=45392293

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100040977A KR20110121397A (en) 2010-04-30 2010-04-30 Apparatus for displaying image and method for operating the same

Country Status (1)

Country Link
KR (1) KR20110121397A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013119674A1 (en) * 2012-02-06 2013-08-15 3D Digital, Llc Apparatus, method and article for generating a three dimensional effect using filtering and stereoscopic images

Similar Documents

Publication Publication Date Title
US9143771B2 (en) Image display device and method for operating the same
KR101615973B1 (en) Apparatus for displaying image and method for operating the same
US9124884B2 (en) Image display device and method for operating the same
US9007361B2 (en) Image display apparatus and method for operating the same
KR101729524B1 (en) Apparatus for displaying image and method for operating the same
KR20120019728A (en) Apparatus for displaying image and method for operating the same
KR20110121397A (en) Apparatus for displaying image and method for operating the same
KR101702967B1 (en) Apparatus for displaying image and method for operating the same
KR101668250B1 (en) Apparatus for displaying image and method for operating the same
KR101680031B1 (en) Apparatus for displaying image and method for operating the same
KR101203582B1 (en) Apparatus for displaying image and method for operating the same
KR101243960B1 (en) Apparatus for displaying image and method for operating the same
KR20120105235A (en) Apparatus for displaying image and method for operating the same
KR101961372B1 (en) Apparatus for displaying image and method for operating the same
KR20120014485A (en) Apparatus for viewing 3d image, image display apparatus and method for operating the same
KR20120011251A (en) Multi vision system
KR20110139596A (en) Apparatus for displaying image and method for operating the same
KR20110114996A (en) Apparatus for displaying image and method for operating the same

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination