KR20120002852A - Method for operating an apparatus for displaying image - Google Patents

Method for operating an apparatus for displaying image

Info

Publication number
KR20120002852A
KR20120002852A (Application No. KR1020100063566A)
Authority
KR
South Korea
Prior art keywords
image
signal
information
user
video
Prior art date
Application number
KR1020100063566A
Other languages
Korean (ko)
Inventor
박재훈
Original Assignee
엘지전자 주식회사
Priority date
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020100063566A
Publication of KR20120002852A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/128: Adjusting depth or disparity
    • H04N 13/341: Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
    • H04N 13/359: Switching between monoscopic and stereoscopic modes
    • H04N 13/398: Synchronisation thereof; Control thereof
    • H04N 2213/005: Aspects relating to the "3D+depth" image format
    • H04N 2213/008: Aspects relating to glasses for viewing stereoscopic images

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A method of operating an image display apparatus according to an embodiment of the present invention includes receiving a 3D image signal, receiving identification information of a 3D image viewing apparatus, generating a driving signal of the 3D image viewing apparatus based on the identification information and on content information included in the 3D image signal, and transmitting the driving signal to the 3D image viewing apparatus or displaying an image differently according to the identification information.
According to the present invention, the 3D image viewing apparatus or the image display apparatus can be controlled according to the user's preference and the viewing grade of the content, thereby providing a customized image for each user.

Description

Method for operating an apparatus for displaying image

The present invention relates to an image display apparatus and an operating method thereof, and more particularly, to an image display apparatus and an operating method that allow each user to view a 3D image accurately and easily.

An image display device is a device that displays images a user can watch; through it, the user can watch broadcasts. The device displays, on its display, the broadcast the user selects from among the broadcast signals transmitted by broadcast stations. Worldwide, broadcasting is currently shifting from analog to digital.

Digital broadcasting refers to the transmission of digital video and audio signals. Because it is more resistant to external noise than analog broadcasting, digital broadcasting suffers less data loss, is better suited to error correction, and provides higher resolution and a clearer picture. Unlike analog broadcasting, digital broadcasting also supports bidirectional services.

Recently, various studies on stereoscopic images have been conducted, and stereoscopic imaging techniques are becoming increasingly common and practical, in computer graphics as well as in various other environments and technologies. In addition, research has been conducted on 3D video viewing environments optimized for each user and viewing grade.

Accordingly, an object of the present invention is to provide an image display apparatus, and an operating method thereof, that can operate a 3D image viewing apparatus according to the user or the viewing grade, or provide a customized 3D image.

According to an aspect of the present invention, a method of operating an image display apparatus includes receiving a 3D image signal, receiving identification information of a 3D image viewing apparatus, generating a driving signal of the 3D image viewing apparatus based on the identification information and on content information included in the 3D image signal, and transmitting the driving signal to the 3D image viewing apparatus.

According to another aspect of the present invention, a method of operating an image display apparatus includes receiving a 3D image signal, receiving identification information of a 3D image viewing apparatus, and displaying an image differently according to the identification information.

According to the present invention, the 3D image viewing apparatus or the image display apparatus can be controlled according to the user's preference and the viewing grade of the content, thereby providing a customized image for each user.

FIG. 1 is an internal block diagram of an image display apparatus according to an embodiment of the present invention.
FIG. 2 is an internal block diagram of the controller of FIG. 1.
FIG. 3 is a diagram illustrating the operation of a 3D image viewing apparatus according to the frame sequential format.
FIG. 4 is a diagram illustrating examples of 3D video signal formats capable of implementing 3D video.
FIG. 5 is a diagram illustrating various scaling methods for a 3D video signal according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating how the depth of a 3D image or 3D object is varied according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating how the sense of depth of an image or object is controlled according to an embodiment of the present invention.
FIGS. 8 and 9 are diagrams illustrating an image display device and a remote control device according to an embodiment of the present invention.
FIG. 10 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention.
FIGS. 11 to 13 are diagrams referred to in describing the method of operating an image display apparatus of FIG. 10.
FIG. 14 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention.
FIG. 15 is a diagram referred to in describing the method of operating an image display apparatus of FIG. 14.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffixes "module" and "unit" used for components in the following description are given merely for ease of drafting this specification and do not by themselves carry any particular meaning or role. Accordingly, "module" and "unit" may be used interchangeably.

FIGS. 1 and 2 are internal block diagrams of an image display apparatus according to an embodiment of the present invention.

Referring to FIG. 1, the image display apparatus 100 according to an exemplary embodiment of the present invention includes a tuner 110, a demodulator 120, an external device interface unit 130, a network interface unit 135, a storage unit 140, a user input interface unit 150, a controller 170, a display 180, an audio output unit 185, a power supply unit 190, and a 3D image viewing apparatus 195.

The tuner 110 selects, from among the RF (Radio Frequency) broadcast signals received through an antenna, the RF broadcast signal corresponding to the channel selected by the user, or the RF broadcast signals of all pre-stored channels, and converts the selected RF broadcast signal into an intermediate frequency signal or a baseband video or audio signal.

For example, if the selected RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF). If the selected RF broadcast signal is an analog broadcast signal, it is converted into an analog baseband image or voice signal (CVBS / SIF). That is, the tuner 110 may process a digital broadcast signal or an analog broadcast signal. The analog baseband video or audio signal CVBS / SIF output from the tuner 110 may be directly input to the controller 170.

Also, the tuner 110 can receive a single-carrier RF signal according to the ATSC (Advanced Television Systems Committee) scheme or a multi-carrier RF signal according to the DVB (Digital Video Broadcasting) scheme.

Meanwhile, the tuner 110 may sequentially select, from among the RF broadcast signals received through the antenna, the RF broadcast signals of all broadcast channels stored through a channel memory function and convert them into intermediate frequency signals or baseband video or audio signals.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

For example, when the digital IF signal output from the tuner 110 follows the ATSC scheme, the demodulator 120 performs 8-VSB (8-Vestigial Side Band) demodulation. The demodulator 120 may also perform channel decoding; to this end, it may include a trellis decoder, a de-interleaver, and a Reed-Solomon decoder, and perform trellis decoding, deinterleaving, and Reed-Solomon decoding.

For example, when the digital IF signal output from the tuner 110 follows the DVB scheme, the demodulator 120 performs COFDM (Coded Orthogonal Frequency Division Multiplexing) demodulation. The demodulator 120 may also perform channel decoding; to this end, it may include a convolutional decoder, a deinterleaver, and a Reed-Solomon decoder, and perform convolutional decoding, deinterleaving, and Reed-Solomon decoding.
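The actual interleavers used in ATSC and DVB channel coding are convolutional (Forney-type) interleavers; as a simplified illustration of the principle only (not the interleaver specified by either standard), the following sketch shows how a block interleaver spreads a burst of channel errors so that the symbol-level FEC (e.g. the Reed-Solomon decoder above) sees them as isolated errors:

```python
def block_interleave(data, rows, cols):
    """Write symbols row-by-row into a rows x cols matrix and read them
    out column-by-column, spreading any error burst across the block."""
    assert len(data) == rows * cols
    return [data[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(data, rows, cols):
    """Inverse operation: write column-by-column, read row-by-row."""
    assert len(data) == rows * cols
    return [data[c * rows + r] for r in range(rows) for c in range(cols)]
```

A burst of adjacent corrupted symbols in the interleaved stream maps back to symbols that are `rows` positions apart after deinterleaving, which is what makes them correctable by a short FEC code.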

The demodulator 120 may perform demodulation and channel decoding and then output a stream signal TS. The stream signal may be a multiplex of video, audio, and/or data signals; for example, it may be an MPEG-2 TS (Transport Stream) in which an MPEG-2 video signal, a Dolby AC-3 audio signal, and the like are multiplexed. Specifically, an MPEG-2 TS packet consists of a 4-byte header and a 184-byte payload.
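As an illustrative sketch (not part of the claimed invention), the 188-byte TS packet structure mentioned above can be parsed as follows; the field layout (sync byte 0x47, 13-bit PID, 4-bit continuity counter) follows the MPEG-2 systems standard:

```python
def parse_ts_header(packet: bytes) -> dict:
    """Parse the 4-byte header of a 188-byte MPEG-2 TS packet."""
    if len(packet) != 188 or packet[0] != 0x47:  # sync byte must be 0x47
        raise ValueError("not a valid TS packet")
    return {
        "transport_error": bool(packet[1] & 0x80),
        "payload_unit_start": bool(packet[1] & 0x40),
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],  # 13-bit packet ID
        "continuity_counter": packet[3] & 0x0F,
        "payload": packet[4:],  # the remaining 184 bytes
    }
```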

On the other hand, the demodulator 120 described above may be provided separately for the ATSC and DVB systems, that is, as an ATSC demodulator and a DVB demodulator.

The stream signal output from the demodulator 120 may be input to the controller 170. After performing demultiplexing, video/audio signal processing, and the like, the controller 170 outputs video to the display 180 and audio to the audio output unit 185.

The external device interface unit 130 may connect the external device to the image display device 100. To this end, the external device interface unit 130 may include an A / V input / output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 130 may be connected, by wire or wirelessly, to an external device such as a DVD (digital versatile disk) player, a Blu-ray player, a game device, a camera, a camcorder, or a computer (laptop). The external device interface unit 130 transmits video, audio, or data signals input from the connected external device to the controller 170 of the image display device 100, and may output video, audio, or data signals processed by the controller 170 to the connected external device.

The A/V input/output unit may include a USB terminal, a CVBS (Composite Video Blanking and Sync) terminal, a component terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like, so that video and audio signals of an external device can be input to the image display device 100.

The wireless communication unit can perform short-range wireless communication with other electronic devices. The image display device 100 may be connected to other electronic devices over a network according to communication standards such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), and ZigBee.

In addition, the external device interface unit 130 may be connected to various set-top boxes through at least one of the terminals described above to perform input/output operations with the set-top box.

The external device interface unit 130 may transmit / receive data and signals with the 3D image viewing apparatus 195 according to various communication methods such as a radio frequency (RF) communication method and an infrared (IR) communication method.

The network interface unit 135 provides an interface for connecting the image display apparatus 100 to a wired/wireless network, including the Internet. The network interface unit 135 may include an Ethernet terminal for connection to a wired network; for connection to a wireless network, communication standards such as WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access) may be used.

The network interface unit 135 may receive, over a network, content or data provided by the Internet, a content provider, or a network operator. That is, it may receive content such as movies, advertisements, games, VOD, and broadcast signals, together with related information, provided by a content provider. It may also receive firmware update information and update files provided by a network operator, and may transmit data to the Internet, a content provider, or a network operator.

In addition, the network interface unit 135 may be connected to, for example, an IPTV (Internet Protocol TV) set-top box; it may receive the video, audio, or data signals processed in the IPTV set-top box and, to enable bidirectional communication, transmit signals processed by the controller 170 to that set-top box.

Meanwhile, the above-described IPTV may mean ADSL-TV, VDSL-TV, FTTH-TV, etc., according to the type of transmission network, and includes TV over DSL, Video over DSL, TV over IP (TVIP), Broadband TV (BTV), and the like. IPTV may also mean an Internet TV capable of accessing the Internet, or a full-browsing TV.

The storage 140 may store a program for processing and controlling each signal in the controller 170, or may store a signal-processed video, audio, or data signal.

In addition, the storage unit 140 may temporarily store video, audio, or data signals input to the external device interface unit 130. The storage 140 may also store information on predetermined broadcast channels through a channel storage function such as a channel map.

The storage unit 140 may include at least one type of storage medium, such as flash memory, a hard disk, a multimedia card micro type, a card-type memory (for example, SD or XD memory), RAM, or ROM (e.g., EEPROM). The image display apparatus 100 may reproduce files (video files, still image files, music files, document files, etc.) stored in the storage 140 and provide them to the user.

FIG. 1 illustrates an embodiment in which the storage unit 140 is provided separately from the controller 170, but the scope of the present invention is not limited thereto; the storage 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the control unit 170 or a signal from the control unit 170 to the user.

For example, the user input interface unit 150 may receive a user input signal, such as power on/off, channel selection, or screen setting, from the remote controller 200 according to various communication methods such as RF (radio frequency) communication and IR (infrared) communication, or may transmit a signal from the controller 170 to the remote controller 200.

In addition, for example, the user input interface unit 150 may transmit to the controller 170 a user input signal input from a local key (not shown), such as a power key, a channel key, a volume key, or a setting key.

In addition, for example, the user input interface unit 150 may transmit to the controller 170 a user input signal input from a sensing unit (not shown) that senses a user's gesture, or may transmit a signal from the controller 170 to the sensing unit (not shown). Here, the sensing unit (not shown) may include a touch sensor, a voice sensor, a position sensor, a motion sensor, and the like.

The controller 170 may demultiplex the stream input through the tuner 110, the demodulator 120, or the external device interface unit 130, process the demultiplexed signals, and generate and output video or audio signals.

The image signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the image signal. In addition, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

The voice signal processed by the controller 170 may be sound output to the audio output unit 185. In addition, the voice signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

Although not shown in FIG. 1, the controller 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 2.

In addition, the controller 170 may control overall operations of the image display apparatus 100. For example, the controller 170 may control the tuner 110 to select the RF broadcast corresponding to a channel selected by the user or a pre-stored channel.

In addition, the controller 170 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

For example, the controller 170 controls the tuner 110 to input the signal of the channel selected according to a predetermined channel selection command received through the user input interface unit 150, and processes the video, audio, or data signals of the selected channel. The controller 170 may output the information of the channel selected by the user, together with the processed video or audio signal, through the display 180 or the audio output unit 185.

As another example, the controller 170 may, according to an external-device image playback command received through the user input interface unit 150, output the video or audio signal input through the external device interface unit 130 from an external device, for example a camera or a camcorder, through the display 180 or the audio output unit 185.

The controller 170 may control the display 180 to display an image. For example, it may control a broadcast image input through the tuner 110, an external input image input through the external device interface unit 130, an image input through the network interface unit 135, or an image stored in the storage unit 140 to be displayed on the display 180.

In this case, the image displayed on the display 180 may be a still image or a video, and may be a 2D image or a 3D image.

Meanwhile, the controller 170 may generate and display a 3D object with respect to a predetermined object in the image displayed on the display 180. For example, the object may be at least one of a connected web screen (newspaper, magazine, etc.), an EPG (Electronic Program Guide), various menus, widgets, icons, still images, videos, and text.

Such a 3D object may be processed to have a depth different from that of the image displayed on the display 180. Preferably, the 3D object is processed to appear to protrude from the image displayed on the display 180.

The controller 170 may recognize the user's position based on an image photographed by a photographing unit (not shown). For example, the distance (z-axis coordinate) between the user and the image display apparatus 100 may be determined, as well as the x-axis and y-axis coordinates on the image display apparatus 100 corresponding to the user's position.

On the other hand, although not shown in the figure, a channel browsing processor for generating a thumbnail image corresponding to a channel signal or an external input signal may further be provided. The channel browsing processor may receive the stream signal TS output from the demodulator 120, or the stream signal output from the external device interface 130, extract an image from the input stream signal, and generate a thumbnail image. The generated thumbnail image may be input to the controller 170 as it is, or after being encoded; it may also be encoded in stream form and input to the controller 170. The controller 170 may display a thumbnail list including a plurality of thumbnail images on the display 180 using the input thumbnail images. The thumbnail list may be displayed in a simple viewing manner, in a partial region while a predetermined image is displayed on the display 180, or in a full viewing manner, over most of the display 180.
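As an illustrative sketch of the thumbnail generation step (the actual channel browsing processor would decode frames from the stream; here a decoded frame is simply assumed to be available as a 2D list of pixel values), a thumbnail can be produced by nearest-neighbor subsampling:

```python
def make_thumbnail(frame, thumb_w, thumb_h):
    """Nearest-neighbor downscale of a decoded frame (a list of rows of
    pixel values) to thumb_w x thumb_h, as a channel-browsing thumbnail."""
    src_h, src_w = len(frame), len(frame[0])
    return [
        [frame[y * src_h // thumb_h][x * src_w // thumb_w]
         for x in range(thumb_w)]
        for y in range(thumb_h)
    ]
```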

The display 180 converts the video signal, data signal, and OSD signal processed by the controller 170, or the video signal, data signal, and control signal received from the external device interface unit 130, and generates a driving signal.

The display 180 may be a PDP, an LCD, an OLED, a flexible display, or the like, and in particular, according to an embodiment of the present invention, it is preferable that a 3D display is possible.

The display 180 may be divided into an additional display method and a single display method for viewing a 3D image.

The single display method implements a 3D image with the display 180 alone, without an additional display such as glasses; for example, various methods such as the lenticular method and the parallax barrier method can be applied.

On the other hand, the additional display method implements a 3D image using an additional display, that is, a 3D image viewing device, in addition to the display 180; for example, various methods such as a head-mounted display (HMD) type and a glasses type can be applied. The glasses type may be further divided into a passive method, such as polarized glasses, and an active method, such as shutter glasses. The head-mounted display type can likewise be divided into passive and active methods.

In the embodiment of the present invention, a 3D image viewing apparatus 195 is provided for viewing 3D images. The 3D image viewing apparatus 195 is an additional display of the active type among the various types described above; hereinafter, the description is based on shutter glasses.
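To illustrate the shutter-glass principle referred to above (a simplified sketch, not the claimed driving-signal protocol): in the frame sequential format the display alternates left-eye and right-eye frames, and the drive signal opens only the matching shutter for each frame, blocking the other eye:

```python
def shutter_drive_signal(num_frames, left_first=True):
    """For a frame-sequential display, yield (frame_index, open_eye):
    the glasses open the left shutter on left-eye frames and the right
    shutter on right-eye frames, in sync with the display."""
    for i in range(num_frames):
        left = (i % 2 == 0) == left_first
        yield i, "left" if left else "right"
```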

The display 180 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 185 receives a signal processed by the controller 170, for example, a stereo signal, a 3.1 channel signal, or a 5.1 channel signal, and outputs a voice signal. The audio output unit 185 may be implemented by various types of speakers.

Meanwhile, in order to detect a user's gesture, as described above, a sensing unit (not shown) including at least one of a touch sensor, a voice sensor, a position sensor, and a motion sensor may further be provided in the image display apparatus 100. The signal detected by the sensing unit (not shown) is transmitted to the controller 170 through the user input interface unit 150.

The controller 170 may detect a user's gesture from the image photographed by the photographing unit (not shown), from the signal detected by the sensing unit (not shown), or from a combination of the two.

The power supply unit 190 supplies power throughout the image display apparatus 100. In particular, power may be supplied to the controller 170, which may be implemented in the form of a System on Chip (SoC), to the display 180 for displaying images, and to the audio output unit 185 for audio output. In addition, according to an exemplary embodiment, power may be supplied to a heat generating unit including a heating wire.

The remote control apparatus 200 transmits user input to the user input interface unit 150. To this end, the remote control apparatus 200 may use IR (infrared) communication, RF (radio frequency) communication, Bluetooth, UWB (Ultra Wideband), ZigBee, and the like. In addition, the remote control apparatus 200 may receive the video, audio, or data signal output from the user input interface unit 150, and display the image or output the audio on the remote control apparatus 200.

The image display device 100 described above may be a fixed-type digital broadcast receiver capable of receiving at least one of ATSC (8-VSB) digital broadcasting, DVB-T (COFDM) digital broadcasting, and ISDB-T (BST-OFDM) digital broadcasting. As a mobile type, it may be a digital broadcast receiver capable of receiving at least one of terrestrial DMB digital broadcasting, satellite DMB digital broadcasting, ATSC-M/H digital broadcasting, DVB-H (COFDM) digital broadcasting, and MediaFLO (Media Forward Link Only) digital broadcasting. It may also be a digital broadcast receiver for cable, satellite communication, or IPTV.

Meanwhile, the image display device described in this specification may include a TV receiver, a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), and the like.

Meanwhile, the block diagram of the image display apparatus 100 shown in FIG. 1 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 as actually implemented. That is, two or more components may be combined into one, or one component may be divided into two or more, as needed. In addition, the functions performed in each block are described to explain an embodiment of the present invention; the specific operations and devices do not limit the scope of the present invention.

FIG. 2 is an internal block diagram of the controller of FIG. 1, and FIG. 3 is a diagram illustrating an operation of a shutter glass according to a frame sequential format among formats of a 3D image.

Referring to the drawings, the controller 170 according to an embodiment of the present invention may include a demultiplexer 210, an image processor 220, an OSD generator 240, a mixer 245, a frame rate converter 250, and a formatter 260. In addition, it may further include a voice processor (not shown) and a data processor (not shown).

The demultiplexer 210 demultiplexes an input stream. For example, when an MPEG-2 TS is input, it may be demultiplexed into video, audio, and data signals. The stream signal input to the demultiplexer 210 may be the stream signal output from the tuner 110, the demodulator 120, or the external device interface unit 130.
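A minimal sketch of the demultiplexing step: TS packet payloads are grouped into elementary streams by PID. The PID-to-stream mapping shown here is purely hypothetical; in a real receiver it is obtained from the PAT/PMT tables carried in the stream itself:

```python
def demultiplex(packets, pid_map):
    """Group 188-byte TS packet payloads into elementary streams by PID.
    pid_map, e.g. {0x100: "video", 0x101: "audio"}, is a hypothetical
    mapping (in practice derived from the stream's PAT/PMT tables)."""
    streams = {name: bytearray() for name in pid_map.values()}
    for pkt in packets:
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]  # 13-bit PID from the header
        if pid in pid_map:
            streams[pid_map[pid]] += pkt[4:]  # append the 184-byte payload
    return streams
```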

The image processor 220 may perform image processing of the demultiplexed image signal. To this end, the image processor 220 may include an image decoder 225 and a scaler 235.

The image decoder 225 decodes the demultiplexed video signal, and the scaler 235 scales the resolution of the decoded video signal for output on the display 180.

The video decoder 225 may include decoders of various standards.

For example, when the demultiplexed video signal is an encoded 2D video signal of MPEG-2 standard, it may be decoded by an MPEG-2 decoder.

Also, for example, when the demultiplexed video signal is an H.264 video signal according to the DMB (Digital Multimedia Broadcasting) or DVB-H scheme, a depth video signal according to MPEG-C part 3, a multi-view video signal according to MVC (Multi-view Video Coding), or a free-viewpoint video signal according to FTV (Free-viewpoint TV), it may be decoded by an H.264 decoder, an MPEG-C decoder, an MVC decoder, or an FTV decoder, respectively.

Meanwhile, the image signal decoded by the image processor 220 may be classified into a case in which only a 2D image signal is present, a case in which a 2D image signal and a 3D image signal are mixed, and a case in which only a 3D image signal is present.

The image processor 220 may detect whether the demultiplexed video signal is a 2D or a 3D video signal. Whether it is a 3D video signal may be detected based on the broadcast signal received from the tuner 110, an external input signal from an external device, or an external input signal received through a network. In particular, whether the signal is a 3D video signal may be determined by referring to a 3D video flag, 3D video metadata, or 3D video format information in the header of the stream.
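The detection logic described above can be sketched as follows. The field names ("3d_flag", "3d_format") and their values are illustrative assumptions, not taken from any real stream syntax:

```python
def is_3d_signal(header: dict) -> bool:
    """Decide 2D vs 3D from stream metadata (hypothetical field names)."""
    if header.get("3d_flag"):
        return True
    # A declared 3D format (e.g. side-by-side) also implies 3D content.
    return header.get("3d_format") in {"side_by_side", "top_down",
                                       "frame_sequential", "interlaced",
                                       "checker_box"}
```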

The image signal decoded by the image processor 220 may be a 3D image signal having various formats. For example, the image may be a 3D image signal including a color image and a depth image, or may be a 3D image signal including a plurality of view image signals. The plurality of viewpoint image signals may include, for example, a left eye image signal and a right eye image signal.

Formats of the 3D video signal include the side-by-side format, in which the left-eye video signal L and the right-eye video signal R are arranged left and right; the frame sequential format, in which they are arranged in time division; the top/down format, in which they are arranged above and below; the interlaced format, in which the left-eye and right-eye signals are mixed line by line; and the checker box format, in which they are mixed box by box.
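For illustration (a sketch, with a frame represented as a list of pixel rows), the two spatial packings above can be separated into left-eye and right-eye images as follows:

```python
def split_side_by_side(frame):
    """Split a side-by-side 3D frame into (left_eye, right_eye)
    half-width images."""
    half = len(frame[0]) // 2
    return ([row[:half] for row in frame], [row[half:] for row in frame])

def split_top_down(frame):
    """Split a top/down 3D frame into (left_eye, right_eye)
    half-height images."""
    half = len(frame) // 2
    return frame[:half], frame[half:]
```

After splitting, each half is typically rescaled to full resolution (see the scaler described earlier) before display.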

The OSD generator 240 generates an OSD signal according to a user input or on its own. For example, based on a user input signal, it may generate a signal for displaying various kinds of information as graphics or text on the screen of the display 180. The generated OSD signal may include various data such as a user interface screen of the image display apparatus 100, various menu screens, widgets, and icons. In addition, the generated OSD signal may include a 2D object or a 3D object.

The mixer 245 may mix the OSD signal generated by the OSD generator 240 and the decoded image signal processed by the image processor 220. In this case, the OSD signal and the decoded video signal may each include at least one of a 2D signal and a 3D signal. The mixed video signal is provided to the frame rate converter 250.

The frame rate converter 250 converts the frame rate of the input video. For example, a 60 Hz frame rate is converted to 120 Hz or 240 Hz. When converting a frame rate of 60 Hz to 120 Hz, the same first frame may be inserted between the first frame and the second frame, or a third frame predicted from the first frame and the second frame may be inserted. When converting a frame rate of 60 Hz to 240 Hz, three identical frames or three predicted frames may be inserted.
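The two strategies just described — repeating frames and inserting predicted frames — can be sketched as follows. Frames are flat lists of pixel values, and a simple per-pixel average stands in for real motion-compensated prediction; both simplifications are for illustration only.

```python
def to_120hz_repeat(frames):
    """60 Hz -> 120 Hz by inserting a copy of each frame after itself."""
    out = []
    for frame in frames:
        out += [frame, frame]
    return out

def interpolate_midframes(frames):
    """Insert a predicted (here: averaged) frame between each consecutive pair."""
    out = [frames[0]]
    for prev, nxt in zip(frames, frames[1:]):
        mid = [(a + b) / 2 for a, b in zip(prev, nxt)]  # crude stand-in predictor
        out += [mid, nxt]
    return out
```

On a continuous stream, inserting one predicted frame per consecutive pair approximately doubles the rate.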

The formatter 260 may receive a mixed signal from the mixer 245, that is, an OSD signal and a decoded video signal, and separate the 2D video signal and the 3D video signal.

Meanwhile, in the present specification, a 3D video signal means a signal containing a 3D object. Examples of such an object include a picture in picture (PIP) image (still image or video), an EPG indicating broadcast program information, various menus, widgets, icons, text, an object within an image, a person, a background, and a web screen (newspaper, magazine, etc.).

The formatter 260 may change the format of the 3D video signal, for example, to any one of the various formats illustrated in FIG. 4. In particular, according to an embodiment of the present invention, a change to the frame sequential format is taken as an example; that is, the left eye image signal (L) and the right eye image signal (R) are alternately arranged in sequence. Accordingly, the 3D image viewing apparatus 195 of FIG. 1 is preferably a shutter glass.

FIG. 3 illustrates the operating relationship between the shutter glass 195 and the frame sequential format. FIG. 3(A) illustrates that, when the left eye image (L) is displayed on the display 180, the left eye glass of the shutter glass 195 is opened and the right eye glass is closed. FIG. 3(B) illustrates that the left eye glass of the shutter glass 195 is closed and the right eye glass is opened. That is, the left and right eye glasses of the shutter glass 195 are opened and closed in synchronization with the image displayed on the screen.
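This open/close synchronization can be sketched as a schedule derived from the frame-sequential display order. The (left, right) state tuples are an invented representation for illustration; a real implementation would emit timed IR or RF sync pulses.

```python
def shutter_schedule(frames):
    """For each displayed frame ('L' or 'R'), return the (left glass, right
    glass) states, mirroring FIG. 3: the glass matching the displayed eye
    image is open, the other closed."""
    return [("open", "closed") if eye == "L" else ("closed", "open")
            for eye in frames]
```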

Meanwhile, the external device interface unit 130 may transmit / receive various data and signals with the shutter glass 195 according to various communication methods such as an RF (Radio Frequency) communication method and an infrared (IR) communication method.

The formatter 260 may convert a 2D video signal into a 3D video signal. For example, according to a 3D image generation algorithm, an edge or a selectable object may be detected within the 2D image signal, and the object defined by the detected edge or the selectable object may be separated out and generated as a 3D image signal. In this case, the generated 3D image signal may be separated into a left eye image signal (L) and a right eye image signal (R), as described above.

The voice processing unit (not shown) in the controller 170 may perform voice processing of the demultiplexed voice signal. To this end, the voice processing unit (not shown) may include various decoders.

For example, if the demultiplexed audio signal is an encoded audio signal, it can be decoded. Specifically, when the demultiplexed audio signal is an encoded audio signal of the MPEG-2 standard, it may be decoded by an MPEG-2 decoder. When the demultiplexed audio signal is an encoded audio signal of the MPEG-4 Bit Sliced Arithmetic Coding (BSAC) standard according to the terrestrial digital multimedia broadcasting (DMB) scheme, it may be decoded by an MPEG-4 decoder. When the demultiplexed audio signal is an encoded audio signal of the MPEG-2 Advanced Audio Codec (AAC) standard according to the satellite DMB scheme or DVB-H, it may be decoded by an AAC decoder. In addition, when the demultiplexed audio signal is an encoded audio signal of the Dolby AC-3 standard, it may be decoded by an AC-3 decoder.

Also, the voice processing unit (not shown) in the controller 170 may process bass, treble, volume control, and the like.

The data processor (not shown) in the controller 170 may perform data processing of the demultiplexed data signal. For example, when the demultiplexed data signal is an encoded data signal, it may be decoded. The encoded data signal may be EPG (Electronic Program Guide) information including broadcast information such as the start time and end time of a broadcast program broadcast on each channel. For example, the EPG information may be ATSC-PSIP (ATSC-Program and System Information Protocol) information in the ATSC scheme and DVB-SI (DVB-Service Information) in the DVB scheme. The ATSC-PSIP information or DVB-SI information may be information included in the aforementioned stream, that is, in the header (4 bytes) of the MPEG-2 TS.

Meanwhile, although FIG. 2 illustrates that the signals from the OSD generator 240 and the image processor 220 are mixed in the mixer 245 and then 3D-processed in the formatter 260, the present invention is not limited thereto, and the mixer may be located after the formatter. That is, the output of the image processor 220 may be 3D-processed by the formatter 260, the OSD generator 240 may perform 3D processing together with OSD generation, and the mixer 245 may then mix the respective processed 3D signals.

Meanwhile, a block diagram of the controller 170 shown in FIG. 2 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specification of the controller 170 that is actually implemented.

In particular, according to an exemplary embodiment, the frame rate converter 250 and the formatter 260 may not be provided in the controller 170 but may be provided separately. In some embodiments, the frame rate converter 250 may be included in the formatter 260.

FIG. 4 is a diagram illustrating examples of 3D video signal formats capable of implementing a 3D video. The 3D video signal format may be determined according to the method of arranging the left eye image and the right eye image generated to implement the 3D image.

The 3D image may be composed of a multi-view image. The user may view the multi-view image through the left eye and the right eye. The user may feel a three-dimensional effect of the 3D image through the difference of the image detected through the left eye and the right eye. According to an embodiment of the present invention, a multi-view image for implementing a 3D image includes a left eye image that can be recognized by the user through the left eye and a right eye image that can be recognized through the right eye.

As shown in FIG. 4A, a method in which the left eye image and the right eye image are arranged left and right is referred to as a side by side format. As shown in FIG. 4B, a method of disposing the left eye image and the right eye image up and down is referred to as a top / down format. As shown in FIG. 4C, a method of time-divisionally arranging a left eye image and a right eye image is called a frame sequential format. As shown in FIG. 4D, a method of mixing the left eye image and the right eye image for each line is called an interlaced format. As shown in FIG. 4E, a method of mixing the left eye image and the right eye image for each box is called a checker box format.
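Complementing the splitting sketch earlier, the interlaced and checker box arrangements of FIGS. 4D and 4E can be illustrated by mixing two equally sized eye images; frames are again plain lists of pixel rows, a stand-in for real frame buffers.

```python
def mix_interlaced(left, right):
    """Interlaced format: even rows from the left eye image, odd rows from
    the right eye image."""
    return [left[i] if i % 2 == 0 else right[i] for i in range(len(left))]

def mix_checker_box(left, right):
    """Checker box format: the parity of (x + y) selects the eye image per pixel."""
    return [[left[y][x] if (x + y) % 2 == 0 else right[y][x]
             for x in range(len(left[0]))]
            for y in range(len(left))]
```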

The image signal included in the signal input to the image display apparatus 100 from the outside may be a 3D image signal capable of realizing a 3D image. In addition, a graphic user interface image signal generated to display information related to the image display apparatus 100, or to input a command related to the image display apparatus 100, may be a 3D image signal. The formatter 260 may mix the 3D image signal included in the signal input from the outside with the graphic user interface 3D image signal and output the mixed image to the display 180.

FIG. 5 is a diagram illustrating various scaling methods of a 3D video signal according to an embodiment of the present invention, used for adjusting the size or tilt of a 3D object.

As shown in FIG. 5(a), the 3D image signal or a 3D object 510 in the 3D image signal may be enlarged or reduced as a whole (513). As shown in FIG. 5(b), the 3D object may be partially enlarged or reduced (trapezoidal shape, 516). In addition, as shown in FIG. 5(c), at least a part of the 3D object may be rotated (parallelogram shape, 519). Through such scaling or tilting, the three-dimensional effect of the 3D image signal or of a 3D object in the 3D image signal can be emphasized.

As described above, increasing the slope may mean that the length difference between the parallel sides of the trapezoidal shape 516 becomes larger, as shown in FIG. 5(b), or that the rotation angle of the parallelogram shape 519 becomes larger, as shown in FIG. 5(c).
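Under a deliberately simplified geometric model, the tilt of FIG. 5(b) can be expressed as shrinking one vertical edge of the object's corner quadrilateral, which creates the length difference between the two parallel sides. The `tilt` parameter and the clockwise corner convention are invented for this sketch.

```python
def tilt_to_trapezoid(w, h, tilt):
    """Return the four corners of a w x h object tilted into a trapezoid,
    clockwise from the top-left. `tilt` in [0, 1): 0 keeps the rectangle;
    larger values shrink the right edge symmetrically, increasing the
    length difference between the two parallel (vertical) sides."""
    inset = h * tilt / 2  # amount the right edge moves inward at top and bottom
    return [(0, 0), (w, inset), (w, h - inset), (0, h)]
```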

In this case, scaling of the 3D video signal may be performed by the formatter 260 of the controller 170 described above. The 3D video signal of FIG. 5 may be a left eye video signal, a right eye video signal, or a signal in which a left eye video signal and a right eye video signal are combined.

The formatter 260 may receive the decoded video signal, separate the 2D video signal or the 3D video signal, and separate the 3D video signal into a left eye video signal and a right eye video signal. The left eye video signal and the right eye video signal may be scaled to one or more of the various examples shown in FIG. 5 and output in a predetermined format as shown in FIG. 4. Scaling can, on the other hand, be performed before or after the output format is formed.

In addition, the formatter 260 may receive an OSD signal from the OSD generator 240, or an OSD signal mixed with the decoded video signal, separate out the 3D video signal, and divide the 3D video signal into multi-view video signals. For example, the 3D video signal may be divided into a left eye video signal and a right eye video signal, which may be scaled as shown in FIG. 5 and output in a predetermined format shown in FIG. 4.

The OSD generator 240 may directly perform the above-described scaling process with respect to the OSD output. When the OSD generator 240 directly performs scaling on the OSD, the formatter 260 does not need to perform scaling on the OSD. In this case, the OSD generator 240 not only generates the OSD signal, but also scales the OSD signal according to the depth or slope of the OSD, and further outputs the OSD signal in a suitable format. In this case, the format of the OSD signal output from the OSD generator 240 may be any one of various combination formats of a left eye image signal and a right eye image signal, or a left eye image and a right eye image, as shown in FIG. 4. In this case, the output format is the same as the output format of the formatter 260.

FIG. 6 is a view showing how the depth of a 3D image or a 3D object varies according to an embodiment of the present invention.

According to the embodiment of the present invention described above, the 3D image is composed of a multi-view image, and the multi-view image may be exemplified as a left eye image and a right eye image. FIG. 6 illustrates how the position at which the image is perceived to be formed, from the user's point of view, changes with the interval between the left eye image and the right eye image. The stereoscopic effect or perspective of an image felt by the user according to the interval, or parallax, between the left eye image and the right eye image is described with reference to FIG. 6.

In FIG. 6, a plurality of images or objects having different depths are illustrated. These objects are referred to as a first object 615, a second object 625, a third object 635, and a fourth object 645.

The first object 615 is composed of a first left eye image based on a first left eye image signal and a first right eye image based on a first right eye image signal; that is, the video signal for displaying the first object includes the first left eye image signal and the first right eye image signal. FIG. 6 illustrates at which positions of the display 180 the first left eye image and the first right eye image are displayed, as well as the interval between them. The description of the first object applies equally to the second to fourth objects. Hereinafter, for convenience of description, the left eye image and right eye image displayed on the display 180 for each object, the interval set between the two images, and the serial number of the corresponding object are described together.

The first object 615 includes a first right eye image 613 (indicated by R1 in FIG. 6) and a first left eye image 611 (indicated by L1 in FIG. 6). The interval between the first right eye image 613 and the first left eye image 611 is set to d1. The user perceives the image at the point where the extension line connecting the left eye 601 and the first left eye image 611 and the extension line connecting the right eye 603 and the first right eye image 613 cross each other. Accordingly, the user recognizes that the first object 615 is located behind the display 180. The distance between the display 180 and the first object 615 as perceived by the user may be expressed as a depth. In the present embodiment, the depth of a 3D object perceived as located behind the display 180 has a negative value (-). Therefore, the depth of the first object 615 has a negative value.

The second object 625 is composed of a second right eye image 623 (indicated by R2) and a second left eye image 621 (indicated by L2). According to the present exemplary embodiment, the second right eye image 623 and the second left eye image 621 are displayed at the same position on the display 180, so the interval between them is zero. The user perceives the image at the point where the extension line connecting the left eye 601 and the second left eye image 621 and the extension line connecting the right eye 603 and the second right eye image 623 cross each other. Thus, the user recognizes the second object 625 as if it were displayed on the display 180. In this case, the second object 625 may be referred to as a 2D object or as a 3D object. The second object 625 has the same depth as the display 180, that is, a depth of zero.

The third object 635 and the fourth object 645 are examples for explaining 3D objects that are perceived as protruding toward the user from the display 180. Furthermore, the degree of perspective or stereoscopic effect perceived by the user according to the change in the interval between the left eye image and the right eye image may be described with reference to the examples of the third object 635 and the fourth object 645.

The third object 635 includes a third right eye image 633 (indicated by R3) and a third left eye image 631 (indicated by L3). The interval between the third right eye image 633 and the third left eye image 631 is set to d3. The user perceives the image at the point where the extension line connecting the left eye 601 and the third left eye image 631 and the extension line connecting the right eye 603 and the third right eye image 633 cross each other. Accordingly, the user recognizes that the third object 635 is located in front of the display 180, that is, nearer to the user. In other words, the third object 635 is perceived by the user as protruding toward the user from the display 180. In the present embodiment, the depth of a 3D object perceived as located in front of the display 180 has a positive value (+). Thus, the depth of the third object 635 has a positive value.

The fourth object 645 is composed of a fourth right eye image 643 (indicated by R4) and a fourth left eye image 641 (indicated by L4). The interval between the fourth right eye image 643 and the fourth left eye image 641 is set to d4, where the inequality d3 < d4 holds. The user perceives the image at the point where the extension line connecting the left eye 601 and the fourth left eye image 641 and the extension line connecting the right eye 603 and the fourth right eye image 643 cross each other. Accordingly, the user recognizes that the fourth object 645 is located in front of the display 180, closer to the user than the third object 635. That is, the fourth object 645 is perceived by the user as protruding further toward the user than the display 180 and the third object 635. The depth of the fourth object 645 has a positive value.

The image display apparatus 100 may adjust the positions of the left eye image and the right eye image displayed on the display 180 so that an object composed of the two images appears to the user to be located behind or in front of the display 180. In addition, the image display apparatus 100 may adjust the display interval between the left eye image and the right eye image to adjust the depth of the object composed of them.

That is, according to the description with reference to FIG. 6, the depth of an object composed of a left eye image and a right eye image is determined to have a positive value (+) or a negative value (-) according to the left/right display positions of the two images. As described above, an object with a positive depth is perceived by the user as protruding from the display 180, and an object with a negative depth is perceived as receding behind the display 180.

Also referring to FIG. 6, it can be seen that the depth of the object, that is, the distance between the display 180 and the point at which the user perceives the 3D image to be located, varies according to the absolute value of the interval between the left eye image and the right eye image.

FIG. 7 is a diagram illustrating how the sense of depth of an image is controlled according to an embodiment of the present invention. Referring to FIG. 7, it can be seen that the depth of the same image or the same 3D object varies depending on the interval between the left eye image 701 and the right eye image 702 displayed on the display 180. In this embodiment, the depth of the display 180 is set to zero, and the depth of an image perceived as protruding from the display 180 is set to have a positive value.

The interval between the left eye image 701 and the right eye image 702 illustrated in FIG. 7A is a. The interval between the left eye image 701 and the right eye image 702 illustrated in FIG. 7B is b. Where b is greater than a. That is, in the example illustrated in FIG. 7B, the interval between the left eye image 701 and the right eye image 702 is wider than the example illustrated in FIG. 7A.

In this case, as described with reference to FIG. 6, the depth of the 3D image or 3D object illustrated in FIG. 7(b) is greater than that of the 3D image or 3D object illustrated in FIG. 7(a). When the depths in the two cases are quantified as a' and b' respectively, the relationship a' < b' holds in accordance with a < b. In other words, when the 3D image is made to protrude, the expressed depth increases or decreases as the interval between the left eye image 701 and the right eye image 702 widens or narrows.
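The sign and magnitude behavior described with reference to FIGS. 6 and 7 can be condensed into a minimal sketch: a crossed arrangement (right eye image drawn to the left of the left eye image, as for the third and fourth objects) yields a positive depth proxy, an uncrossed arrangement a negative one, and the magnitude grows with the separation. The linear proxy is an illustrative simplification, not the actual viewing geometry.

```python
def perceived_depth(left_x, right_x):
    """Signed depth proxy from the horizontal screen positions of the left
    eye and right eye images: positive when the images are crossed (object
    perceived in front of the display), negative when uncrossed (behind),
    zero when coincident (on the display plane)."""
    return left_x - right_x  # magnitude grows with |left_x - right_x|
```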

FIGS. 8 and 9 are views illustrating an image display apparatus and a remote control apparatus according to an embodiment of the present invention.

The image display apparatus 100 may be controlled by a signal transmitted from the remote control apparatus 200. The user may input commands such as power on/off, channel up/down, and volume up/down to the image display apparatus 100 using the remote control apparatus 200. The remote control apparatus 200 transmits a signal including a command corresponding to the user's manipulation to the image display apparatus 100. The image display apparatus 100 may interpret the signal received from the remote control apparatus 200 and generate a control signal accordingly, or perform an operation according to a command included in the signal.

The remote control apparatus 200 may transmit a signal to the image display apparatus 100 according to the IR communication standard. In addition, the remote control apparatus 200 may transmit a signal to the image display apparatus 100, or receive a signal transmitted by the image display apparatus 100, according to another type of wireless communication standard. There may also be a remote control apparatus 200 that detects a user's movement and transmits a signal including a corresponding command to the image display apparatus 100. In this embodiment, such a remote control apparatus 200 is described using a spatial remote control as an example. According to various embodiments of the present disclosure, in addition to the spatial remote control, a general wired/wireless mouse, an air mouse, or various other pointing means or remote controllers in various forms (ring, bracelet, thimble, etc.) may correspond to the remote control apparatus 200.

In the embodiment described with reference to FIGS. 8 and 9, the spatial remote control 201 is one type of remote control apparatus 200 capable of inputting a command to the image display apparatus 100 for remote control of the image display apparatus 100, and a perspective view of the spatial remote control 201 is shown in FIGS. 8 and 9.

In the present embodiment, the spatial remote control 201 may transmit and receive signals with the image display apparatus 100 according to an RF communication standard. As illustrated in FIG. 8, a pointer 202 corresponding to the spatial remote control 201 may be displayed on the image display apparatus 100.

The user may move or rotate the spatial remote control 201 up, down, left, and right. The pointer 202 displayed on the image display apparatus 100 corresponds to the movement of the spatial remote control 201. FIG. 9 illustrates that the pointer 202 displayed on the image display apparatus 100 moves in response to the movement of the spatial remote control 201.

In the example described with reference to FIG. 9, when the user moves the spatial remote control 201 to the left side, the pointer 202 displayed on the image display apparatus 100 also moves to the left side correspondingly. In this regard, the spatial remote control 201 may be provided with a sensor that can determine the movement. Information about the movement of the spatial remote controller 201 detected by the sensor of the spatial remote controller 201 is transmitted to the image display apparatus 100. The image display apparatus 100 may calculate the coordinates of the pointer 202 from information about the movement of the spatial remote control 201. The image display apparatus 100 may display the pointer 202 to correspond to the calculated coordinates.
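A minimal sketch of this coordinate calculation, assuming the spatial remote reports relative (dx, dy) motion deltas: the gain factor and the 1920x1080 screen bounds are invented values for illustration, and the clamping keeps the pointer on screen.

```python
def move_pointer(pos, delta, width=1920, height=1080, gain=2.0):
    """Update the pointer position from a motion delta reported by the
    spatial remote, scaling by a gain and clamping to the screen bounds."""
    x = min(max(pos[0] + delta[0] * gain, 0), width - 1)
    y = min(max(pos[1] + delta[1] * gain, 0), height - 1)
    return (x, y)
```

Moving the remote to the left (negative dx) moves the pointer left, matching the example in FIG. 9.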

As illustrated in FIGS. 8 and 9, the pointer 202 displayed on the image display apparatus 100 may move in response to the up, down, left, or right rotation of the spatial remote control 201. The moving speed or the moving direction of the pointer 202 may correspond to the moving speed or the moving direction of the spatial remote control 201.

For the above-described series of operations and functions of the spatial remote control 201, the spatial remote control 201 may include submodules such as a remote control wireless communication unit, a user input unit, a sensing unit, a remote control signal output unit, a power supply unit, a remote control information storage unit, and a remote control unit. The remote control unit of the spatial remote control generates a remote control signal by processing the information or signals detected by the user input unit and/or the sensing unit. For example, the remote control signal may be generated based on information about which part of the keypad or which button of the user input unit was pressed or touched, the duration of the pressure or touch, and the coordinates or angles by which the spatial remote control was moved or rotated, as detected through the sensing unit.

The remote control signal generated through the above process is transmitted to the image display device through the remote control wireless communication unit. More specifically, the remote control signal output through the remote control wireless communication unit is input to the remote control unit interface unit 140 of the image display device. In addition, the remote control wireless communication unit may receive a wired / wireless signal transmitted from the image display device.

The remote control information storage unit stores various types of programs and application data necessary for the control or operation of the image display apparatus or the spatial remote control. For example, when wireless communication is performed between the image display apparatus and the spatial remote control, the remote control information storage unit may store the frequency band used, so that the stored information on the frequency band can be used for subsequent communication.

In addition, the power supply unit is a module for supplying the power required to drive the spatial remote control. In one example, when the remote control unit outputs a signal commanding that the power supply be temporarily stopped or resumed according to the movement of the spatial remote control detected by the sensing unit, the power supply unit varies the power supplied according to that control signal, so that power can be saved while the spatial remote control is not used or not in operation.

As another example, a predetermined command may be input to the image display apparatus 100 in response to the movement of the spatial remote control 201. That is, even if a predetermined pressure or a touch is not detected in the user input unit, a predetermined command may be input or generated only by the movement of the spatial remote controller. For example, when the spatial remote control 201 is moved back and forth, the size of the image displayed on the image display apparatus 100 may be enlarged or reduced. Therefore, examples regarding the spatial remote controller do not limit the scope of the present invention.

FIG. 10 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention, and FIGS. 11 to 13 are drawings referred to in describing the method of operating an image display apparatus according to an embodiment of the present invention.

According to an exemplary embodiment of the present invention, a method of operating an image display apparatus includes receiving a 3D image signal (S1010), receiving identification information of a 3D image viewing apparatus (S1020), generating a driving signal for the 3D image viewing apparatus based on the identification information and content information included in the 3D image signal (S1030), and transmitting the driving signal to the 3D image viewing apparatus (S1040).

That is, by receiving the 3D image signal and the identification information of the 3D image viewing apparatus 195, a driving signal for the 3D image viewing apparatus 195 may be generated based on these, and the 3D image viewing apparatus 195 may thereby be controlled.

The content information may include at least one of audience rating information and depth information of the 3D image.

The audience rating information may be set, for example, by classifying a rating of 5 or more, a rating of 12 or more, a rating of 15 or more, or a rating of 18 or more. The type of viewing grade may be preset in the image display apparatus 100 or may be determined in consideration of the viewing grade of the received image signal. That is, the video signal may include information related to the content, such as EPG information, and the information related to the content may include viewing rating information of the corresponding content.

Meanwhile, the content information may include depth information of the 3D image, and the image display apparatus may increase or decrease the depth of the 3D image.

The identification information may include at least one of an identification name, an identification number, user information, and viewing grade information of the 3D image viewing apparatus. That is, the 3D image viewing apparatus may have a name or number distinguishing it from other 3D image viewing apparatuses, or, in the case of a 3D image viewing apparatus dedicated to an individual user, may include information about that user. Alternatively, since the 3D image viewing apparatus is often implemented in the form of glasses and commonly divided into at least adult and child types, it may be managed by dividing it into two or more grades.

Meanwhile, various environment setting values, such as viewing grade information, may be set for each of the classified 3D image viewing apparatuses, and the identification information may include such viewing grade information. In addition, when the user has a health-related concern, such as being a child, pregnant, or elderly, the identification information may include an environment setting value for limiting viewing of the 3D image or reducing the depth of the 3D image.

The 3D image viewing apparatus may be a shutter glass. In an active method such as the shutter glass type, the left eye glass part and the right eye glass part may be opened or closed based on a driving signal received from the image display apparatus. Accordingly, the left eye glass part and the right eye glass part may be opened and closed sequentially in synchronization with the 3D image, and in some cases both glass parts may be kept closed or opened.

Meanwhile, in the generating of the driving signal (S1030), the driving signal may be a signal for controlling the 3D image viewing apparatus to be powered off, or for closing both the left eye glass and the right eye glass included in the 3D image viewing apparatus.

That is, if the user is a child, a pregnant woman, or an elderly person, or if a 3D image restriction is set based on the identification information of the 3D image viewing apparatus, the 3D image viewing apparatus may be powered off so that the user cannot view the 3D image, or viewing may be completely blocked by closing both the left eye glass part and the right eye glass part.

In addition, as shown in FIG. 11, when a plurality of users individually use 3D image viewing apparatuses, a different driving signal may be generated and transmitted for each identified 3D image viewing apparatus. For example, when the displayed image corresponds to an adult viewing grade or higher, the normal driving signal is transmitted to the 3D image viewing apparatus 195A of the adult user, while a driving signal for closing both the left eye glass part and the right eye glass part is transmitted to the 3D image viewing apparatus 195B of the child user.
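The per-device signal selection described above can be sketched as follows. This is a minimal illustrative sketch only, not the disclosed implementation; the function name `drive_signal_for`, the signal strings, and the grade values are all assumptions made for illustration.

```python
# Illustrative sketch: choose a driving signal per identified pair of
# shutter glasses based on the content's viewing grade.
# All names and grade values below are hypothetical.

def drive_signal_for(content_grade: int, viewer_grade: int) -> str:
    """Return 'NORMAL' (alternate L/R shutters in sync with the display)
    when the viewer's grade admits the content, otherwise 'BLOCK'
    (keep both shutter parts closed so the wearer sees nothing)."""
    return "NORMAL" if viewer_grade >= content_grade else "BLOCK"

ADULT, CHILD = 19, 7        # hypothetical grade thresholds (e.g. by age)
content_grade = ADULT       # the displayed image is adult-rated

# device identification number -> registered viewing grade
glasses = {"195A": ADULT, "195B": CHILD}
signals = {dev: drive_signal_for(content_grade, grade)
           for dev, grade in glasses.items()}
# The adult's glasses (195A) receive the normal shutter signal, while the
# child's glasses (195B) are told to close both shutter parts.
```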

Meanwhile, the operation method of the image display apparatus according to an embodiment of the present invention may further include registering the identification information. The user may register an identification name or an identification number for each 3D image viewing apparatus, along with 3D image related setting information such as user personal information (for example, age), viewing grade information, whether 3D viewing is permitted, and 3D image depth information.

FIG. 12 illustrates a screen for registering identification information and setting information of the 3D image viewing apparatus. The registration menu window 1200 may include an identification number 1210 of the 3D image viewing apparatus, viewing grade information 1220, and 3D image depth information 1230.
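The kind of per-device record that the registration menu window 1200 might populate can be sketched as below. The field names and the registry structure are assumptions for illustration only; the patent does not specify a data model.

```python
# Illustrative sketch of a per-device registration record corresponding
# to the items in the registration menu window (1200). Field names are
# hypothetical.
from dataclasses import dataclass

@dataclass
class GlassesRegistration:
    identification_number: str   # item 1210 in the menu window
    viewing_grade: int           # item 1220 (e.g. maximum admissible rating)
    depth_scale: float           # item 1230 (1.0 = full depth, <1.0 = reduced)
    allow_3d: bool = True        # whether 3D viewing is permitted at all

registry = {}

def register(entry: GlassesRegistration) -> None:
    """Store or overwrite the settings for one pair of glasses."""
    registry[entry.identification_number] = entry

# Register a child's glasses with a low grade and reduced depth.
register(GlassesRegistration("195B", viewing_grade=7, depth_scale=0.5))
```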

Meanwhile, the user may change the settings in the registration menu window 1200 by means such as a pointer 1250 displayed in correspondence with motion information of the remote controller.

Meanwhile, the operation method of the image display apparatus according to an embodiment of the present invention may further include displaying at least one of the content information 1310 and the identification information, as shown in FIG. 13. That is, the identification information may be displayed when it is registered or changed, or at least one of the content information and the identification information may be displayed while the 3D image is displayed, so as to inform the user. The user can easily grasp the content through the content information 1310.

FIG. 14 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention, and FIG. 15 is a diagram referred to in describing the method of operating an image display apparatus according to an embodiment of the present invention.

According to an exemplary embodiment of the present invention, a method of operating an image display apparatus may include receiving a 3D image signal (S1410), receiving identification information of a 3D image viewing apparatus (S1420), and displaying an image differently according to the identification information (S1430).

The difference from the embodiment described with reference to FIGS. 10 to 13 is that the image displayed on the image display apparatus is adjusted according to the identification information of the 3D image viewing apparatus.

Meanwhile, in the image displaying step (S1430), a multi-view image constituting a 3D image may be displayed based on the 3D image signal, and in this case the 3D image may be displayed with its depth reduced.

That is, when the user is a pregnant woman, a minor, or the like, and the depth of the 3D image is set to be reduced, the depth of the 3D image 1510 of FIG. 15(a) is reduced so that, as shown in the 3D image 1520 of FIG. 15(b), the image protrudes less.

As described with reference to FIG. 9, the depth of the 3D image may be controlled by a method of adjusting the disparity interval between the left eye image and the right eye image.
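The disparity-based depth control mentioned above can be sketched in a few lines. This is purely illustrative and assumes a simple per-pixel disparity model; the function name `reduced_disparity` and the scale parameter are not part of the disclosure.

```python
# Illustrative sketch: reducing perceived depth by scaling the horizontal
# disparity between the left-eye and right-eye images toward zero. A real
# renderer would shift image columns; here we operate on a single
# disparity value for clarity.

def reduced_disparity(disparity_px: float, scale: float) -> float:
    """Scale the L/R disparity: scale=1.0 keeps the original depth,
    scale=0.0 collapses the pair to a flat, 2D-looking image."""
    return disparity_px * scale

# A point rendered with 12 px of crossed disparity "pops out" strongly;
# halving the disparity makes it protrude less, as in FIG. 15(b).
half_depth = reduced_disparity(12.0, 0.5)
flat = reduced_disparity(12.0, 0.0)
```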

Alternatively, in the image displaying step (S1430), the 3D image signal may be converted into a 2D image and displayed.

That is, when a pregnant woman, a minor, or the like watches the content and the user has set 3D viewing to be disallowed, the content may be displayed as a 2D image. The method of converting a 3D image into a 2D image for display may be implemented in various ways, such as displaying only the left eye image on the full screen or displaying only the right eye image on the full screen.
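The simplest of the conversion methods mentioned above, showing only one eye's image full-screen, can be sketched as follows. The function name `to_2d` and the frame representation are illustrative assumptions.

```python
# Illustrative sketch: converting a stereo frame pair to 2D by selecting
# only one eye's image to show on the full screen, one of the methods
# mentioned in the text.

def to_2d(left_frame, right_frame, use_left: bool = True):
    """Return the single frame to display full-screen, discarding the
    other eye's view so no stereoscopic depth is perceived."""
    return left_frame if use_left else right_frame

# With the default setting the left-eye image fills the screen.
frame = to_2d("L0", "R0")
```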

According to the present invention, the 3D image viewing apparatus is controlled by comparing information included in the 3D content or video signal with the per-device settings of the 3D image viewing apparatus registered in the 3D TV. In addition, by registering basic information for each pair of 3D glasses, a 3D service tailored to user information, such as viewing grade, can be provided.

According to the present invention, each pair of 3D glasses can be configured individually so that, depending on viewing grade or other information, the screen is altered or withheld for particular family members, preventing adverse effects on children or pregnant women.

In addition, according to the identification information of the 3D image viewing apparatus, the image displayed on the image display apparatus may be converted into a 2D image or displayed with its depth reduced.

The image display apparatus and the operation method thereof according to the present invention are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operation method of the image display apparatus of the present invention can be implemented as processor-readable code on a processor-readable recording medium included in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices that store data readable by the processor. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disk, and optical data storage devices, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet. The processor-readable recording medium can also be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

In addition, while the preferred embodiments of the present invention have been shown and described above, the present invention is not limited to the specific embodiments described; various modifications may be made by those skilled in the art to which the invention pertains without departing from the gist of the invention claimed in the claims, and such modifications should not be understood separately from the technical spirit or scope of the present invention.

100: video display device
170:
180: display
195: 3D video viewing device
200: remote control device

Claims (15)

A method of operating an image display apparatus, the method comprising:
receiving a 3D image signal;
receiving identification information of a 3D image viewing apparatus;
generating a driving signal for the 3D image viewing apparatus based on content information included in the 3D image signal and the identification information; and
transmitting the driving signal to the 3D image viewing apparatus.
The method of claim 1,
wherein the content information includes at least one of viewing grade information and depth information of a 3D image.
The method of claim 1,
wherein the identification information includes at least one of an identification name, an identification number, user information, and viewing grade information of the 3D image viewing apparatus.
The method of claim 1,
wherein the 3D image viewing apparatus is a shutter glass.
The method of claim 1,
wherein the driving signal is a signal for turning off the power of the 3D image viewing apparatus or for controlling the left eye glass part and the right eye glass part included in the 3D image viewing apparatus to be closed.
The method of claim 1,
further comprising registering the identification information.
The method of claim 1,
further comprising displaying at least one of the content information and the identification information.
A method of operating an image display apparatus, the method comprising:
receiving a 3D image signal;
receiving identification information of a 3D image viewing apparatus; and
displaying an image differently according to the identification information.
The method of claim 8,
wherein, in the displaying step, a multi-view image constituting a 3D image is displayed based on the 3D image signal.
The method of claim 9,
wherein, in the displaying step, the 3D image is displayed with its depth reduced.
The method of claim 8,
wherein, in the displaying step, the 3D image signal is converted into a 2D image and displayed.
The method of claim 8,
wherein content information included in the 3D image signal includes at least one of viewing grade information and depth information of a 3D image.
The method of claim 8,
wherein the identification information includes at least one of an identification name, an identification number, user information, and viewing grade information of the 3D image viewing apparatus.
The method of claim 8,
further comprising registering the identification information.
The method of claim 8,
further comprising displaying at least one of the content information and the identification information.
KR1020100063566A 2010-07-01 2010-07-01 Method for operating an apparatus for displaying image KR20120002852A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100063566A KR20120002852A (en) 2010-07-01 2010-07-01 Method for operating an apparatus for displaying image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100063566A KR20120002852A (en) 2010-07-01 2010-07-01 Method for operating an apparatus for displaying image

Publications (1)

Publication Number Publication Date
KR20120002852A true KR20120002852A (en) 2012-01-09

Family

Family ID: 45610054

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100063566A KR20120002852A (en) 2010-07-01 2010-07-01 Method for operating an apparatus for displaying image

Country Status (1)

Country Link
KR (1) KR20120002852A (en)

Similar Documents

Publication Publication Date Title
US8803954B2 (en) Image display device, viewing device and methods for operating the same
US8797390B2 (en) Image display device, 3D viewing device, and method for operating the same
KR101349276B1 (en) Video display device and operating method therefor
KR20110052771A (en) Image display device and operating method for the same
US20120242808A1 (en) Image display apparatus and method for operating the same
KR20110052308A (en) Apparatus for displaying image and method for operating the same
KR20110117490A (en) Method for operating an apparatus for displaying image
US20130291017A1 (en) Image display apparatus and method for operating the same
KR20110122557A (en) Apparatus for displaying image and method for operating the same
KR101708692B1 (en) Image display apparatus and method for operating the same
KR101730424B1 (en) Image display apparatus and method for operating the same
KR101730323B1 (en) Apparatus for viewing image image display apparatus and method for operating the same
KR20120062428A (en) Image display apparatus, and method for operating the same
KR20120002852A (en) Method for operating an apparatus for displaying image
KR101737367B1 (en) Image display apparatus and method for operating the same
KR20110134087A (en) Image display apparatus and method for operating the same
KR101176500B1 (en) Image display apparatus, and method for operating the same
KR20120034836A (en) Image display apparatus, and method for operating the same
KR101691801B1 (en) Multi vision system
KR101716144B1 (en) Image display apparatus, and method for operating the same
KR101730423B1 (en) Apparatus for displaying image and method for operating the same
KR20110114996A (en) Apparatus for displaying image and method for operating the same
KR20110133296A (en) Apparatus for viewing 3d image and method for operating the same
KR20110114295A (en) Apparatus for viewing 3d image and method for operating the same
KR20120011641A (en) Method for operating an apparatus for displaying image

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination