KR20120034836A - Image display apparatus, and method for operating the same - Google Patents

Image display apparatus, and method for operating the same

Info

Publication number
KR20120034836A
Authority
KR
South Korea
Prior art keywords
image
depth
signal
input
viewing
Prior art date
Application number
KR1020100096013A
Other languages
Korean (ko)
Inventor
황도청
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020100096013A
Publication of KR20120034836A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106: Processing image signals
    • H04N 13/128: Adjusting depth or disparity
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30: Image reproducers
    • H04N 13/398: Synchronisation thereof; Control thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47: End-user applications
    • H04N 21/475: End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
    • H04N 21/4753: End-user interface for inputting end-user data for user identification, e.g. by entering a PIN or password
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2213/00: Details of stereoscopic systems
    • H04N 2213/005: Aspects relating to the "3D+depth" image format

Abstract

The present invention relates to an image display apparatus and an operating method thereof. According to an embodiment of the present invention, a method of operating an image display apparatus includes receiving a 3D image and, when there is an input for varying the depth of the 3D image, varying the depth of the 3D image according to the viewing grade or age grade of the received 3D image, and displaying the 3D image with the varied depth. As a result, a 3D image suited to the viewing grade of the received 3D image can be displayed.

Description

Image display apparatus, and method for operating the same

The present invention relates to an image display apparatus and an operation method thereof, and more particularly, to an image display apparatus and an operation method capable of displaying a 3D image suitable for the viewing grade of the received 3D image.

An image display apparatus is a device having a function of displaying images that a user can watch. The user can watch broadcasts through the image display apparatus. The image display apparatus displays, on a display, a broadcast selected by the user from among broadcast signals transmitted by broadcast stations. Currently, broadcasting is shifting from analog broadcasting to digital broadcasting worldwide.

Digital broadcasting refers to broadcasting that transmits digital video and audio signals. Because digital broadcasting is more resistant to external noise than analog broadcasting, it suffers less data loss, is advantageous for error correction, offers higher resolution, and provides a clearer picture. In addition, unlike analog broadcasting, digital broadcasting supports bidirectional services.

SUMMARY OF THE INVENTION An object of the present invention is to provide an image display apparatus and an operating method thereof capable of displaying a 3D image in accordance with the viewing grade of a received 3D image.

According to an aspect of the present invention, there is provided a method of operating an image display apparatus, the method including receiving a 3D image and, when there is an input for varying the depth of the 3D image, varying the depth of the 3D image according to the viewing grade or age grade of the received 3D image, and displaying the 3D image with the varied depth.

In addition, a method of operating an image display apparatus according to an embodiment of the present invention for achieving the above object includes displaying a 3D image and, when there is an input for varying the depth of the 3D image, displaying an object requesting a login according to the viewing grade or age grade of the 3D image, and varying the depth of the 3D image according to a depth-varying input received after login authentication.
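
A minimal sketch of this login-gated flow may help; the function name, grade labels, and the PIN mechanism below are hypothetical illustrations, not details taken from the patent:

```python
# Hypothetical sketch: a depth-change request on a rated 3D image first
# requires a login (here modeled as a PIN check); the depth is varied only
# after the login authentication succeeds.

RESTRICTED_GRADES = {"15+", "19+"}  # assumed grades that require a login

def request_depth_change(viewing_grade, current_depth, requested_depth,
                         pin=None, stored_pin="0000"):
    """Return the new depth, or the unchanged depth if login fails."""
    if viewing_grade in RESTRICTED_GRADES:
        # In the apparatus this would display an object requesting a login.
        if pin != stored_pin:
            return current_depth  # authentication failed; depth unchanged
    return requested_depth  # authenticated (or no login required)

print(request_depth_change("19+", 3, 7, pin="0000"))  # authenticated
print(request_depth_change("19+", 3, 7, pin="1111"))  # rejected
```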

According to an exemplary embodiment of the present invention, an image display apparatus includes a display for displaying a 3D image, and a controller which, when there is an input for varying the depth of the 3D image, varies the depth of the 3D image according to the viewing grade or age grade of the 3D image.

According to an embodiment of the present invention, when there is an input for varying the depth of a 3D image, the depth of the 3D image is varied according to the viewing grade or age grade of the received 3D image, so that a 3D image suited to the viewing grade or age grade of the received 3D image, or to the user, can be displayed.
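
As an illustration of this grade-dependent depth variation, the following sketch clamps a requested depth to a per-grade maximum; the grade labels, the depth scale, and the specific mapping are assumptions for illustration, not values from the patent:

```python
# Illustrative sketch: limit the user's requested 3D depth to a maximum
# allowed for the content's viewing/age grade, so that content viewable by
# younger audiences keeps a shallower (less dizzying) depth.

# Assumed mapping of viewing grade -> maximum allowed depth level.
MAX_DEPTH_BY_GRADE = {"ALL": 4, "7+": 5, "12+": 6, "15+": 8, "19+": 10}

def vary_depth(requested_depth, viewing_grade):
    """Vary the 3D depth, limited by the viewing grade of the received image."""
    max_depth = MAX_DEPTH_BY_GRADE.get(viewing_grade, 10)  # default: no limit
    return min(requested_depth, max_depth)

print(vary_depth(9, "ALL"))   # clamped to the grade's maximum
print(vary_depth(9, "19+"))   # within the limit, passed through
```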

In particular, user dizziness due to excessive 3D depth can be prevented, and user convenience can be improved.

FIG. 1 is a block diagram illustrating an image display apparatus according to an exemplary embodiment of the present invention.
FIGS. 2A and 2B are internal block diagrams of a set-top box and a display device according to an embodiment of the present invention.
FIG. 3 is an internal block diagram of the controller of FIG. 1.
FIG. 4 is a diagram illustrating various formats of a 3D image.
FIG. 5 is a diagram illustrating the operation of a 3D viewing device according to the formats of FIG. 4.
FIG. 6 is a diagram illustrating various scaling methods of a 3D video signal according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating an image formed by a left-eye image and a right-eye image.
FIG. 8 is a diagram illustrating the depth of a 3D image according to the distance between a left-eye image and a right-eye image.
FIG. 9 is a diagram illustrating a 3D viewing device and an image display apparatus according to an exemplary embodiment of the present invention.
FIG. 10 is a block diagram illustrating the 3D viewing device and the image display apparatus of FIG. 9.
FIG. 11 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention.
FIGS. 12 to 13B are diagrams referred to in describing various examples of the operating method of the image display apparatus of FIG. 11.
FIG. 14 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention.
FIGS. 15 to 16D are diagrams referred to in describing various examples of the operating method of the image display apparatus of FIG. 14.
FIG. 17 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention.
FIGS. 18A to 18D are diagrams referred to in describing various examples of the operating method of the image display apparatus of FIG. 17.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffixes "module" and "unit" for the components used in the following description are given merely for ease of preparing this specification and do not carry any particular meaning or role by themselves. Accordingly, "module" and "unit" may be used interchangeably.

FIG. 1 is a block diagram illustrating an image display apparatus according to an exemplary embodiment of the present invention.

Referring to FIG. 1, the image display apparatus 100 according to an exemplary embodiment of the present invention includes a tuner unit 110, a demodulator 120, an external device interface unit 130, a network interface unit 135, a storage unit 140, a user input interface unit 150, a sensor unit (not shown), a controller 170, a display 180, an audio output unit 185, and a 3D viewing device 195.

The tuner unit 110 selects, from among radio frequency (RF) broadcast signals received through an antenna, an RF broadcast signal corresponding to a channel selected by the user or to all pre-stored channels, and converts the selected RF broadcast signal into an intermediate-frequency signal or a baseband video or audio signal.

For example, if the selected RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF); if it is an analog broadcast signal, it is converted into an analog baseband video or audio signal (CVBS/SIF). That is, the tuner unit 110 can process both digital and analog broadcast signals. The analog baseband video or audio signal (CVBS/SIF) output from the tuner unit 110 may be input directly to the controller 170.

In addition, the tuner unit 110 may receive a single-carrier RF broadcast signal according to the Advanced Television System Committee (ATSC) scheme or a multi-carrier RF broadcast signal according to the Digital Video Broadcasting (DVB) scheme.

Meanwhile, the tuner unit 110 may sequentially select the RF broadcast signals of all broadcast channels stored through a channel memory function from among the RF broadcast signals received through the antenna, and convert them into intermediate-frequency signals or baseband video or audio signals.

On the other hand, the tuner unit 110 may include a plurality of tuners in order to receive broadcast signals of a plurality of channels, or a single tuner that receives broadcast signals of a plurality of channels simultaneously.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

For example, when the digital IF signal output from the tuner unit 110 conforms to the ATSC scheme, the demodulator 120 performs 8-VSB (8-Vestigial Side Band) demodulation. The demodulator 120 may also perform channel decoding. To this end, the demodulator 120 may include a Trellis decoder, a de-interleaver, and a Reed-Solomon decoder to perform Trellis decoding, de-interleaving, and Reed-Solomon decoding.

For example, when the digital IF signal output from the tuner unit 110 conforms to the DVB scheme, the demodulator 120 performs Coded Orthogonal Frequency Division Modulation (COFDM) demodulation. The demodulator 120 may also perform channel decoding. To this end, the demodulator 120 may include a convolution decoder, a de-interleaver, and a Reed-Solomon decoder to perform convolutional decoding, de-interleaving, and Reed-Solomon decoding.

The demodulator 120 may perform demodulation and channel decoding and then output a stream signal (TS). In this case, the stream signal may be a signal in which a video signal, an audio signal, or a data signal is multiplexed. For example, the stream signal may be an MPEG-2 Transport Stream (TS) in which an MPEG-2 standard video signal, a Dolby AC-3 standard audio signal, and the like are multiplexed. Specifically, the MPEG-2 TS may consist of a 4-byte header and a 184-byte payload.
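
The 188-byte packet structure mentioned above (a 4-byte header followed by a 184-byte payload) can be illustrated with a small parser; the field positions follow the MPEG-2 systems specification (ISO/IEC 13818-1), and the example packet bytes are made up for illustration:

```python
# Parse the 4-byte header of a 188-byte MPEG-2 transport stream packet.
# Layout: sync byte 0x47, then flags + 13-bit PID, then continuity counter.

def parse_ts_header(packet: bytes):
    """Return the header fields and 184-byte payload of one TS packet."""
    assert len(packet) == 188 and packet[0] == 0x47  # 0x47 is the sync byte
    return {
        "transport_error": bool(packet[1] & 0x80),
        "payload_unit_start": bool(packet[1] & 0x40),
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],  # 13-bit packet ID
        "continuity_counter": packet[3] & 0x0F,
        "payload": packet[4:],  # the remaining 184 bytes
    }

pkt = bytes([0x47, 0x40, 0x11, 0x17]) + bytes(184)  # synthetic packet
hdr = parse_ts_header(pkt)
print(hdr["pid"])  # 0x011 -> 17
```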

On the other hand, the demodulator 120 described above can be provided separately according to the ATSC system and the DVB system. That is, it can be provided as an ATSC demodulation unit and a DVB demodulation unit.

The stream signal output from the demodulator 120 may be input to the controller 170. The controller 170 performs demultiplexing, video/audio signal processing, and the like, and then outputs video to the display 180 and audio to the audio output unit 185.

The external device interface unit 130 may transmit or receive data with the connected external device 190. To this end, the external device interface unit 130 may include an A / V input / output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 130 may be connected, by wire or wirelessly, to an external device 190 such as a digital versatile disc (DVD) player, a Blu-ray player, a game device, a camera, a camcorder, or a (laptop) computer. The external device interface unit 130 transfers a video, audio, or data signal input from the outside through the connected external device 190 to the controller 170 of the image display apparatus 100, and may output a video, audio, or data signal processed by the controller 170 to the connected external device. To this end, the external device interface unit 130 may include an A/V input/output unit (not shown) or a wireless communication unit (not shown).

The A/V input/output unit may include a USB terminal, a Composite Video Banking Sync (CVBS) terminal, a component terminal, an S-video terminal (analog), a Digital Visual Interface (DVI) terminal, a High Definition Multimedia Interface (HDMI) terminal, an RGB terminal, a D-SUB terminal, and the like, so that video and audio signals of an external device can be input to the image display apparatus 100.

The wireless communication unit can perform short-range wireless communication with other electronic devices. The image display apparatus 100 may be networked with other electronic devices according to communication standards such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Digital Living Network Alliance (DLNA).

In addition, the external device interface unit 130 may be connected through at least one of the various set top boxes and the various terminals described above to perform input / output operations with the set top box.

The external device interface unit 130 may transmit / receive data with the 3D viewing device 195.

The network interface unit 135 provides an interface for connecting the image display apparatus 100 to a wired/wireless network including the Internet. The network interface unit 135 may include an Ethernet terminal for connection to a wired network, and may use communication standards such as Wireless LAN (WLAN, Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), and High Speed Downlink Packet Access (HSDPA) for connection to a wireless network.

The network interface unit 135 may receive content or data provided by an Internet or content provider or a network operator through a network. That is, content such as movies, advertisements, games, VOD, and broadcast signals, and related information provided by an Internet or content provider, may be received through the network. In addition, firmware update information and update files provided by the network operator may be received, and data may be transmitted to the Internet or content provider or to the network operator.

In addition, the network interface unit 135 may be connected to, for example, an Internet Protocol (IP) TV, may receive the video, audio, or data signals processed in a set-top box for IPTV, and may transmit the signals processed by the controller 170 to the set-top box for IPTV, to enable bidirectional communication.

Meanwhile, the above-described IPTV may mean ADSL-TV, VDSL-TV, FTTH-TV, or the like, depending on the type of transmission network, and may include TV over DSL, Video over DSL, TV over IP (TVIP), Broadband TV (BTV), and the like. In addition, IPTV may also mean an Internet TV or a full-browsing TV capable of accessing the Internet.

The storage 140 may store a program for processing and controlling each signal in the controller 170, or may store a signal-processed video, audio, or data signal.

In addition, the storage unit 140 may temporarily store a video, audio, or data signal input to the external device interface unit 130, and may store information on predetermined broadcast channels through a channel memory function such as a channel map.

The storage unit 140 may include at least one type of storage medium such as a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), RAM, or ROM (EEPROM, etc.). The image display apparatus 100 may reproduce files (video files, still image files, music files, document files, etc.) stored in the storage unit 140 and provide them to the user.

FIG. 1 illustrates an embodiment in which the storage unit 140 is provided separately from the controller 170, but the scope of the present invention is not limited thereto. The storage unit 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the control unit 170 or a signal from the control unit 170 to the user.

For example, the user input interface unit 150 may receive user input signals such as power on/off, channel selection, and screen settings from the remote control device 200 according to various communication schemes such as radio frequency (RF) communication and infrared (IR) communication, or may transmit signals from the controller 170 to the remote control device 200.

In addition, for example, the user input interface unit 150 may transmit a user input signal input from a local key (not shown), such as a power key, a channel key, a volume key, or a setting key, to the controller 170.

The sensor unit (not shown) may detect the location of the user or the location of the user's gesture, touch, or 3D viewing device 195. To this end, the sensor unit (not shown) may include a touch sensor, a voice sensor, a position sensor, an operation sensor, a gyro sensor, and the like.

The detected user's location, user's gesture, touch, or location signal of the 3D viewing device 195 may be input to the controller 170. Alternatively, unlike the drawing, it may be input to the controller 170 through the user input interface unit 150.

The controller 170 may demultiplex the stream input through the tuner unit 110, the demodulator 120, or the external device interface unit 130, or process the demultiplexed signals, to generate and output signals for video or audio output.

The image signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the image signal. In addition, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

The audio signal processed by the controller 170 may be output as sound through the audio output unit 185. In addition, the audio signal processed by the controller 170 may be input to an external output device through the external device interface unit 130.

Although not shown in FIG. 1, the controller 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 3.

In addition, the controller 170 may control the overall operation of the image display apparatus 100. For example, the controller 170 may control the tuner unit 110 to select the RF broadcast corresponding to a channel selected by the user or a pre-stored channel.

In addition, the controller 170 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

For example, the controller 170 controls the tuner unit 110 so that the signal of a channel selected according to a predetermined channel selection command received through the user input interface unit 150 is input, and processes the video, audio, or data signal of the selected channel. The controller 170 may output the channel information selected by the user together with the processed video or audio signal through the display 180 or the audio output unit 185.

As another example, the controller 170 may, according to an external device image playback command received through the user input interface unit 150, output a video or audio signal from an external device 190, e.g., a camera or camcorder, input through the external device interface unit 130, through the display 180 or the audio output unit 185.

The controller 170 may control the display 180 to display an image, for example, a broadcast image input through the tuner unit 110, an external input image input through the external device interface unit 130, an image input through the network interface unit 135, or an image stored in the storage unit 140.

In this case, the image displayed on the display 180 may be a still image or a video, and may be a 2D image or a 3D image.

Meanwhile, the controller 170 may generate and display a 3D object with respect to a predetermined object in the image displayed on the display 180. For example, the object may be at least one of a connected web screen (newspaper, magazine, etc.), an EPG (Electronic Program Guide), various menus, widgets, icons, still images, videos, and text.

Such a 3D object may be processed to have a depth different from that of the image displayed on the display 180. Preferably, the 3D object may be processed to appear to protrude from the image displayed on the display 180.

The controller 170 may recognize the user's position based on an image captured by a photographing unit (not shown). For example, the distance (z-axis coordinate) between the user and the image display apparatus 100 may be determined, as well as the x-axis and y-axis coordinates on the display 180 corresponding to the user's position.

Meanwhile, the controller 170 may perform signal processing to enable viewing of a corresponding video according to the viewing device.

For example, when the sensor unit (not shown) or the photographing unit (not shown) detects the presence, operation, or number of 3D viewing devices 195, the controller 170 may perform signal processing for pairing with the 3D viewing device 195. That is, the controller 170 may control a pairing signal to be output to the 3D viewing device 195 and a response signal to be received from the 3D viewing device 195.

The controller 170 may control the tuner unit 110 to receive broadcast images according to the number of detected viewing devices 195. For example, if three viewing devices are detected, the tuner unit 110 including a plurality of tuners may be controlled to receive broadcast images of different channels. In addition, the controller 170 may control each broadcast image to be displayed at a different time, in synchronization with the corresponding viewing device.

The controller 170 may also control external input images to be received according to the number of detected viewing devices. For example, when three viewing devices are detected, the controller 170 may control a broadcast image, an external input image from an optical device such as a DVD player, and an external input image from a PC to be received. In addition, the controller 170 may control each image (broadcast image, DVD image, PC image) to be displayed at a different time, in synchronization with the corresponding viewing device.

Meanwhile, whenever the number of viewing devices detected during image display increases, the controller 170 may increase the vertical synchronization frequency (Vsync) of the displayed images and control the corresponding images to be displayed. For example, within each 1/60 second, the first image and the second image may be displayed in synchronization with the first and second viewing devices, respectively; when a third viewing device is added, the first to third images may be displayed within each 1/60 second, in synchronization with the first to third viewing devices, respectively. That is, the first and second images may be displayed at 120 Hz, and the first to third images at 180 Hz.
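
The refresh-rate scaling described above reduces to multiplying a 60 Hz base rate by the number of detected viewing devices, as in this sketch (the function name is illustrative, not from the patent):

```python
# Each detected viewing device gets its own image within a 1/60 s window,
# so the panel's vertical synchronization frequency grows as 60 Hz times
# the number of devices (120 Hz for two devices, 180 Hz for three).

def vsync_frequency(num_viewing_devices, base_hz=60):
    """Vertical sync frequency needed to time-multiplex one image per device."""
    return base_hz * max(1, num_viewing_devices)

print(vsync_frequency(2))  # 120
print(vsync_frequency(3))  # 180
```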

On the other hand, the controller 170 may set different viewing targets, for example, channel search targets of broadcast images, for each viewing device. For example, the channel search target may be set differently for different ages, such as adults and children, so that the search results differ when channels are searched. Classification by taste, gender, recently viewed channels, or program grade is also possible.

On the other hand, when the first viewing device and the second viewing device select the same image, the controller 170 may control a message indicating the duplication to be provided. Such a message may be displayed in the form of an object on the display 180 or transmitted as a wireless signal to each viewing device.

On the other hand, although not shown in the figure, a channel browsing processor for generating thumbnail images corresponding to channel signals or external input signals may further be provided. The channel browsing processor may receive the stream signal (TS) output from the demodulator 120 or the stream signal output from the external device interface unit 130, extract images from the input stream signal, and generate thumbnail images. The generated thumbnail images may be input to the controller 170 as they are or after being encoded, and may also be encoded in stream form and input to the controller 170. The controller 170 may display a thumbnail list including a plurality of thumbnail images on the display 180 using the input thumbnail images. The thumbnail list may be displayed in a simple-viewing manner, in a partial region of the display 180 while a predetermined image is displayed, or in a full-viewing manner, over most of the display 180. The thumbnail images in the thumbnail list may be updated sequentially.

The display 180 converts a video signal, a data signal, an OSD signal, or a control signal processed by the controller 170, or a video signal, a data signal, or a control signal received from the external device interface unit 130, to generate a driving signal.

The display 180 may be a PDP, an LCD, an OLED display, a flexible display, or a 3D display. For viewing 3D images, the display 180 may be classified into an additional-display type and a single-display type.

The single-display type implements 3D images on the display 180 alone, without an additional display such as glasses; for example, various methods such as a lenticular method and a parallax barrier method can be applied.

Meanwhile, the additional-display type implements 3D images using an additional display, namely the 3D viewing device 195, in addition to the display 180; for example, various methods such as a head-mounted display (HMD) type and a glasses type can be applied.

On the other hand, the glasses type can be divided into a passive type, such as polarized glasses, and an active type, such as shutter glasses. The head-mounted display type can likewise be divided into passive and active types.

The 3D viewing device 195 may be 3D glasses capable of viewing stereoscopic images. The 3D glasses 195 may include passive polarized glasses or active shutter glasses, and are described here as a concept that also includes the aforementioned head-mounted type.

The display 180 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 185 receives a signal processed by the controller 170, for example, a stereo signal, a 3.1-channel signal, or a 5.1-channel signal, and outputs it as sound. The audio output unit 185 may be implemented as various types of speakers.

The photographing unit (not shown) photographs the user. It may be implemented with one camera, but is not limited thereto and may be implemented with a plurality of cameras. The photographing unit may be disposed above the display 180. Image information photographed by the photographing unit is input to the controller 170.

The controller 170 may detect a user's gesture from the image captured by the photographing unit (not shown) and the signal detected by the sensor unit (not shown), individually or in combination.

The remote control device 200 transmits user input to the user input interface unit 150. To this end, the remote control device 200 may use Bluetooth, radio frequency (RF) communication, infrared (IR) communication, Ultra Wideband (UWB), ZigBee, or the like. In addition, the remote control device 200 may receive a video, audio, or data signal output from the user input interface unit 150, and display it or output it as sound on the remote control device 200.

The image display apparatus 100 described above may be a fixed digital broadcast receiver capable of receiving at least one of ATSC (8-VSB) digital broadcasting, DVB-T (COFDM) digital broadcasting, and ISDB-T (BST-OFDM) digital broadcasting. It may also be a mobile digital broadcast receiver capable of receiving at least one of terrestrial DMB digital broadcasting, satellite DMB digital broadcasting, ATSC-M/H digital broadcasting, DVB-H (COFDM) digital broadcasting, and Media Forward Link Only digital broadcasting. It may also be a digital broadcast receiver for cable, satellite communication, or IPTV.

On the other hand, the image display apparatus described in this specification may include a TV receiver, a projector, a mobile phone, a smartphone, a notebook computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), and the like.

Meanwhile, the block diagram of the image display apparatus 100 shown in FIG. 1 is a block diagram for an embodiment of the present invention. The components of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 as actually implemented. That is, two or more components may be combined into one, or one component may be divided into two or more, as necessary. The functions performed by each block are intended to illustrate embodiments of the present invention, and the specific operations and devices do not limit the scope of the present invention.

On the other hand, unlike in FIG. 1, the image display apparatus 100 may not include the tuner unit 110 and the demodulator 120 shown in FIG. 1, and may instead receive and play back image content through the network interface unit 135 or the external device interface unit 130.

The image display apparatus 100 is an example of an image signal processing apparatus that performs signal processing on an image stored in the apparatus or on an input image. Other examples of the image signal processing apparatus include a set-top box excluding the display 180 and the audio output unit 185 shown in FIG. 1, the above-described DVD player, Blu-ray player, game device, and computer. Of these, the set-top box will be described with reference to FIGS. 2A and 2B below.

FIGS. 2A and 2B are internal block diagrams of a set-top box and a display apparatus according to an embodiment of the present invention.

First, referring to FIG. 2A, the set-top box 250 and the display apparatus 300 may transmit or receive data by wire or wirelessly. The following description focuses on the differences from FIG. 1.

The set top box 250 may include a network interface unit 255, a storage unit 258, a signal processor 260, a user input interface unit 263, and an external device interface unit 265.

The network interface unit 255 provides an interface for connection to a wired/wireless network including the Internet. Through the connected network, or another network linked to it, data may be transmitted to or received from other users or other electronic devices.

The storage unit 258 may store programs for signal processing and control in the signal processor 260, and may also serve as temporary storage for video, audio, or data signals input from the external device interface unit 265 or the network interface unit 255.

The signal processor 260 performs signal processing on an input signal. For example, it may demultiplex or decode an input video signal, and may demultiplex or decode an input audio signal. To this end, a video decoder or an audio decoder may be provided. The processed video or audio signal may be transmitted to the display apparatus 300 through the external device interface unit 265.

The user input interface unit 263 transmits a signal input by the user to the signal processor 260, or transmits a signal from the signal processor 260 to the user. For example, various control signals such as power on/off, operation input, and setting input, received through a local key (not shown) or the remote control apparatus 200, may be transmitted to the signal processor 260.

The external device interface unit 265 provides an interface for data transmission or reception with an external device connected by wire or wirelessly. In particular, an interface for transmitting or receiving data with the display apparatus 300 is provided. In addition, it is possible to provide an interface for transmitting or receiving data with an external device such as a game device, a camera, a camcorder, a computer (laptop), or the like.

The set-top box 250 may further include a media input unit (not shown) for playing separate media. An example of such a media input unit is a Blu-ray input unit (not shown); that is, the set-top box 250 may be equipped with a Blu-ray player or the like. Input media such as a Blu-ray disc may be signal-processed (demultiplexed, decoded, etc.) in the signal processor 260 and then transmitted to the display apparatus 300 through the external device interface unit 265 for display.

The display apparatus 300 may include a tuner unit 270, an external device interface unit 273, a demodulator 275, a storage unit 278, a controller 280, a user input interface unit 283, a display 290, and an audio output unit 295.

The tuner 270, demodulator 275, storage 278, controller 280, user input interface 283, display 290, and audio output unit 295 correspond, respectively, to the tuner 110, demodulator 120, storage 140, controller 170, user input interface 150, display 180, and audio output unit 185 described with reference to FIG. 1, so their description is omitted.

The external device interface unit 273 provides an interface for data transmission or reception with an external device connected by wire or wirelessly. In particular, it provides an interface for transmitting or receiving data with the set-top box 250.

Accordingly, a video or audio signal input through the set-top box 250 is output through the display 290 or the audio output unit 295 via the controller 280.

Next, referring to FIG. 2B, the set-top box 250 and the display apparatus 300 are the same as those of FIG. 2A, except that the tuner 270 and the demodulator 275 are located in the set-top box 250 rather than in the display apparatus 300. Only the differences are described below.

The signal processor 260 may perform signal processing of a broadcast signal received through the tuner 270 and the demodulator 275. In addition, the user input interface unit 263 may receive an input such as channel selection and channel storage.

Meanwhile, although the audio output unit 185 of FIG. 1 is not illustrated in the set-top box 250 of FIGS. 2A and 2B, the set-top box may also have its own audio output unit.

FIG. 3 is an internal block diagram of the controller of FIG. 1, FIG. 4 is a diagram illustrating various formats of a 3D image, and FIG. 5 is a diagram illustrating the operation of a 3D viewing apparatus according to the formats of FIG. 4.

Referring to the drawings, the controller 170 according to an embodiment of the present invention may include a demultiplexer 310, an image processor 320, an OSD generator 340, a mixer 345, a frame rate converter 350, and a formatter 360. It may further include an audio processor (not shown) and a data processor (not shown).

The demultiplexer 310 demultiplexes an input stream. For example, when an MPEG-2 TS is input, it may be demultiplexed and separated into video, audio, and data signals, respectively. Here, the stream signal input to the demultiplexer 310 may be a stream signal output from the tuner 110, the demodulator 120, or the external device interface 130.

The image processor 320 may perform image processing of the demultiplexed image signal. To this end, the image processor 320 may include an image decoder 225 and a scaler 235.

The image decoder 225 decodes the demultiplexed image signal, and the scaler 235 performs scaling to output the resolution of the decoded image signal on the display 180.

The video decoder 225 may include decoders of various standards. For example, the video decoder 225 may include at least one of an MPEG-2 decoder, an H.264 decoder, an MPEG-C decoder (MPEG-C Part 3), an MVC decoder, and an FTV decoder.

Meanwhile, the image signal decoded by the image processor 320 may be classified into a case in which only a 2D image signal is present, a case in which a 2D image signal and a 3D image signal are mixed, and a case in which only a 3D image signal is present.

For example, when the external video signal input from the external device 190 or the broadcast video signal received by the tuner 110 contains only a 2D video signal, a mixture of 2D and 3D video signals, or only a 3D video signal, the signal may be processed by the controller 170, in particular the image processor 320, so that a 2D video signal, a mixed 2D/3D video signal, or a 3D video signal is output accordingly.

The image signal decoded by the image processor 320 may be a 3D image signal having various formats. For example, the image may be a 3D image signal including a color image and a depth image, or may be a 3D image signal including a plurality of view image signals. The plurality of viewpoint image signals may include, for example, a left eye image signal and a right eye image signal.

Here, as shown in FIG. 4, the format of the 3D video signal may be a side-by-side format (FIG. 4A) in which the left eye image signal L and the right eye image signal R are arranged left and right, a top/down format (FIG. 4B) in which they are arranged up and down, a frame sequential format (FIG. 4C) in which they are arranged by time division, an interlaced format (FIG. 4D) in which the left eye and right eye image signals are mixed line by line, or a checker box format (FIG. 4E) in which the left eye and right eye image signals are mixed box by box.
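As an illustrative sketch (the function names and the row-list frame representation are assumptions, not part of the disclosure), undoing the side-by-side and top/down packings amounts to slicing each decoded frame:

```python
def split_side_by_side(frame):
    """Split a side-by-side 3D frame (list of pixel rows) into the
    left-eye and right-eye views (FIG. 4A layout)."""
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]   # L occupies the left half
    right = [row[half:] for row in frame]  # R occupies the right half
    return left, right

def split_top_down(frame):
    """Split a top/down 3D frame (FIG. 4B layout) into the two views."""
    half = len(frame) // 2
    return frame[:half], frame[half:]
```

The other formats differ only in how the two views are interleaved (per line, per box, or per frame in time), not in this basic split-and-route idea.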

The OSD generator 340 generates an OSD signal according to a user input or by itself. For example, based on a user input signal, it may generate a signal for displaying various kinds of information as graphics or text on the screen of the display 180. The generated OSD signal may include various data such as a user interface screen, menu screens, widgets, and icons of the image display apparatus 100, and may include a 2D object or a 3D object.

The mixer 345 may mix the OSD signal generated by the OSD generator 340 and the decoded image signal processed by the image processor 320. In this case, the OSD signal and the decoded video signal may each include at least one of a 2D signal and a 3D signal. The mixed video signal is provided to the frame rate converter 350.

The frame rate converter (FRC) 350 converts the frame rate of the input video. For example, a frame rate of 60 Hz may be converted to 120 Hz, 240 Hz, or 480 Hz. When converting 60 Hz to 120 Hz, the same first frame may be inserted between the first frame and the second frame, or a third frame predicted from the first and second frames may be inserted. When converting 60 Hz to 240 Hz, three identical frames or three predicted frames may be inserted. When converting 60 Hz to 480 Hz, seven identical frames or seven predicted frames may be inserted.
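The same-frame insertion described above can be sketched as follows; the list-of-frames model and function name are illustrative assumptions, and a real FRC would typically insert motion-predicted frames rather than plain copies:

```python
def convert_frame_rate(frames, src_hz, dst_hz):
    """Repeat each frame dst_hz/src_hz times, e.g. 60 Hz -> 240 Hz
    inserts three extra copies after each original frame."""
    if dst_hz % src_hz != 0:
        raise ValueError("this sketch only handles integer rate multiples")
    factor = dst_hz // src_hz
    out = []
    for f in frames:
        out.extend([f] * factor)  # same-frame insertion; a predicted-frame
                                  # variant would synthesize new frames here
    return out
```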

The frame rate converter 350 may also output the input frame rate as it is, without conversion. Preferably, when a 2D video signal is input, the frame rate is output as it is, while when a 3D video signal is input, the frame rate is varied as described above.

The formatter 360 may arrange the left eye image frames and right eye image frames of the frame-rate-converted 3D image. It may also output the synchronization signal Vsync for opening the left eye glass and the right eye glass of the 3D viewing apparatus 195.

Meanwhile, the formatter 360 may receive a mixed signal from the mixer 345, that is, an OSD signal and a decoded video signal, and separate the 2D video signal and the 3D video signal.

Meanwhile, in this specification, a 3D video signal is meant to include a 3D object. Examples of such objects include a picture-in-picture (PIP) image (still image or video), an EPG indicating broadcast program information, various menus, widgets, icons, text, objects within an image, a person, a background, and a web screen (newspaper, magazine, etc.).

The formatter 360 may change the format of the 3D video signal. For example, it may be changed to any one of various formats illustrated in FIG. 4. Accordingly, according to the format, as shown in FIG. 5, the operation of the glasses-type 3D viewing apparatus may be performed.

First, FIG. 5A illustrates the operation of the 3D glasses 195, in particular the shutter glasses 195, when the formatter 360 arranges and outputs the frame sequential format among the formats of FIG. 4.

That is, when the left eye image L is displayed on the display 180, the left eye glass of the shutter glasses 195 is opened and the right eye glass is closed; when the right eye image R is displayed, the left eye glass is closed and the right eye glass is opened.
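This alternating rule can be written as a small state function; the frame-index convention (even frames carry the left-eye image) and the function name are assumptions for illustration:

```python
def shutter_state(frame_index, is_3d):
    """Return (left_open, right_open) for a frame-sequential stream.
    Even frames are assumed to carry the left-eye image, odd frames
    the right-eye image; for 2D content both glasses stay open."""
    if not is_3d:
        return True, True
    left_frame = (frame_index % 2 == 0)
    return left_frame, not left_frame
```

In the apparatus itself, the synchronization signal Vsync plays the role of the frame index, telling the glasses which view is currently on screen.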

FIG. 5B illustrates the operation of the 3D glasses 195, in particular polarized glasses 195, when the formatter 360 arranges and outputs the side-by-side format among the formats of FIG. 4. Meanwhile, the 3D glasses 195 applied in FIG. 5B may be shutter glasses; shutter glasses can operate like polarized glasses by keeping both the left eye glass and the right eye glass open.

Meanwhile, the formatter 360 may convert a 2D video signal into a 3D video signal. For example, according to a 3D image generation algorithm, an edge or a selectable object may be detected within the 2D image signal, and the object defined by the detected edge may be separated and generated as a 3D image signal. In this case, the generated 3D image signal may be divided into a left eye image signal L and a right eye image signal R and aligned, as described above.

Although not shown in the figure, a 3D processor (not shown) for 3D-effect signal processing may be further disposed after the formatter 360. The 3D processor (not shown) may adjust the brightness, tint, and color of the image signal to improve the 3D effect. For example, signal processing may be performed to sharpen near objects and blur far ones. The functions of the 3D processor may also be merged into the formatter 360 or into the image processor 320. This will be described later with reference to FIG. 6 and the like.

Meanwhile, the audio processor (not shown) in the controller 170 may perform audio processing on the demultiplexed audio signal. To this end, the audio processor (not shown) may include various decoders.

For example, if the demultiplexed audio signal is an encoded audio signal, it may be decoded. Specifically, an audio signal encoded in the MPEG-2 standard may be decoded by an MPEG-2 decoder. An audio signal encoded in the MPEG-4 Bit Sliced Arithmetic Coding (BSAC) standard according to the terrestrial digital multimedia broadcasting (DMB) scheme may be decoded by an MPEG-4 decoder. An audio signal encoded in the MPEG-2 Advanced Audio Codec (AAC) standard according to the satellite DMB or DVB-H scheme may be decoded by an AAC decoder. An audio signal encoded in the Dolby AC-3 standard may be decoded by an AC-3 decoder.

Also, the audio processor (not shown) in the controller 170 may process bass, treble, volume control, and the like.

The data processor (not shown) in the controller 170 may perform data processing on the demultiplexed data signal. For example, an encoded data signal may be decoded. The encoded data signal may be EPG (Electronic Program Guide) information, including broadcast information such as the start time and end time of broadcast programs on each channel. For example, the EPG information may be ATSC-PSIP (ATSC Program and System Information Protocol) information in the case of the ATSC scheme, and may include DVB-SI (DVB Service Information) in the case of the DVB scheme. The ATSC-PSIP or DVB-SI information may be carried in the aforementioned stream, that is, the MPEG-2 TS.

In FIG. 3, the signals from the OSD generator 340 and the image processor 320 are mixed in the mixer 345 and then 3D-processed in the formatter 360, but the present invention is not limited thereto; the mixer may be located after the formatter. That is, the output of the image processor 320 may be 3D-processed by the formatter 360, the OSD generator 340 may perform OSD generation together with 3D processing, and the mixer 345 may then mix the processed 3D signals.

Meanwhile, a block diagram of the controller 170 shown in FIG. 3 is a block diagram for one embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specification of the controller 170 that is actually implemented.

In particular, the frame rate converter 350 and the formatter 360 may be provided separately, outside the controller 170.

FIG. 6 is a diagram illustrating various scaling methods of a 3D video signal according to an embodiment of the present invention.

Referring to the drawing, to increase the 3D effect, the controller 170 may perform 3D-effect signal processing. In particular, the size or tilt of a 3D object in the 3D image may be adjusted.

As shown in FIG. 6A, the 3D video signal, or a 3D object 510 within it, may be enlarged or reduced as a whole at a predetermined ratio (512). As shown in FIGS. 6B and 6C, the 3D object may be partially enlarged or reduced (trapezoidal shapes 514 and 516). In addition, as illustrated in FIG. 6D, at least part of the 3D object may be rotated (parallelogram shape 518). Through such scaling or tilting, the stereoscopic effect, that is, the 3D effect, of the 3D image or of a 3D object within it can be emphasized.

Meanwhile, as the tilt becomes larger, the length difference between the parallel sides of the trapezoidal shapes 514 and 516 increases, as shown in FIG. 6B or 6C, or the rotation angle increases, as shown in FIG. 6D.

Meanwhile, the size adjustment or tilt adjustment may be performed after the 3D video signal is arranged in a predetermined format by the formatter 360, or may be performed by the scaler 235 in the image processor 320. It is also possible for the OSD generator 340 to create objects in the shapes shown in FIG. 6 when generating the OSD, in order to emphasize the 3D effect.

Meanwhile, although not shown in the figure, as signal processing for the 3D effect, in addition to the size or tilt adjustment illustrated in FIG. 6, signal processing such as brightness, tint, and color adjustment may also be performed. For example, signal processing may be performed to sharpen near objects and blur far ones. This 3D-effect signal processing may be performed in the controller 170 or in a separate 3D processor. In particular, when performed in the controller 170, it may be performed in the formatter 360 together with the above-described size or tilt adjustment, or in the image processor 320.

FIG. 7 is a diagram illustrating an image formed by a left eye image and a right eye image, and FIG. 8 is a diagram illustrating a depth of a 3D image according to an interval between a left eye image and a right eye image.

First, referring to FIG. 7, a plurality of images or a plurality of objects 615, 625, 635, and 645 are illustrated.

First, the first object 615 includes a first left eye image 611 (L) based on a first left eye image signal and a first right eye image 613 (R) based on a first right eye image signal, and the interval between the first left eye image 611 (L) and the first right eye image 613 (R) on the display 180 is illustrated as d1. In this case, the user perceives an image as formed at the intersection of the extension line connecting the left eye 601 and the first left eye image 611 and the extension line connecting the right eye 603 and the first right eye image 613. Accordingly, the user perceives the first object 615 as located behind the display 180.

Next, since the second object 625 includes a second left eye image 621 (L) and a second right eye image 623 (R) that overlap each other, it is displayed at the plane of the display 180. Accordingly, the user perceives the second object 625 as located on the display 180.

Next, the third object 635 and the fourth object 645 include a third left eye image 631 (L) and a third right eye image 633 (R), and a fourth left eye image 641 (L) and a fourth right eye image 643 (R), respectively, with intervals d3 and d4.

In the same manner as described above, the user perceives the third object 635 and the fourth object 645 as located at the positions where their images are formed; in the drawing, both are located in front of the display 180.

In this case, the fourth object 645 is perceived as protruding further forward than the third object 635, because the interval d4 between the fourth left eye image 641 (L) and the fourth right eye image 643 (R) is larger than the interval d3 between the third left eye image 631 (L) and the third right eye image 633 (R).

Meanwhile, in an embodiment of the present invention, the distance between the display 180 and an object 615, 625, 635, or 645 as perceived by the user is expressed as a depth. The depth of an object perceived as located behind the display 180 is given a negative value (-), and the depth of an object perceived as located in front of the display 180 is given a positive value (+). That is, the further an object protrudes toward the user, the larger its depth.

Referring to FIG. 8, since the interval a between the left eye image 701 and the right eye image 702 in FIG. 8A is smaller than the interval b between the left eye image 701 and the right eye image 702 in FIG. 8B, the depth a' of the 3D object in FIG. 8A is smaller than the depth b' of the 3D object in FIG. 8B.

As such, when a 3D image is composed of a left eye image and a right eye image, the position perceived as the image-forming point from the user's point of view varies with the interval between the left eye image and the right eye image. Therefore, by adjusting the display interval between the left eye image and the right eye image, the depth of a 3D image or 3D object composed of them can be adjusted.
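The geometric relation behind FIGS. 7 and 8 can be sketched with similar triangles; the eye separation and viewing distance values below are illustrative assumptions, not taken from the disclosure:

```python
def perceived_depth(disparity_m, eye_sep_m=0.065, view_dist_m=2.0):
    """Depth in front of the screen for a crossed disparity p, with eye
    separation e and viewing distance D. By similar triangles the
    image-forming point satisfies z / (D - z) = p / e, so
    z = D * p / (e + p): larger disparity gives larger depth."""
    p, e, D = disparity_m, eye_sep_m, view_dist_m
    return D * p / (e + p)
```

This matches FIG. 8: doubling the on-screen interval between the left eye and right eye images increases the perceived depth of the 3D object.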

FIG. 9 is a diagram illustrating a 3D viewing apparatus and an image display apparatus according to an embodiment of the present invention, and FIG. 10 is an internal block diagram of the 3D viewing apparatus and the image display apparatus of FIG. 9.

Referring to FIGS. 9 and 10, the 3D viewing apparatus 195 according to an embodiment of the present invention may include a power supply unit 910, a switch 918, a controller 920, a wireless communication unit 930, a left eye glass 940, and a right eye glass 960.

The power supply unit 910 supplies power to the left eye glass 940 and the right eye glass 960. A driving voltage VthL is applied to the left eye glass 940, and a driving voltage VthR is applied to the right eye glass 960. According to the applied driving voltages VthL and VthR, the liquid crystal arrangements in the left eye glass 940 and the right eye glass 960 may change, and accordingly each glass may be opened.

The power supply unit 910 may supply operation power for the operation of the controller 920 and the wireless communication unit 930 in the 3D viewing apparatus 195.

The switch 918 turns the operation of the 3D viewing apparatus 195 on or off; in particular, it turns the operating power supply on or off. When the switch 918 is turned on, the power supply unit 910 operates to supply power to the controller 920, the wireless communication unit 930, the left eye glass 940, and the right eye glass 960.

The controller 920 may control the left eye glass 940 and the right eye glass 960 of the 3D viewing apparatus 195 to open and close in synchronization with the left eye image frames and right eye image frames displayed on the display 180 of the image display apparatus 100. In this case, they may be opened and closed in synchronization with the synchronization signal Vsync received from the wireless communication unit 198.

Meanwhile, when the image displayed on the image display apparatus 100 is a 3D image, the controller 920 may control the left eye glass 940 and the right eye glass 960 to open and close alternately in synchronization with the received synchronization signal Vsync.

In addition, the controller 920 may control operations of the power supply unit 910 and the wireless communication unit 930. When the switch 918 is turned on, the controller 920 may control the power supply 910 to operate to supply power to each component.

The controller 920 may control the wireless communication unit 930 to transmit the pairing signal to the image display apparatus 100 for pairing with the image display apparatus 100. Alternatively, the pairing signal may be received from the image display apparatus 100.

The wireless communication unit 930 may transmit data to or receive data from the wireless communication unit 198 of the image display apparatus 100 using an IR (infrared) or RF (radio frequency) scheme. In particular, it may receive from the wireless communication unit 198 the synchronization signal Vsync for opening and closing the left eye glass 940 and the right eye glass 960; the opening and closing of the left eye glass 940 and the right eye glass 960 are controlled according to this synchronization signal Vsync.

The wireless communication unit 930 may transmit or receive a pairing signal with the video display device 100. In addition, the synchronization signal Vsync may be received from the image display apparatus 100. In addition, it is also possible to transmit a signal whether the 3D viewing apparatus 195 is used to the image display apparatus 100.

The left eye glass 940 and the right eye glass 960 may be active left eye glasses and active right eye glasses that open according to an electric signal (voltage or current) applied thereto. The 3D viewing apparatus 195 may be a shutter glass type as described above.

As described above, the image display apparatus 100 may include a wireless communication unit 198, a controller 170, a display 180, and the like. Hereinafter, the operation with the 3D viewing apparatus 195 will be described.

When the 3D viewing apparatus 195 is detected, the wireless communication unit 198 in the image display apparatus 100 may transmit a pairing signal for pairing with the 3D viewing apparatus 195. In addition, a response signal may be received from the 3D viewing apparatus 195.

The wireless communication unit 198 in the image display apparatus 100 may transmit the synchronization signal Vsync to the 3D viewing apparatus 195, and may also transmit a signal indicating whether the displayed image is a 2D image or a 3D image. Based on this, the left eye glass 940 and the right eye glass 960 of the 3D viewing apparatus 195 may be opened or closed accordingly.

Meanwhile, when there are a plurality of 3D viewing apparatuses, the wireless communication unit 198 in the image display apparatus 100 may transmit corresponding sync signals. In addition, audio signals for audio output of each 3D viewing apparatus may be transmitted.

Meanwhile, the controller 170 in the image display apparatus 100 controls the output of the above-described pairing signal, synchronization signal, or audio signal.

Meanwhile, in the wireless communication between the image display apparatus 100 and the 3D viewing apparatus 195, various methods such as infrared communication, RF communication, and Bluetooth communication may be used.

FIG. 11 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention, and FIGS. 12 to 13B are views for explaining various examples of the operating method of FIG. 11.

Referring to FIG. 11, first, a 3D image is received (S1110). The image input to the image display apparatus 100 may be a broadcast image from a broadcast signal received by the tuner 110, an external input image from an external device, an image stored in the storage 140, or an image input from a content provider through a network.

The input image may be demodulated by the demodulator 120 and then signal-processed by the controller 170, or may be input directly to the controller 170. As described above, the controller 170 performs demultiplexing, decoding, and the like.

Next, the controller 170 detects the viewing grade or age grade of the received 3D image (S1115). For example, when the received 3D image is a broadcast image, the controller 170 may detect viewing grade information or age grade information within the broadcast program information in the received stream. Specifically, this information may be extracted when the demultiplexer 310 demultiplexes the stream.

Next, a 3D image is displayed (S1120). The controller 170 controls to display the received 3D image according to a predetermined format.

FIG. 13A illustrates a 3D image 1310 including a 3D object 1315 displayed on the display 180. The user 1305 wearing the 3D viewing apparatus 195 perceives the 3D object 1315 as protruding with a predetermined depth d1, for example 0.4 m.

Meanwhile, an object 1312 representing the extracted viewing grade information may be displayed together, so that the viewing grade of the 3D image can be confirmed immediately.

Next, it is determined whether there is a depth variable input for the 3D image (S1125). If so, it is determined whether a preset depth change range exists (S1130). If it exists, the depth of the 3D image is varied according to the depth variable input, but only within that depth change range (S1135). Then, the 3D image with the varied depth is displayed (S1140).

The depth variable input for the 3D image may be performed via a specific key of the remote controller, or by selecting a specific icon while a predetermined menu is displayed on the display.

FIG. 13B illustrates a depth variable input performed using the remote controller 1300 while the 3D image 1320 is displayed on the display 180.

Meanwhile, for the depth variation of the 3D image, the controller 170 may control the depth change range to be set according to the viewing grade or age grade of the 3D image.

For example, as illustrated in FIG. 12, a depth change range for the 3D image may be set on the display 180 according to the age-specific viewing grade, based on a user input. For example, content rated '19 years or older', with its scene transitions, violence, and the like, may adversely affect the user, so the depth of the 3D image may be limited considerably; the drawing illustrates that no depth can be set for this grade.

Meanwhile, FIG. 12 illustrates a broadcast for all ages (1220), a broadcast for ages 9-14 (1230), a broadcast for ages 15-18 (1240), and a broadcast for ages 19 or older (1250), but various other examples are possible.

As such, when there is a depth variable input for the 3D image while a set depth change range exists, the depth may be changed to d2 as shown in FIG. 13B. In the example of FIG. 13B, since the viewing grade is '15 years or older', referring to FIG. 12, the maximum settable depth may be 0.3 m. Accordingly, a 3D image suitable for the user can be displayed; in particular, dizziness caused by excessive 3D depth can be prevented, improving user convenience.
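The grade-dependent limit of FIG. 12 can be modeled as a lookup that clamps the requested depth. Only the 0.3 m value for the 15-18 grade comes from the example above; the other table entries and the names are illustrative assumptions:

```python
# Maximum allowed protrusion depth (m) per viewing grade. The 0.3 m entry
# for ages 15-18 follows the example in the text; other values are assumed.
MAX_DEPTH_BY_GRADE = {
    "all": 0.5,
    "9-14": 0.4,
    "15-18": 0.3,
    "19+": 0.0,  # depth adjustment disabled, as illustrated in FIG. 12
}

def apply_depth_input(requested_depth, grade):
    """Clamp a user-requested depth to the range set for the grade."""
    limit = MAX_DEPTH_BY_GRADE.get(grade, 0.0)
    return min(requested_depth, limit)
```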

Meanwhile, the depth variation may, as described above, be performed by the formatter 360 in the controller 170. The formatter 360 may vary the depth by adjusting the disparity between the left eye image and the right eye image: to increase the depth, the disparity is increased, and to reduce the depth, the disparity is reduced.
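A minimal sketch of this disparity adjustment shifts the two views apart in opposite directions; the row-list pixel model and the zero-fill at the edges are simplifying assumptions, and an actual formatter would resample the image rather than pad it:

```python
def shift_row(row, shift, fill=0):
    """Shift a pixel row horizontally; positive moves content right."""
    if shift >= 0:
        return [fill] * shift + row[:len(row) - shift]
    return row[-shift:] + [fill] * (-shift)

def adjust_disparity(left, right, delta_px):
    """Increase (delta_px > 0) or decrease (delta_px < 0) the parallax
    between the views by moving them apart in opposite directions."""
    new_left = [shift_row(r, -delta_px) for r in left]
    new_right = [shift_row(r, delta_px) for r in right]
    return new_left, new_right
```

Whether a given shift direction makes objects protrude or recede depends on the crossed/uncrossed disparity convention of the display; the sketch only shows the mechanics of changing the interval.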

Meanwhile, if it is determined in step S1130 that no preset depth change range exists, the depth of the 3D image is varied according to the depth variable input without restriction (S1145), and the 3D image with the varied depth is displayed (S1140).

When the depth variable input from the remote controller or the like is an input for increasing the depth, the controller 170 increases the depth of the 3D image. For example, as illustrated in FIG. 13A, when the depth d1 of the 3D object 1315 is 0.4 m, it may be set to 0.5 m by a depth increase input, or to 0.3 m by a depth decrease input.

Meanwhile, user authentication may also be performed when the above-described depth change range is set or when a depth variable input is received. This is described below with reference to FIG. 14 and the subsequent figures.

FIG. 14 is a flowchart illustrating a method of operating an image display apparatus according to an exemplary embodiment of the present invention, and FIGS. 15 to 16D are views for explaining various examples of the operating method of FIG. 14.

Referring to FIG. 14, steps S1410 to S1425 are the same as steps S1110 to S1125 of FIG. 11, and thus their description is omitted.

Next, when there is a depth variable input for the 3D image, an object for password authentication is displayed (S1430).

FIG. 16A illustrates that a 3D image 1610 including a 3D object 1615 is displayed on the display 180. In this case, an object 1612 representing the extracted viewing grade information may be displayed together. The user 1605 wearing the 3D viewing apparatus 195 perceives the 3D object 1615 as protruding with a predetermined depth da. For example, the predetermined depth da may be 0.4 m.

FIG. 16B illustrates that an object 1620 for password authentication is displayed when there is a depth variable input while the 3D image 1610 is displayed on the display 180.

In this case, the password may be stored in advance in the storage 140. For example, as shown in FIG. 15, a password 1550 may be set. In addition, an age setting 1520, a grade setting 1530, and a depth setting 1540 may be made. In particular, the depth setting 1540 may be exemplified as a range item, a per-range setting item, or a 'not settable' item. The set contents may be stored in the storage 140.
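The settings of FIG. 15 that the storage 140 might hold can be sketched as a simple record. The field names and example values below are illustrative assumptions, not the patent's actual data layout.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ViewingSettings:
    age: int                        # age setting (1520)
    grade: str                      # grade setting (1530)
    depth_range_m: Optional[float]  # depth setting (1540); None = not settable
    password: str                   # password (1550)

# Example record matching the '15 years old' / 0.3 m case in the text;
# the password value is a placeholder.
settings = ViewingSettings(age=15, grade="15-18", depth_range_m=0.3, password="0000")
```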

Next, it is determined whether the input password matches the preset password (S1433); if so, the depth is varied according to the depth variable input (S1435). In operation S1440, the 3D image with the varied depth is displayed.

The controller 170 may vary the depth according to the depth variable input from the remote controller 1600 when the input password and the password stored in the storage 140 match.

For example, when the depth da of the 3D object 1615 of FIG. 16A is 0.4 m, it may be set to 0.5 m as shown in FIG. 16C by a depth increase input. Accordingly, the 3D image 1630 including the 3D object 1635 having the increased depth db may be displayed on the display 180, and the user 1605 wearing the 3D viewing apparatus 195 perceives the 3D object 1635 as protruding further. The display 180 may also display an object 1613 indicating that authentication is complete.

Meanwhile, if the input password does not match the preset password, the depth of the 3D image is varied according to the depth variable input, but only within the set depth change range (S1445). In operation S1440, the 3D image with the varied depth is displayed.

When the input password and the password stored in the storage unit 140 do not match, the controller 170 takes the depth variable input from the remote controller 1600 into account but varies the depth of the 3D image only within the preset depth change range.
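The password-gated branching described above can be sketched in a few lines: a matching password allows the requested depth, while a mismatch confines the change to the preset range. Function and parameter names are illustrative assumptions.

```python
def vary_depth(requested: float, entered_pw: str, stored_pw: str,
               depth_range_m: float) -> float:
    """Apply a depth-variable input, gated by password authentication."""
    if entered_pw == stored_pw:
        return requested                  # authenticated: vary freely (S1435)
    return min(requested, depth_range_m)  # not authenticated: clamp (S1445)
```

For the FIG. 16 example, a request for 0.5 m with a wrong password yields 0.3 m, matching the dc depth shown in FIG. 16D.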

For example, as shown in FIG. 16A, when the depth da of the 3D object 1615 is 0.4 m and there is a depth increase input, the viewing grade of the displayed 3D image 1640 is a '15 years old or older' broadcast. Since the corresponding depth allowance range is 0.3 m, the depth increase cannot be performed. Therefore, despite the depth increase input, the 3D object 1645 may be displayed with a reduced depth of 0.3 m (dc). Accordingly, the user 1605 wearing the 3D viewing apparatus 195 perceives the 3D object 1645 as protruding less. Meanwhile, the display 180 may display an object 1614 indicating that authentication is incomplete.

FIG. 17 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention, and FIGS. 18A to 18D are diagrams for describing various examples of the operating method of FIG. 17.

Since the operation method of the image display apparatus of FIG. 17 is partially similar to the operation method of the image display apparatus of FIG. 11 or the operation method of the image display apparatus of FIG. 14, the following description will focus on differences.

Referring to FIG. 17, steps S1710 to S1725 are the same as steps S1110 to S1125 of FIG. 11, and thus their description is omitted.

However, a step (S1723) of identifying the user's age may be further provided between steps S1720 and S1725.

The user's age may be determined based on the user's direct input to the image display apparatus 100, based on an image captured by the photographing unit 190, based on the user's audio signal, or based on a combination thereof.

First, the following description assumes that the user's age is determined based on an image captured by the photographing unit 190.

FIG. 18A illustrates that a 3D image 1810 including a 3D object 1815 having a predetermined depth dk is displayed on the display 180. The 3D image 1810 may include an object 1812 indicating a viewing grade of 19 years of age or older.

In this case, the controller 170 may capture the user 1805 wearing the 3D viewing apparatus 195 through the photographing unit 190 and perform user authentication to determine the user's age, or may determine the user's age based on the captured image itself.

Next, when there is a depth variable input for the 3D image (S1725), the controller 170 determines whether a preset depth change range exists (S1730) and, based on the user's age and the viewing grade or age grade, varies the depth of the 3D image within the set depth change range (S1735). In operation S1740, the 3D image or a 2D image with the varied depth is displayed.

FIG. 18B illustrates that, when there is a depth change input through the remote controller 1800 while the 3D image 1810 viewable by those 19 or older is displayed, no depth change is performed compared to FIG. 18A, and the 3D object is displayed with its predetermined depth dk as it is.

That is, when the age of the user 1805 is determined to be 19 or older, the depth change may normally be performed freely according to the depth change input. However, when the depth allowance range is set to 'no depth setting' as illustrated in FIG. 12, the input 3D image may be displayed with its depth unchanged.

Next, FIG. 18C illustrates that, when there is a depth increase input through the remote controller 1800 while the 3D image 1810 viewable by those 19 or older is displayed, the depth is not increased but rather decreased compared to FIG. 18A, and a 3D object having a predetermined depth dl is displayed.

That is, when the user 1805 is determined to be 15 years old, the depth change may be performed according to the depth change input and the depth allowance range. Similar to the example illustrated in FIG. 12, when the depth allowance is set to within 0.3 m, the depth of the 3D image is reduced; for example, a depth of 0.4 m can be reduced to 0.3 m.

Next, FIG. 18D illustrates that, when there is a depth increase input through the remote controller 1800 while the 3D image 1810 viewable by those 19 or older is displayed, the depth is reduced compared to FIG. 18A; for example, a 2D image including the object 1835 is displayed.

That is, when the user 1805 is determined to be 8 years old, the depth change may be performed according to the depth change input and the depth allowance range. Similar to FIG. 12, the depth allowance may be set accordingly, or the display may be converted into a 2D image having no depth at all. Accordingly, a young user can comfortably watch a 2D image instead of a 3D image.
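The age-based branching of FIGS. 18B to 18D can be sketched as follows. The 19 and 15 thresholds and the 0.3 m allowance come from the text; treating "convert to 2D" as a returned depth of zero is an assumed convention for illustration.

```python
def depth_for_user(user_age: int, requested: float, current: float) -> float:
    """Decide the displayed depth from the detected user age (hypothetical sketch)."""
    if user_age >= 19:
        return requested            # free change (assuming the grade allows depth setting)
    if user_age >= 15:
        return min(requested, 0.3)  # clamp to the 0.3 m allowance
    return 0.0                      # young viewer: display as 2D, i.e. no depth
```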

The image display apparatus and the operating method thereof according to the present invention are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operating method of the image display apparatus of the present invention can be implemented as processor-readable code on a processor-readable recording medium included in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices that store data readable by a processor, for example, ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also includes implementations in the form of carrier waves such as transmission over the Internet. The processor-readable recording medium can also be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

In addition, although preferred embodiments of the present invention have been shown and described above, the present invention is not limited to the specific embodiments described; various modifications can be made by those skilled in the art without departing from the spirit of the invention claimed in the claims, and such modifications should not be understood separately from the technical spirit or scope of the present invention.

Claims (17)

A method of operating an image display apparatus, the method comprising:
receiving a 3D image;
varying a depth of the 3D image according to a viewing grade or an age grade of the received 3D image when there is a depth variable input for the 3D image; and
displaying the depth-varied 3D image.
The method of claim 1,
wherein, in the depth varying step,
the depth of the 3D image is varied according to the depth change range when a preset depth change range exists.
The method of claim 1,
wherein, in the depth varying step,
the depth variation is not performed when a preset depth change range exists and the depth change range according to the viewing grade or age grade of the received 3D image is set such that the depth cannot be set.
The method of claim 1,
further comprising setting a depth change range of the 3D image for each viewing grade or age.
The method of claim 1,
further comprising displaying an object for password authentication when the depth variable input is present.
The method of claim 5,
wherein, in the depth varying step,
the depth is varied according to the depth variable input when the input password is the same as a preset password.
The method of claim 5,
wherein, in the depth varying step,
the depth of the 3D image is varied according to the depth change range when the input password is not the same as the preset password and a preset depth change range exists.
The method of claim 5,
further comprising setting a depth change range of the 3D image or setting a password for each viewing grade or age.
A method of operating an image display apparatus, the method comprising:
displaying a 3D image;
displaying an object requesting login authentication according to a viewing grade or an age grade of the 3D image when there is a depth variable input for the 3D image; and
varying, after the login authentication, the depth of the 3D image according to the depth variable input.
An image display apparatus comprising:
a display for displaying a 3D image; and
a controller configured to vary a depth of the 3D image according to a viewing grade or an age grade of the 3D image when there is a depth variable input for the 3D image.
The apparatus of claim 10,
further comprising a storage unit for storing a set depth change range,
wherein the controller
varies the depth of the 3D image according to the depth change range when a preset depth change range exists.
The apparatus of claim 10,
wherein the controller
does not perform the depth variation when a preset depth change range exists and the depth change range according to the viewing grade or age grade of the received 3D image is set such that the depth cannot be set.
The apparatus of claim 10,
wherein the controller
sets a depth change range of the 3D image for each viewing grade or age.
The apparatus of claim 10,
wherein the controller
controls to display an object for password authentication when the depth variable input is present.
The apparatus of claim 14,
wherein the controller
controls the depth to be varied according to the depth variable input when the input password is the same as a preset password.
The apparatus of claim 14,
wherein the controller
controls the depth of the 3D image to be varied according to the depth change range when the input password is not the same as the preset password and a preset depth change range exists.
The apparatus of claim 14,
wherein the controller
sets a depth change range of the 3D image or a password for each viewing grade or age.
KR1020100096013A 2010-10-01 2010-10-01 Image display apparatus, and method for operating the same KR20120034836A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100096013A KR20120034836A (en) 2010-10-01 2010-10-01 Image display apparatus, and method for operating the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100096013A KR20120034836A (en) 2010-10-01 2010-10-01 Image display apparatus, and method for operating the same

Publications (1)

Publication Number Publication Date
KR20120034836A true KR20120034836A (en) 2012-04-13

Family

ID=46136936

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100096013A KR20120034836A (en) 2010-10-01 2010-10-01 Image display apparatus, and method for operating the same

Country Status (1)

Country Link
KR (1) KR20120034836A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160068447A (en) * 2014-12-05 2016-06-15 삼성전자주식회사 Method for determining region of interest of image and device for determining region of interest of image



Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination