KR20140008188A - Image display apparatus, and method for operating the same - Google Patents

Image display apparatus, and method for operating the same Download PDF

Info

Publication number
KR20140008188A
KR20140008188A (Application KR1020120075634A)
Authority
KR
South Korea
Prior art keywords
image
user
display apparatus
counterpart
image display
Prior art date
Application number
KR1020120075634A
Other languages
Korean (ko)
Inventor
Lee, Yong Wook
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020120075634A
Publication of KR20140008188A

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/139Format conversion, e.g. of frame-rate or size
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194Transmission of image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers

Abstract

The present invention relates to an image display apparatus and a method of operating the same. A method of operating an image display apparatus according to an embodiment of the present invention includes receiving a user image; converting the user image into one of a two-dimensional (2D) image, a glasses-type three-dimensional (3D) image, and a glasses-free (autostereoscopic) 3D image based on information about the counterpart user's image display apparatus; and transmitting the converted image to the counterpart image display apparatus. Accordingly, user convenience of the image display apparatus can be improved. [Reference numerals] (AA) Start; (BB) End; (S1110) Receive a user image; (S1120) Convert the user image into one of a 2D image, a glasses-type 3D image, and a glasses-free 3D image based on information about the counterpart image display apparatus; (S1130) Transmit the converted image

Description

[0001] The present invention relates to an image display apparatus and a method of operating the same.

The present invention relates to an image display apparatus and a method of operating the same, and more particularly, to an image display apparatus that converts a captured image into one of a 2D image, a glasses-type 3D image, and an autostereoscopic 3D image based on information about the counterpart image display apparatus, and an operation method thereof.

An image display apparatus is a device having a function of displaying an image that a user can view. Through the image display apparatus, the user can view broadcasts: the apparatus displays, on its display, a broadcast selected by the user from among broadcast signals transmitted by broadcast stations. Currently, broadcasting is changing from analog to digital around the world.

Digital broadcasting refers to broadcasting that transmits digital video and audio signals. Because digital broadcasting is more robust to external noise than analog broadcasting, it suffers less data loss, is better suited to error correction, and provides higher resolution and a clearer picture. Also, unlike analog broadcasting, digital broadcasting supports bidirectional services.

SUMMARY OF THE INVENTION

An object of the present invention is to provide an image display apparatus and an operation method thereof that can provide an optimal image to a counterpart user during video communication, even when the counterpart image display apparatus that receives the image has no separate image conversion function.

To achieve the above object, a method of operating an image display apparatus according to an embodiment of the present invention includes receiving a user image; converting the user image into one of a 2D image, a glasses-type 3D image, and a glasses-free 3D image based on information about the counterpart image display apparatus; and transmitting the converted image to the counterpart image display apparatus.
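The selection logic described above can be sketched as follows. This is an illustrative assumption only: the patent does not prescribe an implementation, and the device-info fields, packing labels, and function names below are hypothetical.

```python
# Illustrative sketch: pick an output format for the captured user image
# based on counterpart image display apparatus information, then transmit.
# All field names and labels here are assumptions, not from the patent.

def select_output_format(counterpart_info: dict) -> str:
    """Pick the image format the counterpart display can best present."""
    if counterpart_info.get("supports_autostereoscopic_3d"):
        return "3d_glasses_free"
    if counterpart_info.get("supports_glasses_3d"):
        return "3d_glasses"
    return "2d"

def convert_and_transmit(user_image, counterpart_info, send) -> str:
    """Convert the user image per the counterpart's capability and send it."""
    fmt = select_output_format(counterpart_info)
    if fmt == "2d":
        converted = user_image                    # e.g. keep one eye view only
    elif fmt == "3d_glasses":
        converted = ("side_by_side", user_image)  # pack for glasses-type 3D
    else:
        converted = ("multi_view", user_image)    # render a multi-view image
    send(converted)
    return fmt
```

In this sketch the transmitting side does all conversion work, which matches the stated object of the invention: the receiving apparatus needs no conversion function of its own.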

To achieve the above object, an image display apparatus according to an embodiment of the present invention includes a photographing unit for capturing a user image; a display for displaying the user image and a counterpart image received from a counterpart image display apparatus; and a controller configured to convert the user image into one of a 2D image, a glasses-type 3D image, and a glasses-free 3D image based on information about the counterpart image display apparatus, for transmission to the counterpart image display apparatus.

According to an embodiment of the present invention, the image display apparatus can provide an optimal image to the counterpart user during video communication by converting an image based on the counterpart image display apparatus information and then transmitting it. The counterpart apparatus therefore does not need a separate image conversion function.

In addition, the image display apparatus may provide a function for setting parameters related to 3D image conversion, and may convert and transmit an image according to a request of the user who transmits the image.

Therefore, user convenience of the image display apparatus can be improved.

FIG. 1 is a view showing a video communication system according to an embodiment of the present invention.
FIG. 2 is a diagram showing the lens unit of the image display apparatus of FIG. 1 separated from the display.
FIG. 3 is an internal block diagram of the image display apparatus of FIG. 1.
FIG. 4 is an internal block diagram of the control unit of FIG. 3.
FIG. 5 is a diagram showing a control method of the remote control apparatus of FIG. 3.
FIG. 6 is an internal block diagram of the remote control apparatus of FIG. 3.
FIG. 7 is a view for explaining how images are formed by the left-eye image and the right-eye image.
FIG. 8 is a view for explaining the depth of a 3D image according to the interval between the left-eye image and the right-eye image.
FIG. 9 is a diagram referred to in explaining the principle of a glasses-free stereoscopic image display apparatus.
FIGS. 10 to 14 are views for explaining the principle of an image display apparatus that displays a glasses-free 3D image.
FIG. 15 is a flowchart illustrating a method of operating an image display apparatus according to an embodiment of the present invention, and FIGS. 16 to 22 are views for explaining the operation method of FIG. 15.

Hereinafter, the present invention will be described in detail with reference to the drawings.

The suffix "module" and " part "for components used in the following description are given merely for convenience of description, and do not give special significance or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 is a view showing a video communication system according to an embodiment of the present invention.

Referring to FIG. 1, a video communication system 10 according to an exemplary embodiment of the present invention may include a first image display apparatus 100 and a second image display apparatus 600 that may exchange image data with each other.

According to an exemplary embodiment, the first image display apparatus 100 may include a photographing unit 155, and the photographing unit 155 may be a stereo camera including a first camera 155a and a second camera 155b. An image (user image) captured by the photographing unit 155 may be signal-processed and transmitted to the second image display apparatus 600. In detail, the captured image (user image) may be transmitted to the second image display apparatus 600 on the opposite side through the network 500.

In this case, according to an embodiment of the present invention, the first image display apparatus 100 may convert the user image based on information about the second image display apparatus 600, and transmit the converted image to the second image display apparatus 600 through the network 500.

That is, the image (user image) captured by the first image display apparatus 100 may be converted into an image optimized for the conditions of the second image display apparatus 600 and provided to the user of the second image display apparatus 600.

Conversely, the second image display apparatus 600 may also include a photographing unit 155, and the image captured by its photographing unit 155 (counterpart image) may be converted based on information about the first image display apparatus 100 and then transmitted to the first image display apparatus 100 through the network 500.

Meanwhile, as shown in FIG. 1, the image display apparatus 100 in the video communication system 10 performs not only video signal processing but also audio signal processing, so that the processed video signal and audio signal may be output through the display 180 and the audio output unit 185, respectively.

The image display apparatuses 100 and 600 may include a TV, a monitor, a mobile phone, a smartphone, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a camera, a navigation device, a tablet computer, an e-book terminal, and the like.

Meanwhile, the drawing illustrates video communication between the first image display apparatus 100 and the second image display apparatus 600, but multi-party video communication is also possible. That is, in addition to the first and second image display apparatuses 100 and 600, further image display apparatuses may participate in the video communication system.

FIG. 2 is a view showing the lens unit of the image display apparatus of FIG. 1 separated from the display.

Referring to FIG. 2, the image display apparatus according to an embodiment of the present invention may be an apparatus capable of displaying not only 2D images but also stereoscopic images, that is, 3D images, and in particular may be an apparatus capable of displaying 3D images in a glasses-free (autostereoscopic) manner.

In the case of an image display apparatus capable of displaying a 3D image in an autostereoscopic manner, the image display apparatus 100 includes a display 180 and a lens unit 195.

The display 180 may display an input image, and in particular, may display a multi-view image for displaying an autostereoscopic 3D image. Specifically, the sub-pixels constituting the multi-view image may be displayed by being arranged in a predetermined pattern.

The lens unit 195 may be spaced apart from the display 180 and disposed toward the user. FIG. 2 illustrates this separation between the display 180 and the lens unit 195.

The lens unit 195 may vary the traveling direction of light according to the applied power. For example, when a plurality of viewers watch a 2D image, a first power may be applied to the lens unit 195 so that light travels in the same direction as the light emitted from the display 180. Accordingly, the image display apparatus 100 can provide the 2D image to the plurality of viewers.

On the other hand, when a plurality of viewers watch a 3D image, a second power may be applied to the lens unit 195 so that the light emitted from the display 180 is scattered; through this scattering, the 3D image can be provided to the plurality of viewers.

The lens unit 195 may be a lenticular system using a lenticular lens, a parallax system using a slit array, a system using a microlens array, or the like. In the embodiments of the present invention, the lenticular system will mainly be described.

FIG. 3 is an internal block diagram of the image display apparatus of FIG. 1.

Referring to FIG. 3, the image display apparatus 100 according to an embodiment of the present invention may include a broadcast receiving unit 105, an external device interface unit 130, a storage unit 140, a user input interface unit 150, a sensor unit (not shown), a controller 170, a display 180, an audio output unit 185, a microphone 198, and a photographing unit 155.

The broadcast receiving unit 105 may include a tuner unit 110, a demodulation unit 120, and a network interface unit 135. Of course, the broadcast receiving unit 105 may be designed, as necessary, to include the tuner unit 110 and the demodulation unit 120 without the network interface unit 135, or conversely to include the network interface unit 135 without the tuner unit 110 and the demodulation unit 120.

Meanwhile, unlike the figure, the broadcast receiving unit 105 may also include the external device interface unit 130 of FIG. 3.

The tuner unit 110 selects an RF broadcast signal corresponding to a channel selected by the user, or the RF broadcast signals of all pre-stored channels, from among the RF (Radio Frequency) broadcast signals received through the antenna 50. The selected RF broadcast signal is converted into an intermediate-frequency signal or a baseband video or audio signal.

For example, if the selected RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF). If the selected RF broadcast signal is an analog broadcast signal, it is converted into an analog baseband image or voice signal (CVBS / SIF). That is, the tuner unit 110 can process a digital broadcast signal or an analog broadcast signal. The analog baseband video or audio signal (CVBS / SIF) output from the tuner unit 110 can be directly input to the controller 170.

The tuner unit 110 may receive an RF broadcast signal of a single carrier according to an Advanced Television System Committee (ATSC) scheme or an RF broadcast signal of a plurality of carriers according to a DVB (Digital Video Broadcasting) scheme.

Meanwhile, the tuner unit 110 may sequentially select the RF broadcast signals of all broadcast channels stored through a channel memory function from among the RF broadcast signals received through the antenna, and convert them into intermediate-frequency signals or baseband video or audio signals.

On the other hand, the tuner unit 110 may be provided with a plurality of tuners in order to receive broadcast signals of a plurality of channels. Alternatively, a single tuner that simultaneously receives broadcast signals of a plurality of channels is also possible.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

The demodulation unit 120 may perform demodulation and channel decoding, and then output a stream signal TS. In this case, the stream signal may be a signal multiplexed with a video signal, an audio signal, or a data signal.

The stream signal output from the demodulation unit 120 may be input to the controller 170. The control unit 170 performs demultiplexing, video / audio signal processing, and the like, and then outputs an image to the display 180 and outputs audio to the audio output unit 185.

The external device interface unit 130 can transmit or receive data to or from a connected external device 190. To this end, the external device interface unit 130 may include an A/V input/output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 130 may be connected by wire or wirelessly to an external device such as a DVD (Digital Versatile Disc) player, a Blu-ray player, a game device, a camera, a camcorder, or a computer (laptop), and may perform input/output operations with the external device.

The A / V input / output unit can receive video and audio signals from an external device. Meanwhile, the wireless communication unit can perform short-range wireless communication with other electronic devices.

The network interface unit 135 provides an interface for connection to a wired / wireless network including the Internet network. For example, the network interface unit 135 can receive, via the network, content or data provided by the Internet or a content provider or a network operator.

In addition, the network interface unit 135 may transmit or receive data with the counterpart image display apparatus through the connected network or another network 500 linked to the connected network.

In particular, the network interface unit 135 may transmit or receive a captured image according to an embodiment of the present invention.

The storage unit 140 may store a program for each signal processing and control in the control unit 170 or may store the processed video, audio, or data signals.

In addition, the storage 140 may perform a function for temporarily storing an input image, audio, or data signal from the external device interface 130 or the network interface 135. In addition, the storage unit 140 may store information on a predetermined broadcast channel through a channel memory function such as a channel map.

In particular, according to an embodiment of the present invention, the storage unit 140 may store information about the second image display apparatus 600 that is provided from the second image display apparatus 600 or is set according to a user input. In addition, the information about the second image display apparatus 600 may be stored together with information for identifying the counterpart image display apparatus, such as its model name, user name, and user network address.

Accordingly, the first image display apparatus 100 may convert an image using information about the second image display apparatus 600 corresponding to the identification information.

FIG. 3 illustrates an embodiment in which the storage unit 140 is provided separately from the controller 170, but the scope of the present invention is not limited thereto; the storage unit 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the control unit 170 or a signal from the control unit 170 to the user.

For example, the user input interface unit 150 may receive and process control signals for power on/off, channel selection, screen setting, and the like from the remote control apparatus 200, or transmit signals from the control unit 170 to the remote control apparatus 200. It may also transfer user input signals from local keys (not shown), such as a power key, a channel key, a volume key, and a setting key, to the control unit 170, transfer a user input signal from a sensor unit (not shown) that senses a user's gesture to the control unit 170, or transmit a signal from the control unit 170 to the sensor unit (not shown).

The user input interface unit 150 may be understood as a concept that includes the photographing unit 155 and the microphone 198. For example, an image captured by the photographing unit 155 may be used to detect a user input signal from a gesture such as a hand motion, and audio collected by the microphone 198 may be used to detect a user input signal through voice recognition.

The external device interface unit 130 provides an interface for data transmission or reception with an external device connected by wire or wirelessly. For example, it is possible to provide an interface for transmitting or receiving data with an external device such as a game device, a camera, a camcorder, a computer (laptop), or the like.

The controller 170 performs signal processing on the input signal.

For example, the controller 170 may multiplex or encode the video signal captured by the photographing unit 155 and the audio signal collected by the microphone 198. In addition, the controller 170 may control the multiplexed or encoded video and audio signals to be transmitted to the counterpart image display apparatus through the network interface unit 135 for video communication.

As another example, the controller 170 may demultiplex or decode the video signal or the audio signal received from the counterpart image display apparatus through the network interface unit 135.

The controller 170 may synthesize the image signal captured by the photographing unit 155 with the image signal from the counterpart image display apparatus received through the network interface unit 135, and may control the synthesized video signal to be displayed on the display 180.
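A minimal sketch of this kind of image synthesis, assuming a picture-in-picture layout (the layout, function name, and use of numpy are illustrative assumptions; the patent does not specify how the two images are combined):

```python
import numpy as np

def compose_pip(counterpart_frame: np.ndarray, user_frame: np.ndarray,
                scale: int = 4) -> np.ndarray:
    """Overlay a shrunken copy of the user image onto the counterpart
    image, in the bottom-right corner (picture-in-picture)."""
    h, w = counterpart_frame.shape[:2]
    small = user_frame[::scale, ::scale]   # naive downscale by striding
    sh, sw = small.shape[:2]
    out = counterpart_frame.copy()
    out[h - sh:h, w - sw:w] = small        # paste into bottom-right corner
    return out
```

The same idea applies to the audio path: the locally collected audio and the received counterpart audio are mixed before output.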

In addition, the controller 170 may synthesize the audio signal collected by the microphone 198 and the audio signal from the counterpart image display device received through the network interface unit 135. The synthesized audio signal may be controlled to be output through the audio output unit 185.

In addition, the control unit 170 may demultiplex an input stream from the tuner unit 110, the demodulator 120, or the external device interface unit 130, or process the demultiplexed signals, to generate and output signals for video or audio output.

The video signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the video signal. Also, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

The audio signal processed by the control unit 170 may be output to the audio output unit 185. The audio signal processed by the controller 170 may also be input to an external output device through the external device interface unit 130.

Although not shown in FIG. 3, the controller 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 4.

In addition, the control unit 170 can control the overall operation of the image display apparatus 100.

In addition, the controller 170 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

Meanwhile, the control unit 170 may control the display 180 to display an image. In this case, the image displayed on the display 180 may be a still image or a video, and may be a 2D image or a 3D image.

Meanwhile, the controller 170 may generate a 3D object for a predetermined 2D object among the images displayed on the display 180, and display the 3D object. For example, the object may be at least one of a connected web screen (newspaper, magazine, etc.), EPG (Electronic Program Guide), various menus, widgets, icons, still images, moving images, and text.

Such a 3D object may be processed to have a different depth from the image displayed on the display 180. Preferably, the 3D object may be processed to appear to protrude from the image displayed on the display 180.

The controller 170 may recognize the location of the user based on the image captured by the photographing unit 155. For example, the distance (z-axis coordinate) between the user and the image display apparatus 100 may be determined. In addition, the x-axis and y-axis coordinates on the display 180 corresponding to the user's position can be determined.

Although not shown in the drawing, a channel browsing processing unit for generating a thumbnail image corresponding to a channel signal or an external input signal may further be provided. The channel browsing processing unit receives the stream signal TS output from the demodulation unit 120 or the stream signal output from the external device interface unit 130, extracts an image from the input stream signal, and generates a thumbnail image. The generated thumbnail image may be input to the controller 170 as it is, or after being encoded. The control unit 170 may display a thumbnail list including a plurality of thumbnail images on the display 180 using the input thumbnail images.

At this time, the thumbnail list may be displayed in a simple viewing mode, in which it occupies a partial area while a predetermined image is displayed on the display 180, or in a full viewing mode, in which it occupies most of the display 180. The thumbnail images in the thumbnail list can be updated sequentially.

The display 180 converts the video signal, data signal, and OSD signal processed by the control unit 170, or the video signal, data signal, and control signal received from the external device interface unit 130, into respective driving signals to generate a screen.

The display 180 may be a PDP, an LCD, an OLED, a flexible display, or the like, and may also be capable of a 3D display.

Meanwhile, the display 180 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 185 receives the audio signal processed by the control unit 170 and outputs it as sound.

The photographing unit 155 captures an image. In particular, for video communication, it may photograph the user. The image information captured by the photographing unit 155 may be input to the control unit 170.

The photographing unit 155 may be implemented as a single camera, but may also include a plurality of cameras, for example a stereo camera. When the photographing unit 155 is implemented as a stereo camera, the controller 170 may obtain depth information of an object photographed by the stereo camera and use the obtained depth information to convert the photographed image into a 3D image. Meanwhile, the photographing unit 155 may be embedded in the upper portion of the display 180 or disposed separately. The image information captured by the photographing unit 155 may be input to the control unit 170.
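The depth recovery mentioned here follows, in standard stereo geometry (not stated in the patent itself), the pinhole relation depth = focal length × baseline / disparity. A minimal sketch, with function name and units as assumptions:

```python
def depth_from_disparity(disparity_px: float, focal_px: float,
                         baseline_m: float) -> float:
    """Classic pinhole-stereo relation for a rectified camera pair:
    depth (metres) = focal length (pixels) * baseline (metres) / disparity (pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, with a 1000 px focal length and a 6 cm camera baseline, a 10 px disparity corresponds to an object 6 m away; nearer objects produce larger disparities.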

The control unit 170 can detect the user's gesture based on the images captured by the photographing unit 155, the signals sensed by the sensor unit (not shown), or a combination thereof.

The microphone 198 collects an audio signal. In particular, for video communication, an audio signal of a user may be collected. In this case, the collected audio signal may be an audio signal of a multi channel mode. The audio signal collected by the microphone 198 may be input to the controller 170.

On the other hand, the microphone 198 may be embedded in the upper portion of the image display device 100 or disposed separately. The microphone 198 may be provided separately from the photographing unit 155, but alternatively, the microphone 198 may be provided integrally with the photographing unit 155.

The remote control apparatus 200 transmits user input to the user input interface unit 150. To this end, the remote control apparatus 200 can use Bluetooth, RF (Radio Frequency) communication, infrared (IR) communication, UWB (Ultra Wideband), ZigBee, or the like. The remote control apparatus 200 can also receive the video, audio, or data signal output from the user input interface unit 150 and display it or output it as sound.

Meanwhile, the video display device 100 may be a digital broadcast receiver capable of receiving a fixed or mobile digital broadcast.

Meanwhile, a block diagram of the image display apparatus 100 shown in FIG. 3 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 actually implemented. That is, two or more constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.

On the other hand, unlike the illustration in FIG. 3, the image display apparatus 100 may not include the tuner 110 and the demodulator 120, and may instead receive and reproduce broadcast content through the network interface unit 135 or the external device interface unit 130.

FIG. 4 is an internal block diagram of the control unit of FIG. 3.

Referring to FIG. 4, the controller 170 of the image display apparatus of FIG. 3 according to an embodiment of the present invention may include a demultiplexer 310, an image processor 320, a processor 330, an OSD generator 340, a mixer 345, a frame rate converter 350, a formatter 360, and an image encoder 370. In addition, it may include an audio processing unit (not shown) and a data processing unit (not shown).

The demultiplexer 310 demultiplexes the input stream. For example, when an MPEG-2 TS is input, it can be demultiplexed into video, audio, and data signals, respectively. The stream signal input to the demultiplexer 310 may be a stream signal output from the tuner 110 or the demodulator 120, the external device interface 130, or the network interface 135.

The image processor 320 may perform image processing of an input image signal. For this, the image processing unit 320 may include a video decoder 325 and a scaler 335.

The video decoder 325 decodes the demultiplexed video signal and the scaler 335 performs scaling so that the resolution of the decoded video signal can be output from the display 180. The video decoder 325 can include a decoder of various standards.

On the other hand, the image signal decoded by the image processing unit 320 can be divided into a case where there is only a 2D image signal, a case where a 2D image signal and a 3D image signal are mixed, and a case where there is only a 3D image signal.

For example, an external video signal input from the external device 190 or a broadcast video signal received from the tuner unit 110 may contain only a 2D video signal, a mixture of 2D and 3D video signals, or only a 3D video signal. Accordingly, the controller 170, and in particular the image processing unit 320, can process these signals and output a 2D video signal, a mixed signal of 2D and 3D video signals, or a 3D video signal.

Meanwhile, the image signal decoded by the image processing unit 320 may be a 3D image signal in various formats. For example, it may be a 3D image signal composed of a color image and a depth image, or a 3D image signal composed of a plurality of viewpoint image signals. The plurality of viewpoint image signals may include, for example, a left-eye image signal and a right-eye image signal.

Here, the formats of the 3D video signal include a side-by-side format in which the left-eye image signal L and the right-eye image signal R are arranged left and right, a top-and-bottom format in which they are arranged up and down, a frame sequential format in which they are arranged by time division, an interlaced format in which the left-eye and right-eye image signals are mixed line by line, and a checker box format in which the left-eye and right-eye image signals are mixed box by box.
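As a concrete illustration of some of these packing formats, here is a small sketch using numpy (an assumption for illustration; the patent names the formats but prescribes no implementation). Each eye view is subsampled so the packed frame keeps the original resolution:

```python
import numpy as np

def pack_3d(left: np.ndarray, right: np.ndarray, fmt: str) -> np.ndarray:
    """Pack left-eye and right-eye frames into one frame in the named format."""
    if fmt == "side_by_side":
        # halve horizontal resolution of each eye, place left | right
        return np.concatenate([left[:, ::2], right[:, ::2]], axis=1)
    if fmt == "top_and_bottom":
        # halve vertical resolution of each eye, place left over right
        return np.concatenate([left[::2, :], right[::2, :]], axis=0)
    if fmt == "interlaced":
        # even lines from the left-eye image, odd lines from the right-eye image
        out = left.copy()
        out[1::2] = right[1::2]
        return out
    raise ValueError(f"unsupported format: {fmt}")
```

The frame sequential and checker box formats follow the same pattern, alternating the two views in time or in rectangular blocks rather than in columns or lines.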

Meanwhile, in relation to an embodiment of the present invention, the image processor 320 may receive the image signal captured by the photographing unit 155 and perform signal processing on it. If the captured image signal is an encoded signal, it may be decoded; if it is not encoded, separate decoding may be skipped. The resulting video signal may then undergo resolution adjustment in the scaler.

For example, when the captured image signal received from the counterpart image display apparatus is input to the image processor 320, the image decoder 325 may decode it, since the captured image signal is an encoded signal, and the scaler 335 may perform scaling on the decoded video signal.

As another example, when the captured image signal received from the photographing unit 155 attached to the image display apparatus 100 is input to the image processor 320, the image decoder 325 may not perform separate decoding, since the captured image signal is an unencoded signal. The scaler 335 may perform scaling on the captured image signal.

The scaler 335 may vary the size of the captured image received from the counterpart image display apparatus and the size of the captured image received from the photographing unit 155 attached to the image display apparatus 100, according to user settings during video communication. For example, the captured image received from the photographing unit 155 attached to the image display apparatus 100 may be scaled up.
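A minimal sketch of this scaling step, assuming nearest-neighbour interpolation (the disclosure does not specify the scaler's interpolation method, and the function name is hypothetical):

```python
# Hedged sketch: resize a captured image (2D list of pixel values) to a
# user-selected output size using nearest-neighbour sampling.

def scale_nearest(img, out_w, out_h):
    in_h, in_w = len(img), len(img[0])
    # Each output pixel samples the nearest source pixel.
    return [[img[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]

user_img = [[1, 2], [3, 4]]
scaled_up = scale_nearest(user_img, 4, 4)   # scale up the local user image
```

During video communication, the same routine could be applied with different target sizes to the local user image and the counterpart image, realizing the user-adjustable sizing described above.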

The processor 330 may control the overall operation of the image display apparatus 100 or of the controller 170. For example, the processor 330 may control signal processing for video communication. As another example, the processor 330 may control the tuner unit 110 to tune to the RF broadcast corresponding to a channel selected by the user or a pre-stored channel.

In addition, the processor 330 may control the image display apparatus 100 according to a user command input through the user input interface unit 150 or an internal program.

In addition, the processor 330 may perform data transfer control with the network interface unit 135 or the external device interface unit 130.

The processor 330 may control operations of the demultiplexing unit 310, the image processing unit 320, the OSD generating unit 340, and the like in the controller 170.

The OSD generator 340 generates an OSD signal automatically or according to a user input. For example, based on a user input signal, it can generate a signal for displaying various information as graphics or text on the screen of the display 180. The generated OSD signal may include various data such as a user interface screen of the image display apparatus 100, various menu screens, widgets, and icons. In addition, the generated OSD signal may include a 2D object or a 3D object.

The OSD generating unit 340 can generate a pointer that can be displayed on the display, based on the pointing signal input from the remote control device 200. In particular, such a pointer may be generated by a pointing signal processing unit, and the OSD generating unit 340 may include such a pointing signal processing unit (not shown). Of course, the pointing signal processing unit (not shown) may also be provided separately from the OSD generating unit 340.

The mixer 345 may mix the OSD signal generated by the OSD generator 340 and the decoded video signal processed by the image processor 320. At this time, the OSD signal and the decoded video signal may include at least one of a 2D signal and a 3D signal. The mixed video signal is supplied to a frame rate converter 350.

A frame rate converter (FRC) 350 can convert the frame rate of an input image. Alternatively, the frame rate converter 350 can also output the input image as-is, without frame rate conversion.
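A simple FRC can be sketched by frame repetition (an assumption for illustration; real converters may also interpolate motion, which is not attempted here, and the function name is hypothetical):

```python
# Hypothetical frame rate converter sketch: integer rate multiples are
# handled by repeating each frame; equal rates pass through unconverted.

def convert_frame_rate(frames, in_hz, out_hz):
    if out_hz == in_hz:
        return list(frames)          # output without conversion
    if out_hz % in_hz != 0:
        raise ValueError("only integer rate multiples handled in this sketch")
    repeat = out_hz // in_hz
    # e.g. 60 Hz -> 120 Hz repeats every frame twice.
    return [f for f in frames for _ in range(repeat)]

doubled = convert_frame_rate(["f0", "f1"], 60, 120)
```

This mirrors the two behaviours described above: conversion of the frame rate, or pass-through output without conversion.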

The formatter 360 can arrange the frame rate converted 3D image.

The formatter 360 receives the mixed signal, i.e., the OSD signal and the decoded video signal, from the mixer 345, and separates the 2D video signal and the 3D video signal.

In the present specification, a 3D video signal means a signal for a 3D object. Examples of the 3D object include a picture-in-picture (PIP) image (still image or moving picture), an EPG indicating broadcast program information, icons, text, objects within an image, a person, a background, and a web screen (newspaper, magazine, etc.).

On the other hand, the formatter 360 can change the format of the 3D video signal. For example, when a 3D image is input in one of the various formats described above, it can be changed to a multiple viewpoint image; in particular, the multiple viewpoint images can be arranged to repeat. As a result, a glasses-free 3D image may be displayed.

Meanwhile, the formatter 360 may convert a 2D video signal into a 3D video signal. For example, according to a 3D image generation algorithm, edges or selectable objects may be detected in the 2D image signal, the objects according to the detected edges may be separated, and a 3D image signal may be generated therefrom. At this time, the generated 3D image signal may be a multiple viewpoint image signal as described above.

Although not shown in the drawing, a 3D processor (not shown) for 3D-effect signal processing may be further disposed after the formatter 360. The 3D processor (not shown) can process the brightness, tint, and color of the image signal to enhance the 3D effect.

Meanwhile, in relation to an embodiment of the present invention, the captured image signal received from the photographing unit 155 may be input to the formatter 360. The formatter 360 may change the format of the captured image signal in order to transmit the captured image (user image) to the second image display apparatus 600. The captured video signal may be converted into one of a 2D video signal, a glasses-type 3D video signal, and a glasses-free 3D video signal.

For example, when the image displayed on the counterpart image display apparatus is a glasses-type 3D image, and the format of the captured image signal is a glasses-free 3D image signal, the captured image signal may be converted into a glasses-type 3D image signal.

In addition, the format-converted image signal may be input to the image encoder 370, and the image encoder 370 may encode the image signal input from the formatter 360. To this end, the image encoder 370 may be provided with encoders of various standards.

Although not shown in the drawing, a multiplexing unit (not shown) may preferably be further provided after the image encoder 370. The multiplexing unit may multiplex the video signal encoded by the image encoder 370 and the audio signal encoded by an audio encoder (not shown), and convert them into a signal suitable for transmission to the counterpart image display apparatus 600.

The encoded video signal or audio signal or multiplexed signal may be transmitted to the counterpart image display apparatus 600 through the network interface 135.

Meanwhile, the audio processing unit (not shown) in the control unit 170 can perform the audio processing of the demultiplexed audio signal. To this end, the audio processing unit (not shown) may include various decoders.

In addition, the audio processing unit (not shown) in the control unit 170 can process bass, treble, volume control, and the like.

The data processing unit (not shown) in the control unit 170 can perform data processing of the demultiplexed data signal. For example, if the demultiplexed data signal is an encoded data signal, it can be decoded. The encoded data signal may be EPG (Electronic Program Guide) information including broadcast information such as the start time and end time of broadcast programs broadcast on each channel.

FIG. 4 shows that the signals from the OSD generating unit 340 and the image processing unit 320 are mixed in the mixer 345 and then 3D-processed in the formatter 360. However, the mixer may also be located after the formatter. That is, the output of the image processing unit 320 may be 3D-processed by the formatter 360, the OSD generating unit 340 may perform 3D processing together with OSD generation, and the processed 3D signals may then be mixed by the mixer 345.

Meanwhile, the block diagram of the controller 170 shown in FIG. 4 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the controller 170 as actually implemented.

In particular, the frame rate converter 350 and the formatter 360 may not be provided in the controller 170, and may each be provided separately.

FIG. 5 is a diagram illustrating a control method of the remote control device of FIG. 3.

FIG. 5(a) illustrates that a pointer 205 corresponding to the remote control device 200 is displayed on the display 180.

The user can move or rotate the remote control device 200 up and down, left and right (FIG. 5(b)), and back and forth (FIG. 5(c)). The pointer 205 displayed on the display 180 of the image display apparatus corresponds to the movement of the remote control device 200. Because the pointer 205 is moved and displayed according to movement in 3D space, as shown in the figure, the remote control device 200 may be referred to as a spatial remote controller.

FIG. 5(b) illustrates that when the user moves the remote control device 200 to the left, the pointer 205 displayed on the display 180 of the image display apparatus also moves to the left correspondingly.

Information on the motion of the remote control device 200, sensed through the sensors of the remote control device 200, is transmitted to the image display apparatus. The image display apparatus can calculate the coordinates of the pointer 205 from the information on the motion of the remote control device 200, and can display the pointer 205 so as to correspond to the calculated coordinates.
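The coordinate calculation described above can be sketched minimally (an assumption for illustration; the function name, screen size, and delta convention are all hypothetical): the apparatus accumulates motion deltas reported by the remote control and clamps the pointer position to the screen.

```python
# Hedged sketch of pointer coordinate calculation from remote-control motion:
# add the reported motion delta to the current position, clamped to the screen.

def update_pointer(pos, delta, screen=(1920, 1080)):
    x = min(max(pos[0] + delta[0], 0), screen[0] - 1)
    y = min(max(pos[1] + delta[1], 0), screen[1] - 1)
    return (x, y)

pos = (960, 540)                       # pointer at screen centre
pos = update_pointer(pos, (-100, 0))   # remote moved to the left
```

Clamping keeps the pointer on the display even when the remote control keeps moving past the screen edge.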

FIG. 5(c) illustrates a case in which the user moves the remote control device 200 away from the display 180 while pressing a specific button on the remote control device 200. Thereby, the selected area in the display 180 corresponding to the pointer 205 can be zoomed in and displayed enlarged. Conversely, when the user moves the remote control device 200 closer to the display 180, the selected area in the display 180 corresponding to the pointer 205 can be zoomed out and displayed reduced. Alternatively, the selected area may be zoomed out when the remote control device 200 moves away from the display 180, and zoomed in when the remote control device 200 approaches the display 180.

On the other hand, while the specific button on the remote control device 200 is pressed, recognition of the up, down, left, and right movement may be excluded. That is, when the remote control device 200 moves away from or toward the display 180, the up, down, left, and right movements are not recognized, and only the back-and-forth movement is recognized. When the specific button on the remote control device 200 is not pressed, only the pointer 205 moves in accordance with the up, down, left, and right movement of the remote control device 200.

On the other hand, the moving speed and moving direction of the pointer 205 may correspond to the moving speed and moving direction of the remote control device 200.

FIG. 6 is an internal block diagram of the remote control device of FIG. 3.

Referring to the drawing, the remote control device 200 may include a wireless communication unit 420, a user input unit 430, a sensor unit 440, an output unit 450, a power supply unit 460, a storage unit 470, and a controller 480.

The wireless communication unit 420 transmits and receives signals to and from any one of the image display apparatuses according to the embodiments of the present invention described above. Among these image display apparatuses, one image display apparatus 100 will be described as an example.

In this embodiment, the remote control apparatus 200 may include an RF module 421 capable of transmitting and receiving signals with the image display apparatus 100 according to the RF communication standard. In addition, the remote control apparatus 200 may include an IR module 423 capable of transmitting and receiving signals to and from the image display apparatus 100 according to the IR communication standard.

In the present embodiment, the remote control device 200 transmits a signal containing information on the motion and the like of the remote control device 200 to the image display device 100 through the RF module 421.

Also, the remote control device 200 can receive the signal transmitted by the image display apparatus 100 through the RF module 421. In addition, the remote control device 200 can transmit commands regarding power on/off, channel change, volume change, and the like to the image display apparatus 100 through the IR module 423 as necessary.

The user input unit 430 may include a keypad, buttons, a touchpad, or a touch screen. The user can input commands related to the image display apparatus 100 to the remote control device 200 by operating the user input unit 430. When the user input unit 430 includes hard key buttons, the user can input commands related to the image display apparatus 100 to the remote control device 200 by pushing the hard key buttons. When the user input unit 430 has a touch screen, the user can input commands related to the image display apparatus 100 to the remote control device 200 by touching soft keys of the touch screen. In addition, the user input unit 430 may include various types of input means that can be operated by the user, such as a scroll key or a jog key, and this embodiment does not limit the scope of the present invention.

The sensor unit 440 may include a gyro sensor 441 or an acceleration sensor 443. The gyro sensor 441 can sense information about the motion of the remote control device 200.

For example, the gyro sensor 441 can sense information about the operation of the remote control device 200 based on the x, y, and z axes. The acceleration sensor 443 can sense information on the moving speed and the like of the remote control device 200. On the other hand, a distance measuring sensor can be further provided, whereby the distance to the display 180 can be sensed.

The output unit 450 may output an image or audio signal corresponding to the operation of the user input unit 430 or to the signal transmitted from the image display apparatus 100. Through the output unit 450, the user can recognize whether the user input unit 430 has been operated or whether the image display apparatus 100 has been controlled.

For example, the output unit 450 may include an LED module 451 that lights up when the user input unit 430 is operated or when a signal is transmitted to or received from the image display apparatus 100 through the wireless communication unit 420, a vibration module 453 for generating vibration, an audio output module 455 for outputting sound, or a display module 457 for outputting an image.

The power supply unit 460 supplies power to the remote control device 200. The power supply unit 460 can reduce power waste by interrupting the power supply when the remote controller 200 is not moving for a predetermined period of time. The power supply unit 460 may resume power supply when a predetermined key provided in the remote control device 200 is operated.

The storage unit 470 may store various programs, application data, and the like necessary for the control or operation of the remote control device 200. When the remote control device 200 transmits and receives signals wirelessly to and from the image display apparatus 100 through the RF module 421, the signals are exchanged over a predetermined frequency band. The controller 480 of the remote control device 200 may store, in the storage unit 470, information on the frequency band over which signals can be wirelessly exchanged with the image display apparatus 100 paired with the remote control device 200, and may refer to this information.

The controller 480 controls various matters related to the control of the remote control device 200. The controller 480 may transmit a signal corresponding to a predetermined key operation of the user input unit 430, or a signal corresponding to the motion of the remote control device 200 sensed by the sensor unit 440, to the image display apparatus 100 through the wireless communication unit 420.

The user input interface unit 150 of the image display apparatus 100 may include a wireless communication unit 411 capable of wirelessly transmitting and receiving signals to and from the remote control device 200, and a coordinate value calculation unit 415 capable of calculating the coordinate value of a pointer corresponding to the operation of the remote control device 200.

The user input interface unit 150 can wirelessly transmit and receive signals to and from the remote control device 200 through the RF module 412. It can also receive a signal transmitted by the remote control device 200 through the IR module 413 according to the IR communication standard.

The coordinate value calculation unit 415 may correct hand shake or errors in the signal corresponding to the operation of the remote control device 200 received through the wireless communication unit 411, and may calculate the coordinate value (x, y) of the pointer 205 to be displayed on the display 180.

The transmission signal of the remote control device 200, input to the image display apparatus 100 through the user input interface unit 150, is transmitted to the controller 170 of the image display apparatus 100. The controller 170 can determine information on the operation and key manipulation of the remote control device 200 from the signal transmitted by the remote control device 200, and can control the image display apparatus 100 accordingly.

As another example, the remote control device 200 may itself calculate the pointer coordinate value corresponding to its operation and output it to the user input interface unit 150 of the image display apparatus 100. In this case, the user input interface unit 150 of the image display apparatus 100 can transmit information on the received pointer coordinate value to the controller 170 without a separate hand shake or error correction process.

As another example, the coordinate value calculating unit 415 may be provided in the control unit 170 instead of the user input interface unit 150, unlike the drawing.

FIG. 7 is a view for explaining how an image is formed by a left-eye image and a right-eye image, and FIG. 8 is a view for explaining the depth of a 3D image according to the interval between a left-eye image and a right-eye image.

First, referring to FIG. 7, a plurality of images or a plurality of objects 515, 525, 535, 545 are illustrated.

First, the first object 515 is exemplified as including a first left-eye image 511 (L) based on a first left-eye image signal and a first right-eye image 513 (R) based on a first right-eye image signal, with the interval between the first left-eye image 511 (L) and the first right-eye image 513 (R) on the display 180 being d1. At this time, the user perceives an image as being formed at the intersection of an extension line connecting the left eye 501 and the first left-eye image 511 and an extension line connecting the right eye 503 and the first right-eye image 513. Accordingly, the user perceives the first object 515 as being located behind the display 180.

Next, the second object 525 includes the second left-eye image 521 (L) and the second right-eye image 523 (R), which are displayed overlapping each other on the display 180, so the interval between them is 0. Accordingly, the user perceives the second object 525 as being located on the display 180.

Next, the third object 535 includes the third left-eye image 531 (L) and the third right-eye image 533 (R), and the fourth object 545 includes the fourth left-eye image 541 (L) and the fourth right-eye image 543 (R), displayed on the display 180 with intervals of d3 and d4, respectively.

According to the method described above, the user perceives the third object 535 and the fourth object 545 at the respective positions where the images are formed, that is, as being located in front of the display 180 in the drawing.

At this time, the fourth object 545 is perceived as being in front of the third object 535, that is, as protruding more than the third object 535. This is because the interval d4 between the fourth left-eye image 541 (L) and the fourth right-eye image 543 (R) is larger than the interval d3 between the third left-eye image 531 (L) and the third right-eye image 533 (R).

Meanwhile, in the embodiments of the present invention, the apparent distance between the display 180 and the objects 515, 525, 535, and 545 as perceived by the user is expressed as a depth. The depth of an object perceived as being located behind the display 180 is assumed to have a negative value (-), and the depth of an object perceived as being located in front of the display 180 is assumed to have a positive value (+). That is, the more an object protrudes toward the user, the greater its depth.

Referring to FIG. 8, the interval a between the left-eye image 601 and the right-eye image 602 in FIG. 8(a) is smaller than the interval b between the left-eye image 601 and the right-eye image 602 in FIG. 8(b). It can thus be seen that the depth a' of the 3D object in FIG. 8(a) is smaller than the depth b' of the 3D object in FIG. 8(b).

In this way, when a 3D image is composed of a left-eye image and a right-eye image, the position at which the image is perceived to form differs according to the interval between the left-eye image and the right-eye image. Accordingly, by adjusting the display interval between the left-eye image and the right-eye image, the depth of a 3D image or 3D object composed of a left-eye image and a right-eye image can be adjusted.
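The geometry of FIGS. 7 and 8 can be made concrete with a worked example. The following similar-triangles formula is a standard stereoscopy result assumed here for illustration, not a formula taken from this disclosure: for a crossed disparity d on a screen at viewing distance D, seen with eye separation e, the perceived protrusion in front of the screen is D * d / (e + d).

```python
# Illustrative depth-from-disparity sketch (values in millimetres are
# assumptions): larger image interval -> greater perceived depth, as in FIG. 8.

def protrusion_depth(d_mm, D_mm=2000.0, e_mm=65.0):
    # Crossed-disparity case: the rays from the two eyes intersect in front
    # of the screen, at D * d / (e + d) from the screen plane.
    return D_mm * d_mm / (e_mm + d_mm)

a = protrusion_depth(10.0)   # smaller interval, like FIG. 8(a)
b = protrusion_depth(30.0)   # larger interval, like FIG. 8(b)
assert a < b                 # depth a' < depth b'
```

The formula is monotonically increasing in d, matching the observation that a larger left/right interval yields a greater depth.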

FIG. 9 is a diagram for explaining the principle of a glasses-free stereoscopic image display apparatus.

As described above, glasses-free stereoscopic image display systems include the lenticular system, the parallax system, and a system using a microlens array. Hereinafter, the lenticular system and the parallax system will be described in detail. In addition, the viewpoint images will be described below using a left-eye viewpoint image and a right-eye viewpoint image as an example, but this is for convenience of description and the present invention is not limited thereto.

FIG. 9(a) is a view showing a lenticular system using a lenticular lens. Referring to FIG. 9(a), blocks 720 (L) constituting a left-eye viewpoint image and blocks 710 (R) constituting a right-eye viewpoint image may be alternately arranged on the display 180. At this time, each block may include a plurality of pixels, or may consist of a single pixel. Hereinafter, the case where each block consists of one pixel will be mainly described.

In the lenticular method, a lenticular lens 195a is disposed in the lens unit 195, and the lenticular lens 195a disposed in front of the display 180 can change the traveling direction of the light emitted from the pixels 710 and 720. For example, the light emitted from the pixels 720 (L) constituting the left-eye viewpoint image is redirected toward the viewer's left eye 701, and the light emitted from the pixels 710 (R) constituting the right-eye viewpoint image is redirected toward the viewer's right eye 702.

Accordingly, in the left eye 701, the light emitted from the pixels 720 (L) constituting the left-eye viewpoint image is combined so that the left-eye viewpoint image is seen, and in the right eye 702, the light emitted from the pixels 710 (R) constituting the right-eye viewpoint image is combined so that the right-eye viewpoint image is seen. Thus, the viewer perceives a stereoscopic image without wearing glasses.

FIG. 9(b) is a diagram showing a parallax system using a slit array. Referring to FIG. 9(b), as in FIG. 9(a), pixels 720 (L) constituting the left-eye viewpoint image and pixels 710 (R) constituting the right-eye viewpoint image may be alternately arranged on the display 180. In the parallax system, a slit array 195b is disposed in the lens unit 195, and the slit array 195b functions as a barrier so that the light emitted from the pixels can travel only in a certain direction. Accordingly, in the same manner as in the lenticular method, the user sees the left-eye viewpoint image with the left eye 701 and the right-eye viewpoint image with the right eye 702, and the viewer perceives a stereoscopic image without wearing glasses.

FIGS. 10 to 14 are views for explaining the principle of an image display apparatus displaying glasses-free 3D images.

FIG. 10 is a view showing an image display apparatus 100 including three viewpoint regions 821, 822, and 823. The pixels constituting the three viewpoint images respectively displayed in the three viewpoint regions may be rearranged and displayed on the display 180, as shown in FIG. 10. Here, rearranging the pixels means changing the value of the pixel displayed on the display 180, not changing the physical position of the pixel.

The three viewpoint images may be images of the object 910 taken in different directions as shown in FIG.

The first pixel 811 displayed on the display 180 may be composed of a first subpixel 801, a second subpixel 802, and a third subpixel 803, and each of the subpixels 801, 802, and 803 may represent any one of red, green, and blue.

FIG. 10 shows one pattern in which the pixels constituting the three viewpoint images are rearranged and displayed. However, the present invention is not limited thereto, and the pixels may be rearranged and displayed in various patterns according to the lens unit 195.

In FIG. 10, the subpixels marked with the numeral 1 are subpixels constituting the first viewpoint image, the subpixels marked with the numeral 2 are subpixels constituting the second viewpoint image, and the subpixels marked with the numeral 3 may be subpixels constituting the third viewpoint image.

Accordingly, in the first viewpoint region 821, the subpixels marked with the numeral 1 combine to display the first viewpoint image; in the second viewpoint region 822, the subpixels marked with the numeral 2 combine to display the second viewpoint image; and in the third viewpoint region 823, the subpixels marked with the numeral 3 combine to display the third viewpoint image.

That is, the first viewpoint image 901, the second viewpoint image 902, and the third viewpoint image 903 shown in FIG. 11 represent the images displayed along the respective viewpoint directions. The first viewpoint image 901 may be an image taken in the first viewpoint direction, the second viewpoint image 902 an image taken in the second viewpoint direction, and the third viewpoint image 903 an image taken in the third viewpoint direction.

Therefore, when the viewer's left eye 922 is located in the third viewpoint region 823 and the right eye 921 is located in the second viewpoint region 822, as shown in FIG. 12(a), the left eye 922 sees the third viewpoint image 903 and the right eye 921 sees the second viewpoint image 902. Accordingly, as shown in FIG. 12(b), the viewer perceives the object 910 as being located in front of the display 180 by the principle described with reference to FIG. 7, and thus perceives a stereoscopic image (3D image). Likewise, when the viewer's left eye 922 is located in the second viewpoint region 822 and the right eye 921 is located in the first viewpoint region 821, a stereoscopic image (3D image) can be perceived as well.

On the other hand, as shown in FIG. 10, when the pixels of the multiple viewpoint images are rearranged only in the horizontal direction, the horizontal resolution is reduced to 1/n (n being the number of viewpoint images) compared to a 2D image. For example, the horizontal resolution of the stereoscopic image (3D image) of FIG. 10 is reduced to 1/3 of that of a 2D image. The vertical resolution, on the other hand, remains the same as that of the multiple viewpoint images 901, 902, and 903 before rearrangement.
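The horizontal-only rearrangement can be sketched as follows (one hypothetical layout among the many possible patterns; the function name is an assumption): column x of the output frame is taken from view (x mod n), so each view retains only 1/n of the horizontal resolution while the vertical resolution is unchanged.

```python
# Illustrative column-interleaving of n viewpoint images (2D lists of pixel
# values, all the same size) into one display frame, as in FIG. 10.

def interleave_views(views):
    n = len(views)
    height = len(views[0])
    width = len(views[0][0])
    # Output column x comes from view (x % n); rows are untouched.
    return [[views[x % n][y][x] for x in range(width)]
            for y in range(height)]

v1 = [[1, 1, 1], [1, 1, 1]]   # three 3-wide, 2-high viewpoint images
v2 = [[2, 2, 2], [2, 2, 2]]
v3 = [[3, 3, 3], [3, 3, 3]]
out = interleave_views([v1, v2, v3])   # each view keeps 1/3 of the columns
```

With n = 3, each view contributes every third column, matching the 1/3 horizontal resolution figure given above.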

When the number of viewpoint images per direction is large (the reason the number of viewpoint images should be increased will be described later with reference to FIG. 14), only the horizontal resolution is reduced relative to the vertical resolution, so there is a problem that the overall image quality may be degraded.

Accordingly, as shown in FIG. 13, the lens unit 195 may be disposed on the front surface of the display 180 inclined at a predetermined angle (α) to the vertical axis 185 of the display, and the subpixels constituting the multiple viewpoint images may be rearranged and displayed in various patterns according to the lens unit 195. FIG. 13 shows an image display apparatus including viewpoint images in 25 directions according to an embodiment of the present invention. In this case, the lens unit 195 may be a lenticular lens or a slit array.

In FIG. 13, the red subpixels constituting the sixth viewpoint image appear every five pixels in both the horizontal and vertical directions, so the horizontal and vertical resolutions of the stereoscopic image (3D image) are each reduced to 1/5 of those of the per-direction viewpoint images before rearrangement. This balances the resolution degradation, compared with the conventional method in which only the horizontal resolution is reduced, to 1/25.

FIG. 14 is a diagram for explaining the sweet zone and the dead zone appearing in front of the image display apparatus.

When a stereoscopic image is viewed using the image display apparatus 100 as described above, a plurality of viewers can feel the stereoscopic effect without wearing special stereoscopic glasses, but only within a certain area. The area in which a viewer can view an optimal image can be defined by an optimal viewing distance (OVD) D and a sweet zone 1020. First, the optimal viewing distance D can be determined by the distance between the left and right eyes, the pitch of the lens unit, and the focal length of the lens. The sweet zone 1020 is an area in which a plurality of viewpoint regions are positioned sequentially, so that the viewer can stably perceive the stereoscopic effect. As shown in FIG. 14, when the viewer is located in the sweet zone 1020, the 12th to 14th viewpoint images are seen by the right eye 1001 and the 17th to 19th viewpoint images by the left eye 1002, so that the per-direction viewpoint images are seen sequentially by the left eye 1002 and the right eye 1001. Therefore, as described with reference to FIG. 12, a stereoscopic effect can be felt from the left-eye image and the right-eye image.

On the other hand, when the viewer leaves the sweet zone 1020 and is located in the dead zone 1015, for example when the 1st to 3rd viewpoint images are seen by the left eye 1003 and the 23rd to 25th viewpoint images by the right eye 1004, the per-direction viewpoint images are not seen sequentially by the left eye 1003 and the right eye 1004, an inversion of the left-eye image and the right-eye image may occur, and the stereoscopic effect may not be felt. In addition, when the 1st viewpoint image and the 25th viewpoint image are seen simultaneously by the left eye 1003 or the right eye 1004, dizziness may be felt.

The size of the sweet zone 1020 can be defined by the number n of per-direction viewpoint images and the distance corresponding to one viewpoint. Since the distance corresponding to one viewpoint must be smaller than the distance between the viewer's eyes, there is a limit to increasing it; therefore, to increase the size of the sweet zone 1020, the number n of per-direction viewpoint images must be increased.
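The relation just stated can be checked with a back-of-envelope sketch (the numeric values and the 65 mm interocular bound are illustrative assumptions, not figures from this disclosure):

```python
# Rough sketch: sweet-zone width ~ (number of per-direction views) x
# (distance corresponding to one viewpoint), where the per-viewpoint
# distance must stay below the interocular distance (~65 mm assumed).

def sweet_zone_width(n_views, per_view_mm):
    assert per_view_mm < 65.0, "one-viewpoint distance must be below eye spacing"
    return n_views * per_view_mm

wide = sweet_zone_width(25, 20.0)    # more views -> wider sweet zone
narrow = sweet_zone_width(9, 20.0)
```

Since the per-viewpoint distance is bounded, the only free parameter for widening the sweet zone is the view count n, as the text concludes.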

As described above, the image display apparatus 100 must display a plurality of viewpoint images, including a left-eye image and a right-eye image, in order to display a 3D image, and in particular must include the lens unit 195 described above in order to display a glasses-free 3D image.

Therefore, when video communication is performed between image display apparatuses having different display methods, a received counterpart image may not be displayable.

For example, when the user image transmitted from the first image display apparatus 100 is a 3D image and the second image display apparatus 600 supports only 2D display, the second image display apparatus 600 cannot display the image received from the first image display apparatus 100.

To solve this problem, an embodiment of the present invention provides an image display apparatus that converts a user image into one of a 2D image, a glasses-type 3D image, and a glasses-free 3D image based on information about the counterpart image display apparatus, and transmits the converted image. Hereinafter, the image display apparatus will be described in detail.

For convenience of description, the user image displayed on the first image display apparatus 100 is called a first image, and the image transmitted to the second image display apparatus (counterpart image display apparatus) 600 for video communication is called a second image. The second image may be displayed on the second image display apparatus 600.

FIG. 15 is a flowchart illustrating a method of operating an image display apparatus according to an exemplary embodiment of the present invention, and FIGS. 16 to 22 are views referred to in explaining the operating method of FIG. 15.

First, referring to FIG. 15, when image communication is connected to the second image display apparatus, the first image display apparatus receives the first image (S1110). The first image 1210 is a user image of the first image display apparatus 100 used in video communication, and may be displayed on the first image display apparatus 100 as shown in FIG. 16.

Also, the first image 1210 may be an image input from a camera embedded in the first image display apparatus 100 or a camera disposed separately, or may be an image previously stored in the storage 140. In particular, the first image 1210 may include an image of a user of the first image display apparatus 100.

In addition, when the photographing unit 155 includes a single camera, the first image 1210 may be a 2D image input from the camera.

Alternatively, the controller 170 may determine an image input from the camera to be a left eye image and generate a right eye image based on the determined left eye image, so that the first image 1210 is a 3D image including the left eye image and the right eye image.

On the other hand, when the photographing unit 155 includes a first camera 155a and a second camera 155b (i.e., is configured as a stereo camera), the first image 1210 may be a 2D image input from either of the first and second cameras 155a and 155b, or a 3D image generated based on the images input from the first and second cameras 155a and 155b.

For example, a 3D image may be generated by using the images input from the first and second cameras 155a and 155b as the left eye image and the right eye image, respectively, and the generated image may be used as the first image 1210.

The first image display apparatus 100 converts the first image into a second image based on information about the second image display apparatus 600, and the converted second image may be any one of a 2D image, a glasses-type 3D image, and a glasses-free 3D image (S1120).

For example, as shown in FIG. 16A, when the first image 1210 is a 3D image and the second image display apparatus 600 can display only a 2D image, the first image display apparatus 100 converts the first image 1210 into a 2D image (second image) and transmits it to the second image display apparatus 600. In this case, one of the left eye image and the right eye image of the first image (3D image) may be transmitted. The second image display apparatus 600 may then display the received 2D image (second image) 1220.

In addition, as shown in FIG. 16B, when the first image 1210 is a 2D image and the second image display apparatus 600 displays a 3D image (glasses type or glasses-free type), the first image display apparatus 100 may determine the first image 1210 to be a left eye image, generate a right eye image based on the determined left eye image, and transmit the left eye image and the right eye image. The second image display apparatus 600 may then display the second image (3D image) 1220 using the received left eye image and right eye image.

In addition, the first image display apparatus 100 may convert the left eye image and the right eye image into a 3D image of a predetermined format and transmit it. For example, the image may be converted into any one of a side-by-side format in which the left eye and right eye video signals are arranged left and right, a top/down format in which they are arranged up and down, a frame sequential format in which they are arranged in time division, an interlaced format in which the left eye and right eye video signals are mixed line by line, and a checker box format in which they are mixed box by box.
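Two of the packing formats listed above can be illustrated with a small sketch; side-by-side and top/down packing each halve one dimension of each eye's image so the packed frame keeps the original resolution. This is a generic illustration, not the apparatus's implementation; frames are modeled as nested lists of pixel values.

```python
def pack_side_by_side(left, right):
    # Keep every other column of each eye, then join each row as left|right.
    return [l[::2] + r[::2] for l, r in zip(left, right)]

def pack_top_down(left, right):
    # Keep every other row of each eye, then stack left-eye rows over right-eye rows.
    return left[::2] + right[::2]

# 4x4 test frames: the left eye is all 1s, the right eye all 2s.
left  = [[1, 1, 1, 1] for _ in range(4)]
right = [[2, 2, 2, 2] for _ in range(4)]
sbs = pack_side_by_side(left, right)   # 4 rows, each [1, 1, 2, 2]
tb  = pack_top_down(left, right)       # 2 rows of 1s above 2 rows of 2s
```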

In particular, when the first image 1210 is a 2D image and the second image display apparatus 600 displays a glasses-free 3D image comprising a multi-view image, the first image display apparatus 100 may determine the first image 1210 to be a first viewpoint image, generate a plurality of viewpoint images (the second image) including a second viewpoint image, a third viewpoint image, and so on based on the first viewpoint image, and transmit them to the second image display apparatus 600.

When the first image display apparatus 100 converts the first image 1210 into a 3D image, it may recognize an object included in the first image 1210 using the stereo cameras 155a and 155b, determine the actual distance between the object and the stereo camera, and apply a preset depth value corresponding to the determined distance. In this case, the distance between the real object and the stereo camera may be determined from the distance between the stereo cameras 155a and 155b, the convergence distance of the cameras, and the field of view of the cameras.
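The passage names the camera baseline, convergence distance, and field of view as the inputs for distance determination but gives no formula. A common stereo-triangulation sketch is shown below, with a made-up distance-to-depth lookup table standing in for the "preset depth value"; none of the numbers are from this document.

```python
def object_distance(baseline_mm: float, focal_px: float, disparity_px: float) -> float:
    """Standard parallel-camera triangulation, Z = f * B / d. This is a
    common approximation, not the patent's stated method."""
    if disparity_px <= 0:
        raise ValueError("object at infinity or invalid stereo match")
    return baseline_mm * focal_px / disparity_px

def depth_for_distance(distance_mm: float,
                       table=((500, 10), (1500, 6), (3000, 3))):
    """Map a measured distance to a preset depth value, as the passage
    describes; the (max distance, depth) pairs are invented examples."""
    for max_dist, depth in table:
        if distance_mm <= max_dist:
            return depth
    return 1  # very distant objects get minimal depth
```

With a 60 mm baseline, a 1000 px focal length, and a 30 px disparity, the object sits 2 m away and would receive depth value 3 under the example table.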

Alternatively, the number of multi-view images or the depth value of the converted 3D image may be determined based on at least one of the size of the second image display apparatus 600, the viewing distance of the second image display apparatus 600, and the parallax of a viewer watching the second image display apparatus 600.

For example, when the first image 1210 is converted into a 3D image (second image), as shown in FIG. 17, the depth value d1 of the 3D image may be increased when the size of the second image display apparatus 600a is large or the viewing distance is long. On the other hand, the depth value d2 of the 3D image may be reduced when the viewing distance of the second image display apparatus 600b is short or its size is small.
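A minimal sketch of this scaling behaviour, assuming a simple proportional rule: depth grows with screen size and viewing distance. The reference screen size and viewing distance are invented for illustration and are not stated in the document.

```python
def scaled_depth(base_depth: float, screen_inches: float, viewing_m: float,
                 ref_inches: float = 42.0, ref_viewing_m: float = 3.0) -> float:
    """Scale a base depth value by how much larger (or smaller) the
    counterpart screen and viewing distance are than a reference setup."""
    scale = (screen_inches / ref_inches) * (viewing_m / ref_viewing_m)
    return base_depth * scale
```

At the reference setup the depth is unchanged; a screen twice as large doubles it, and one half as large halves it, matching the qualitative behaviour of FIG. 17.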

In addition, when converting to a glasses-free 3D image, the number of multi-view images may be reduced when the size of the second image display apparatus 600b is small or the viewing position of the user is limited, as with a portable terminal. On the other hand, the number of multi-view images may be increased when the size of the counterpart image display apparatus 600a is large or a plurality of users are watching.

Meanwhile, the information about the second image display apparatus 600 described above may be data received from the second image display apparatus 600 at the request of the first image display apparatus 100, or data stored in advance in the storage 140 of the first image display apparatus 100 before the video communication.

In this case, the information about the second image display apparatus 600 may be stored together with information identifying the second image display apparatus 600, such as its model name, user name, and user network address.

Accordingly, the first image display apparatus 100 may receive a user input of any one of the model name, user name, and user network address of the second image display apparatus 600 to which the image is to be transmitted, and convert the first image 1210 into the second image 1220 based on the corresponding information about the second image display apparatus 600.

As described above, the first image display apparatus 100 converts the first image 1210 into the second image 1220 based on the information about the second image display apparatus 600 and transmits it to the second image display apparatus, so that an optimized image can be provided to viewers watching the second image 1220.

In addition, the first image display apparatus according to an embodiment of the present invention may convert the first image 1210 into a second image 1220 based on a user input.

For example, as shown in FIG. 18A, when the first image display apparatus 100 enters a video communication mode with the second image display apparatus 600, the display 180 displays a counterpart image 1305 and the first image 1210, which is the user image. Here, the counterpart image 1305 is an image received from the second image display apparatus 600 and may include a user image of the second image display apparatus 600.

When an input 1315 for selecting the first image 1210 is received, a menu window 1350 is displayed as shown in FIG. 18B. The menu window 1350 may include an image conversion item and a user image item.

The image conversion item includes objects 1301, 1302, and 1303 corresponding to the 2D image, the glasses-type 3D image, and the glasses-free 3D image, respectively. When the first image display apparatus 100 receives a user input selecting one of the objects 1301, 1302, and 1303, it may convert the first image 1210 into an image corresponding to the selected object and transmit the converted image to the second image display apparatus 600.

In this case, based on the information about the second image display apparatus 600, an object corresponding to an image that cannot be displayed on the second image display apparatus 600 may be displayed in a deactivated state.
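The deactivation behaviour can be sketched as a capability check when building the menu: each conversion mode is enabled only if the counterpart device reports it. The mode names and capability set are illustrative assumptions, not identifiers from this document.

```python
MODES = ("2D", "glasses_3d", "glasses_free_3d")

def menu_state(counterpart_caps: set) -> dict:
    """Return mode -> enabled flag for the image-conversion menu;
    unsupported modes come back disabled (shown deactivated)."""
    return {mode: (mode in counterpart_caps) for mode in MODES}

# A counterpart that supports 2D and glasses-type 3D but not glasses-free 3D,
# matching the situation shown in FIG. 18B.
state = menu_state({"2D", "glasses_3d"})
```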

For example, as shown in FIG. 18B, when the second image display apparatus 600 cannot display a glasses-free 3D image, the object 1303 corresponding to the glasses-free 3D image may be displayed in a deactivated state.

In addition, based on the information about the second image display apparatus 600, the object corresponding to the most appropriate of the images that can be displayed on the second image display apparatus 600 may be highlighted to inform the user.

On the other hand, when the user selects the object 1301 corresponding to the 2D image, the first image may be converted into the 2D image 1220 and transmitted to the second image display apparatus 600, as shown in FIG. 16A. When the object 1302 or 1303 corresponding to the glasses-type 3D image or the glasses-free 3D image is selected, the left eye image and right eye image, or the multi-view image, may be converted based on the information about the second image display apparatus 600 and transmitted to the second image display apparatus 600, as shown in FIG. 16B.

Meanwhile, the user image item likewise includes objects 1321, 1322, and 1323 corresponding to the 2D image, the glasses-type 3D image, and the glasses-free 3D image, and the first image display apparatus 100 may receive an input selecting one of these objects and display the user image as an image corresponding to the selected object.

On the other hand, as shown in FIG. 19, when the counterpart image 1305 is selected, a menu window 1360 related to the counterpart image is displayed, and the menu window 1360 may include objects 1371, 1372, and 1373 corresponding to the 2D image, the glasses-type 3D image, and the glasses-free 3D image.

In this case, when the object 1372 or 1373 corresponding to the glasses-type 3D image or the glasses-free 3D image is selected, the counterpart image 1305 may be displayed as a 3D image, and the depth value of the counterpart image (3D image) 1305 may be determined, as described above, based on the actual distance between the stereo camera of the first image display apparatus 100 and the object included in the captured image. That is, the depth value of the counterpart image 1305 may be applied in consideration of the viewing distance of the user of the first image display apparatus.

Meanwhile, the first image display apparatus 100 according to an exemplary embodiment of the present invention may convert the first image into the second image so that a predetermined region of the first image is focused.

For example, as shown in FIG. 20, when the first image 1410 and a rectangular object 1405 for region selection are displayed on the first image display apparatus 100, the user may select a predetermined region through an input adjusting the position and size of the rectangular object 1405.

In particular, as shown in FIG. 20B, when a first object 1421 and a second object 1422 are included in the first image 1410, the user may select either the first object 1421 or the second object 1422.

As described above, when a predetermined region of the first image 1410 is selected, the first image display apparatus 100 may convert the first image 1410 into a second image 1420 in which the selected region is enlarged, as shown in FIG. 20A, and transmit it to the second image display apparatus 600.

Alternatively, as illustrated in FIG. 20B, only the selected object 1421 may be converted into the second image 1420 to which the 3D effect is applied and transmitted to the second image display apparatus 600.

Meanwhile, the first image display apparatus 100 may display the depth adjusting object 1450 as shown in FIGS. 20A and 20B. In one example, the depth adjustment object may be a scroll bar. The user may adjust the depth value of the second image 1420 converted to the 3D image by using the scroll bar. In this case, the minimum value and the maximum value of the scroll bar are set according to the depth range that can be displayed on the second image display device 600.

In addition, the user may directly adjust the depth value of the second image 1420 converted to the 3D image by directly inputting a depth value for the 3D image or by inputting an increase or decrease button.

Meanwhile, the first image display apparatus 100 according to the embodiment may request a change of the second image transmitted to the second image display apparatus 600 according to an image communication environment.

For example, as illustrated in FIG. 21A, while the first image display apparatus 100 converts the first image 1510 into a 3D image (second image) 1520 and transmits it to the second image display apparatus 600, if the environment of the network 500 connecting the first image display apparatus 100 and the second image display apparatus 600 is not smooth, the first image display apparatus 100 may display a notification message 1530 requesting that the first image 1510 be converted into a 2D image, which has a relatively small amount of data. Accordingly, the first image 1510 may be converted into a 2D image 1540 and transmitted.
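A hedged sketch of this fallback logic, assuming a measured bandwidth and a minimum rate needed for 3D transmission; both the threshold and the message text are invented for illustration.

```python
def choose_transmit_mode(bandwidth_mbps: float, requested: str,
                         min_3d_mbps: float = 8.0):
    """Return (mode, notification). When the network cannot sustain a 3D
    stream, fall back to 2D (smaller data) and surface a notification,
    as in the FIG. 21 scenario; otherwise keep the requested mode."""
    if requested != "2D" and bandwidth_mbps < min_3d_mbps:
        return "2D", "Network congested: switching to 2D transmission"
    return requested, None
```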

The operating method when the first image display apparatus 100 performs video communication with the second image display apparatus 600 has been described above. However, the present invention is not limited thereto, and may also be applied when the first image display apparatus 100 performs video communication with a plurality of image display apparatuses.

Referring to FIG. 22, when the first image display apparatus 100 performs video communication with a plurality of image display apparatuses 610, 620, and 630, it may transmit images converted based on the information about each of the image display apparatuses 610, 620, and 630.

For example, when the second image display apparatus 610 displays a 2D image, the third image display apparatus 620 displays a glasses-type 3D image, and the fourth image display apparatus 630 displays a glasses-free 3D image, the first image display apparatus 100 may convert the first image 1610 into a 2D image 1620, a glasses-type 3D image 1630, and a glasses-free 3D image 1640, and transmit them to the second, third, and fourth image display apparatuses 610, 620, and 630, respectively.
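The one-to-many case above can be sketched as a per-recipient dispatch: the same user image is converted according to each device's reported display method. The device identifiers match FIG. 22, but the stand-in converter and its string output are illustrative assumptions.

```python
def convert(image: str, mode: str) -> str:
    # Stand-in for the real 2D / glasses-type 3D / glasses-free 3D conversion.
    return f"{image}:{mode}"

def broadcast(image: str, devices: dict) -> dict:
    """devices maps device-id -> supported display mode; each recipient
    gets the image converted for its own display method."""
    return {dev: convert(image, mode) for dev, mode in devices.items()}

out = broadcast("frame0", {"610": "2D",
                           "620": "glasses_3d",
                           "630": "glasses_free_3d"})
```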

The image display apparatus and the operating method thereof according to the present invention are not limited to the configurations and methods of the embodiments described above; all or part of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operating method of the image display apparatus of the present invention can be implemented as processor-readable code on a processor-readable recording medium provided in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices in which data readable by a processor is stored. Examples include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that the processor-readable code is stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

Claims (21)

Receiving a user image;
Converting the user image into one of a 2D image, a glasses-type 3D image, and a glasses-free 3D image based on the counterpart image display apparatus information; And
And transmitting the converted image to the counterpart image display device.
The method of claim 1,
Wherein the counterpart image display apparatus information includes at least one of a display method of the counterpart image display apparatus, a viewing distance, a size of a displayed image, and a parallax of a user viewing the counterpart image display apparatus.
The method of claim 1,
And the user image is a 3D image captured by a stereo camera and acquired based on the stereo camera information.
The method of claim 3,
The stereo camera information,
And a distance between the stereo cameras, a convergence distance of the stereo cameras, and a field of view of the stereo cameras.
The method of claim 1,
Before converting the user image,
And receiving the counterpart image display device information.
The method of claim 1,
Requesting video communication with the counterpart video display device;
Receiving a response to the video communication request from the counterpart video display device; And
Further comprising entering a video communication mode,
And receiving the user image, converting the user image, and transmitting the converted image to the counterpart image display apparatus in the video communication mode.
The method of claim 1,
Wherein the converting comprises:
Displaying objects corresponding to the 2D image, the glasses-type 3D image, and the glasses-free 3D image;
Receiving a user input of selecting one of the displayed objects; And
And converting the user image into an image corresponding to the selected object.
The method of claim 1,
The counterpart video display information is
And a pre-stored data based on at least one of a model name, a user name, and a user network address of the counterpart image display device.
The method of claim 1,
Wherein when the user image is converted into either the glasses-type 3D image or the glasses-free 3D image, the depth value of the glasses-type 3D image or the glasses-free 3D image is determined based on the information of the counterpart image display apparatus.
The method of claim 1,
Wherein when the user image is converted into a glasses-free 3D image, the number of multi-view images included in the glasses-free 3D image is determined based on the information of the counterpart image display apparatus.
The method of claim 1,
Wherein when the user image is converted into the glasses-type 3D image, the user image is converted into a predetermined format of the glasses-type 3D image based on the information of the counterpart image display apparatus.
A photographing unit for photographing a user image;
A display configured to display the user image and the counterpart image received from the counterpart image display device; And
And a controller configured to convert the user image into one of a 2D image, a glasses-type 3D image, and a glasses-free 3D image based on the counterpart image display apparatus information, and transmit the converted user image to the counterpart image display apparatus.
The method of claim 12,
The counterpart image display apparatus information includes at least one of a display method of the counterpart image display apparatus, a viewing distance, a size of a displayed image, and a parallax of a user who views the counterpart image display apparatus.
The method of claim 12,
The photographing unit is composed of a stereo camera,
And the user image is a 3D image obtained based on the stereo camera information.
The method of claim 14,
The stereo camera information,
And at least one of a distance between the stereo cameras, a convergence distance of the stereo camera, and a field of view of the stereo camera.
The method of claim 12,
And a storage unit in which the counterpart image display device information is pre-stored.
The method of claim 16,
The counterpart video display information is
And stored based on at least one of a model name, a user name, and a user network address of the counterpart image display device.
The method of claim 12,
The control unit,
Wherein the controller receives a user input selecting one of the objects corresponding to the 2D image, the glasses-type 3D image, and the glasses-free 3D image displayed on the display, and controls the user image to be converted into an image corresponding to the selected object.
The method of claim 12,
Wherein when the user image is converted into either the glasses-type 3D image or the glasses-free 3D image, the depth value of the glasses-type 3D image or the glasses-free 3D image is determined based on the information of the counterpart image display apparatus.
The method of claim 12,
Wherein when the user image is converted into a glasses-free 3D image, the number of multi-view images included in the glasses-free 3D image is determined based on the information of the counterpart image display apparatus.
The method of claim 12,
Wherein when the user image is converted into the glasses-type 3D image, the user image is converted into a predetermined format of the glasses-type 3D image based on the information of the counterpart image display apparatus.
KR1020120075634A 2012-07-11 2012-07-11 Image display apparatus, and method for operating the same KR20140008188A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120075634A KR20140008188A (en) 2012-07-11 2012-07-11 Image display apparatus, and method for operating the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120075634A KR20140008188A (en) 2012-07-11 2012-07-11 Image display apparatus, and method for operating the same

Publications (1)

Publication Number Publication Date
KR20140008188A true KR20140008188A (en) 2014-01-21

Family

ID=50142173

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120075634A KR20140008188A (en) 2012-07-11 2012-07-11 Image display apparatus, and method for operating the same

Country Status (1)

Country Link
KR (1) KR20140008188A (en)

Similar Documents

Publication Publication Date Title
KR101924058B1 (en) Image display apparatus, and method for operating the same
KR20140063272A (en) Image display apparatus and method for operating the same
KR20150116302A (en) Image display apparatus, server and method for operating the same
KR20140038799A (en) Image display apparatus, server and method for operating the same
KR101855939B1 (en) Method for operating an Image display apparatus
US20130229409A1 (en) Image processing method and image display device according to the method
KR20150044732A (en) Stereoscopic image display apparatus in glassless mode and method for operating the same
KR20140061098A (en) Image display apparatus and method for operating the same
KR20130033815A (en) Image display apparatus, and method for operating the same
KR20130137927A (en) Image display apparatus, and method for operating the same
KR101832225B1 (en) Image display apparatus, and method for operating the same
KR20130026236A (en) Image display apparatus, and method for operating the same
KR101912635B1 (en) Image display apparatus, and method for operating the same
KR101836846B1 (en) Image display apparatus, and method for operating the same
KR20140098512A (en) Image display apparatus, and method for operating the same
KR20130120255A (en) Image display apparatus, and method for operating the same
KR20140008188A (en) Image display apparatus, and method for operating the same
KR101825669B1 (en) Image display apparatus, and method for operating the same
KR20150043875A (en) Stereoscopic image display apparatus in glassless mode and method for operating the same
KR101878808B1 (en) Image display apparatus and method for operating the same
KR101945811B1 (en) Image display apparatus, and method for operating the same
KR20140073231A (en) Image display apparatus, and method for operating the same
KR101890323B1 (en) Image display apparatus, settop box and method for operating the same
KR102014149B1 (en) Image display apparatus, and method for operating the same
KR20140063275A (en) Image display apparatus and method for operating the same

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination