KR20140057058A - Image display apparatus, and method for operating the same - Google Patents

Info

Publication number
KR20140057058A
Authority
KR
South Korea
Prior art keywords
eye image
image
left eye
right eye
arrangement
Application number
KR1020120123702A
Other languages
Korean (ko)
Inventor
송병철
손지덕
이봉수
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020120123702A
Publication of KR20140057058A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 2013/0074: Stereoscopic image analysis
    • H04N 2013/0085: Motion estimation from stereoscopic image signals

Abstract

The present invention relates to an image display device and an operating method thereof. According to an embodiment of the present invention, the operating method of the image display device comprises: extracting a left eye image and a right eye image from an input signal; arranging and displaying the left eye image and the right eye image in a predetermined format; calculating parallax information using the left eye image and the right eye image; calculating motion information on either the left eye image or the right eye image; determining, based on the motion information and the parallax information, whether the left eye image and the right eye image are reversely arranged; and displaying an object indicating the determination result. As a result, user convenience can be improved.

Description

[0001] The present invention relates to an image display apparatus and a method of operating the same.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an image display apparatus and an operating method thereof, and more particularly, to an image display apparatus and an operating method thereof that can improve user convenience.

An image display apparatus is a device having a function of displaying an image that a user can view. The user can view broadcasts through the image display apparatus. The image display apparatus displays, on its display, a broadcast selected by the user from among the broadcast signals transmitted from broadcast stations. Currently, broadcasting is changing from analog broadcasting to digital broadcasting around the world.

Digital broadcasting refers to broadcasting in which digital video and audio signals are transmitted. Compared to analog broadcasting, digital broadcasting is robust to external noise, suffers less data loss, is advantageous for error correction, and provides a clear, high-resolution screen. Also, unlike analog broadcasting, digital broadcasting is capable of bidirectional services.

It is an object of the present invention to provide an image display apparatus and an operating method thereof that can improve user convenience.

Another object of the present invention is to provide an image display apparatus and an operating method thereof that allow a swap of the left eye image and the right eye image to be easily recognized during 3D image viewing.

According to an aspect of the present invention, there is provided a method of operating an image display apparatus, comprising: extracting a left eye image and a right eye image from an input signal; arranging and displaying the left eye image and the right eye image in a predetermined format; calculating parallax information using the left eye image and the right eye image; calculating motion information for either the left eye image or the right eye image; determining, based on the motion information and the parallax information, whether the arrangement of the left eye image and the right eye image is reversed; and displaying an object indicating the determination result.
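The claimed sequence of steps can be sketched in a few lines of code. This is only an illustrative toy, not the patent's implementation: the function names, the block-matching cost, and the disparity sign convention are all assumptions of mine, and plain lists stand in for images.

```python
def extract_views(side_by_side_frame):
    """Split a side-by-side frame (a list of pixel rows) into two views."""
    half = len(side_by_side_frame[0]) // 2
    left = [row[:half] for row in side_by_side_frame]
    right = [row[half:] for row in side_by_side_frame]
    return left, right

def row_disparity(left_row, right_row, max_shift=4):
    """Best horizontal shift aligning right_row to left_row (SAD cost,
    with a small |shift| regularizer to break ties toward small shifts)."""
    n = len(left_row)
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        cost = abs(s) + sum(abs(left_row[i] - right_row[i - s])
                            for i in range(max(0, s), min(n, n + s)))
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

def is_reversed(left, right):
    """Toy decision rule: if the disparities predominantly have the 'wrong'
    sign (an assumed convention), report the pair as likely swapped."""
    return sum(row_disparity(l, r) for l, r in zip(left, right)) < 0
```

A pair flagged by `is_reversed` would then trigger the on-screen warning object (or an automatic swap of the two views, per the second aspect below).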

According to another aspect of the present invention, there is provided a method of operating an image display apparatus, comprising: extracting a left eye image and a right eye image from an input signal; calculating parallax information using the left eye image and the right eye image; calculating motion information for either the left eye image or the right eye image; determining, based on the motion information and the parallax information, whether the arrangement of the left eye image and the right eye image is reversed; and, when the arrangement is reversed, exchanging the display positions of the left eye image and the right eye image.

According to another aspect of the present invention, there is provided an image display apparatus including: an image extracting unit for extracting a left eye image and a right eye image from an input signal; a display for displaying the left eye image and the right eye image arranged in a predetermined format; a parallax information operation unit for calculating parallax information using the left eye image and the right eye image; a motion information operation unit for calculating motion information for either the left eye image or the right eye image; and an image arrangement determining unit for determining, based on the motion information and the parallax information, whether the arrangement of the left eye image and the right eye image is reversed, the display further displaying an object indicating the determination result of the image arrangement determining unit.

According to the embodiments of the present invention, whether the arrangement of the left eye image and the right eye image is reversed is determined based on the motion information and the parallax information, and an object representing the determination result is displayed, so that a swap of the left eye image and the right eye image can be easily recognized during 3D image viewing. Accordingly, user convenience can be increased.

In particular, when, based on the parallax information, part of the boundary area around the foreground in the depth map is an occlusion area covered by the background area, or part of the boundary area around the background is an area covering the foreground area, it is determined that the arrangement of the left eye image and the right eye image is reversed. Thus a swap of the left eye image and the right eye image can be easily recognized during 3D image viewing, and user convenience can be increased.
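The occlusion-boundary idea above can be illustrated with a toy depth map. The labels, the occlusion mask, and the specific counting rule below are my own simplification of the stated principle (the foreground should occlude the background at its boundary, never the reverse), not the patent's actual test.

```python
FOREGROUND, BACKGROUND = 1, 0

def boundary_violations(depth_row, occluded):
    """Count positions where an occluded pixel sits on a foreground/background
    boundary but is attributed to the background, i.e. the background appears
    to 'cover' the foreground (depth_row holds FOREGROUND/BACKGROUND labels,
    occluded is a parallel list of booleans)."""
    violations = 0
    for i in range(1, len(depth_row) - 1):
        if not occluded[i]:
            continue
        # an occluded pixel flanked by foreground on one side, background on the other
        if {depth_row[i - 1], depth_row[i + 1]} == {FOREGROUND, BACKGROUND}:
            if depth_row[i] == BACKGROUND:
                violations += 1
    return violations

def looks_reversed(depth_map, occlusion_mask, threshold=1):
    """Flag the stereo pair as likely swapped once enough violations accumulate."""
    total = sum(boundary_violations(row, mask)
                for row, mask in zip(depth_map, occlusion_mask))
    return total >= threshold
```

With a physically consistent pair no such violations occur, the count stays at zero, and the arrangement is accepted.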

In particular, it is possible to determine whether the left and right sides of a stereo image have been swapped using only the information of the input image, and to present a guide to the user. Therefore, the user can avoid viewing images whose left and right sides are swapped for a long time.

FIG. 1 is a view showing the appearance of an image display apparatus of the present invention.
FIG. 2 is an internal block diagram of an image display apparatus according to an embodiment of the present invention.
FIG. 3 is an internal block diagram of the control unit of FIG. 2.
FIG. 4 is a diagram showing various formats of a 3D image.
FIG. 5 is a diagram showing the operation of a viewing apparatus according to the formats of FIG. 4.
FIG. 6 is a diagram illustrating various scaling methods of a 3D image signal according to an embodiment of the present invention.
FIG. 7 is a view for explaining how images are formed by a left eye image and a right eye image.
FIG. 8 is a view for explaining the depth of a 3D image according to the interval between a left eye image and a right eye image.
FIG. 9 is a view illustrating 3D image display of a polarization method and of a shutter method.
FIG. 10 is a diagram illustrating the reversal of the left eye image and the right eye image in the polarization method and the shutter method.
FIG. 11 is a flowchart illustrating an operating method of an image display apparatus according to an embodiment of the present invention.
FIGS. 12 to 24B are diagrams for explaining various examples of the operating method of FIG. 11.

Hereinafter, the present invention will be described in detail with reference to the drawings.

The suffixes "module" and "part" for components used in the following description are given merely for convenience of description and do not carry special significance or roles in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 is a view showing the appearance of an image display apparatus of the present invention.

Referring to FIG. 1, an image display apparatus 100 according to an embodiment of the present invention may be a fixed image display apparatus or a mobile image display apparatus.

According to the embodiment of the present invention, the image display apparatus 100 can perform signal processing of a 3D image. For example, when a 3D image input to the image display apparatus 100 is composed of a plurality of viewpoint images, a left eye image and a right eye image may be signal-processed, arranged in a predetermined format, and displayed as a 3D image according to that format.

Meanwhile, the image display apparatus 100 described in the present specification may include a TV receiver, a monitor, a projector, a notebook computer, a digital broadcasting terminal, a mobile phone, a smartphone, and a tablet PC.

FIG. 2 is an internal block diagram of an image display apparatus according to an embodiment of the present invention.

Referring to FIG. 2, an image display apparatus 100 according to an embodiment of the present invention includes a broadcast receiving unit 105, an external device interface unit 130, a storage unit 140, a user input interface unit 150, a sensor unit (not shown), a control unit 170, a display 180, an audio output unit 185, and a viewing apparatus 195.

The broadcast receiving unit 105 may include a tuner unit 110, a demodulation unit 120, and a network interface unit 135. Of course, as necessary, the broadcast receiving unit 105 may be designed to include the tuner unit 110 and the demodulation unit 120 without the network interface unit 135, or conversely to include the network interface unit 135 without the tuner unit 110 and the demodulation unit 120.

The tuner unit 110 selects an RF broadcast signal corresponding to a channel selected by the user, or the RF broadcast signals of all pre-stored channels, from among the RF (Radio Frequency) broadcast signals received through the antenna 50, and converts the selected RF broadcast signal into an intermediate frequency signal or a baseband video or audio signal.

For example, if the selected RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF). If the selected RF broadcast signal is an analog broadcast signal, it is converted into an analog baseband image or voice signal (CVBS / SIF). That is, the tuner unit 110 can process a digital broadcast signal or an analog broadcast signal. The analog baseband video or audio signal (CVBS / SIF) output from the tuner unit 110 can be directly input to the controller 170.

The tuner unit 110 may receive an RF broadcast signal of a single carrier according to an Advanced Television System Committee (ATSC) scheme or an RF broadcast signal of a plurality of carriers according to a DVB (Digital Video Broadcasting) scheme.

Meanwhile, the tuner unit 110 may sequentially select the RF broadcast signals of all broadcast channels stored through a channel memory function from among the RF broadcast signals received through the antenna, and convert them into intermediate frequency signals or baseband video or audio signals.

On the other hand, the tuner unit 110 can include a plurality of tuners in order to receive broadcast signals of a plurality of channels. Alternatively, a single tuner that simultaneously receives broadcast signals of a plurality of channels is also possible.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

The demodulation unit 120 may perform demodulation and channel decoding, and then output a stream signal TS. At this time, the stream signal may be a signal in which a video signal, a voice signal, or a data signal is multiplexed.

The stream signal output from the demodulation unit 120 may be input to the controller 170. The control unit 170 performs demultiplexing, video / audio signal processing, and the like, and then outputs an image to the display 180 and outputs audio to the audio output unit 185.

The external device interface unit 130 can transmit or receive data to or from a connected external device 190. To this end, the external device interface unit 130 may include an A/V input/output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 130 can be connected, by wire or wirelessly, to an external device such as a DVD (Digital Versatile Disk) player, a Blu-ray player, a game device, a camera, a camcorder, or a computer (notebook computer), and may perform input/output operations with the external device.

The A / V input / output unit can receive video and audio signals from an external device. Meanwhile, the wireless communication unit can perform short-range wireless communication with other electronic devices.

The network interface unit 135 provides an interface for connecting the image display apparatus 100 to a wired/wireless network including the Internet. For example, the network interface unit 135 can receive, via the network, content or data provided by an Internet provider, a content provider, or a network operator.

The storage unit 140 may store a program for each signal processing and control in the control unit 170 or may store the processed video, audio, or data signals.

In addition, the storage unit 140 may temporarily store video, audio, or data signals input to the external device interface unit 130. The storage unit 140 may also store information on predetermined broadcast channels through a channel memory function such as a channel map.

Although the storage unit 140 of FIG. 2 is provided separately from the control unit 170, the scope of the present invention is not limited thereto. The storage unit 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the control unit 170 or a signal from the control unit 170 to the user.

For example, the user input interface unit 150 may receive and process a user input signal, such as power on/off, channel selection, or screen setting, from the remote control apparatus 200; transmit a user input signal from a local key (not shown), such as a power key, a channel key, a volume key, or a setting key, to the control unit 170; transmit a user input signal from a sensor unit (not shown) that senses a user's gesture to the control unit 170; or transmit a signal from the control unit 170 to the sensor unit (not shown).

The control unit 170 may demultiplex the stream input through the tuner unit 110, the demodulation unit 120, or the external device interface unit 130, and process the demultiplexed signals to generate and output signals for video or audio output.

The video signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the video signal. Also, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

The audio signal processed by the control unit 170 may be output to the audio output unit 185. The audio signal processed by the control unit 170 may also be input to an external output device through the external device interface unit 130.

Although not shown in FIG. 2, the control unit 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 3.

In addition, the control unit 170 can control the overall operation of the image display apparatus 100. For example, the control unit 170 may control the tuner unit 110 to tune to the RF broadcast corresponding to the channel selected by the user or a previously stored channel.

In addition, the controller 170 may control the image display apparatus 100 according to a user command or an internal program input through the user input interface unit 150.

Meanwhile, the control unit 170 may control the display 180 to display an image. At this time, the image displayed on the display 180 may be a still image or a moving image, and may be a 2D image or a 3D image.

Meanwhile, the controller 170 may generate a 3D object for a predetermined 2D object among the images displayed on the display 180, and display the 3D object. For example, the object may be at least one of a connected web screen (newspaper, magazine, etc.), EPG (Electronic Program Guide), various menus, widgets, icons, still images, moving images, and text.

Such a 3D object may be processed to have a depth different from that of the image displayed on the display 180. Preferably, the 3D object may be processed to appear to protrude relative to the image displayed on the display 180.

On the other hand, the control unit 170 can recognize the position of the user based on an image photographed by the photographing unit (not shown). For example, the distance (z-axis coordinate) between the user and the image display apparatus 100 can be determined, as can the x-axis and y-axis coordinates on the display 180 corresponding to the user's position.

Although not shown in the drawing, a channel browsing processing unit for generating a thumbnail image corresponding to a channel signal or an external input signal may be further provided. The channel browsing processing unit may receive the stream signal (TS) output from the demodulation unit 120 or the stream signal output from the external device interface unit 130, and extract an image from the input stream signal to generate a thumbnail image. The generated thumbnail image may be input to the control unit 170, which may display a thumbnail list including a plurality of thumbnail images on the display 180 using the input thumbnail images.

At this time, the thumbnail list may be displayed in a simple view mode, occupying a partial area while a predetermined image is displayed on the display 180, or in a full view mode, occupying most of the display 180. The thumbnail images in the thumbnail list can be updated sequentially.

The display 180 converts the video signal, data signal, OSD signal, and control signal processed by the control unit 170, or the video signal, data signal, and control signal received from the external device interface unit 130, to generate a drive signal.

The display 180 may be a PDP, an LCD, an OLED, a flexible display, or the like, and may also be capable of displaying 3D images.

For viewing a stereoscopic image, the display 180 may use either an additional display method or a single display method.

The single display method implements a 3D image solely on the display 180, without a separate additional display such as glasses; various methods such as a lenticular method and a parallax barrier method can be applied.

In addition, the additional display method implements a 3D image using an additional display, namely the viewing apparatus 195, in addition to the display 180; various methods such as a head-mounted display (HMD) type and a glasses type can be applied.

On the other hand, the glasses type can be further divided into a passive type such as polarized glasses and an active type such as shutter glasses. The head-mounted display type can likewise be divided into passive and active types.

On the other hand, the viewing apparatus 195 may be 3D glasses enabling stereoscopic viewing. The 3D glasses 195 may include passive polarized glasses or active shutter glasses, and are described here as a concept also encompassing the head-mounted type mentioned above.

For example, when the viewing apparatus 195 is of the polarized glasses type, the left eye glass may be implemented as a left eye polarizing glass and the right eye glass as a right eye polarizing glass. In this case, the display 180 may include a polarization filter, for example, a film-type patterned retarder (FPR).

As another example, when the viewing apparatus 195 is a shutter glass, the left eye glass and the right eye glass can be alternately opened and closed.

Meanwhile, the display 180 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 185 receives the audio signal processed by the control unit 170 and outputs it as sound.

A photographing unit (not shown) photographs the user. The photographing unit (not shown) may be implemented by a single camera, but the present invention is not limited thereto, and may be implemented by a plurality of cameras. On the other hand, the photographing unit (not shown) may be embedded in the image display device 100 on the upper side of the display 180 or may be disposed separately. The image information photographed by the photographing unit (not shown) may be input to the control unit 170.

The control unit 170 can detect the gesture of the user based on each of the images photographed from the photographing unit (not shown) or the signals sensed from the sensor unit (not shown) or a combination thereof.

The remote control apparatus 200 transmits user input to the user input interface unit 150. To this end, the remote control apparatus 200 may use Bluetooth, RF (radio frequency) communication, infrared (IR) communication, UWB (Ultra Wideband), ZigBee, or the like. The remote control apparatus 200 may also receive the video, audio, or data signal output from the user input interface unit 150 and display it or output it as sound.

Meanwhile, the block diagram of the image display apparatus 100 shown in FIG. 2 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 as actually implemented. That is, two or more components may be combined into one component, or one component may be subdivided into two or more components, as necessary. The functions performed by each block are for the purpose of describing the embodiments of the present invention, and the specific operations and devices do not limit the scope of the present invention.

Unlike the configuration shown in FIG. 2, the image display apparatus 100 may omit the tuner unit 110 and the demodulation unit 120, and may instead receive and play back video content through the network interface unit 135 or the external device interface unit 130.

On the other hand, the image display apparatus 100 is an example of an image signal processing apparatus that performs signal processing on an image stored in the apparatus or an input image. Other examples of the image signal processing apparatus include a set-top box without the display 180 and the audio output unit 185 shown in FIG. 2, a DVD player, a Blu-ray player, a game machine, and a computer.

FIG. 3 is an internal block diagram of the control unit of FIG. 2, FIG. 4 is a diagram illustrating various formats of a 3D image, and FIG. 5 is a diagram illustrating the operation of a viewing apparatus according to the formats of FIG. 4.

Referring to FIG. 3, the control unit 170 according to an embodiment of the present invention may include a demultiplexing unit 310, an image processing unit 320, a processor 330, an OSD generating unit 340, a mixer 345, a frame rate conversion unit 350, and a formatter 360. The control unit 170 may further include an audio processing unit (not shown) and a data processing unit (not shown).

The demultiplexing unit 310 demultiplexes an input stream. For example, when an MPEG-2 TS is input, it can be demultiplexed into video, audio, and data signals. The stream signal input to the demultiplexing unit 310 may be a stream signal output from the tuner unit 110, the demodulation unit 120, or the external device interface unit 130.

The image processing unit 320 may perform image processing of the demultiplexed image signal. For this, the image processing unit 320 may include a video decoder 325 and a scaler 335.

The video decoder 325 decodes the demultiplexed video signal and the scaler 335 performs scaling so that the resolution of the decoded video signal can be output from the display 180.

The video decoder 325 can include a decoder of various standards.

On the other hand, the image signal decoded by the image processing unit 320 can be divided into a case where there is only a 2D image signal, a case where a 2D image signal and a 3D image signal are mixed, and a case where there is only a 3D image signal.

For example, an external video signal input from the external device 190, or a broadcast video signal received from the tuner unit 110, may contain only a 2D video signal, a mixture of a 2D video signal and a 3D video signal, or only a 3D video signal. Accordingly, the control unit 170, in particular the image processing unit 320, may process the signal and output a 2D video signal, a mixed signal of 2D and 3D video signals, or a 3D video signal.

Meanwhile, the image signal decoded by the image processing unit 320 may be a 3D image signal in various formats, for example, a 3D image signal composed of a color image and a depth image, or a 3D image signal composed of a plurality of viewpoint image signals. The plurality of viewpoint image signals may include, for example, a left eye image signal and a right eye image signal.

Referring to FIG. 4, the formats of the 3D video signal include a side-by-side format (FIG. 4(a)) in which the left eye image signal (L) and the right eye image signal (R) are arranged left and right, a top/down format (FIG. 4(b)) in which they are arranged up and down, a frame sequential format (FIG. 4(c)) in which they are arranged in a time-division manner, an interlaced format (FIG. 4(d)) in which the left eye image signal and the right eye image signal are mixed line by line, and a checker box format (FIG. 4(e)) in which the left eye image signal and the right eye image signal are mixed box by box.
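The frame-compatible packings that FIG. 4 enumerates can be demonstrated with plain 2-D lists standing in for images. The helper names are my own, not from the patent:

```python
def side_by_side(left, right):          # FIG. 4(a): L | R, rows joined horizontally
    return [l + r for l, r in zip(left, right)]

def top_down(left, right):              # FIG. 4(b): L stacked above R
    return left + right

def frame_sequential(left, right):      # FIG. 4(c): alternate whole frames in time
    return [left, right]

def interlaced(left, right):            # FIG. 4(d): alternate lines, L on even rows
    return [l if i % 2 == 0 else r
            for i, (l, r) in enumerate(zip(left, right))]

def checker_box(left, right):           # FIG. 4(e): alternate pixels in a checkerboard
    return [[l if (i + j) % 2 == 0 else r
             for j, (l, r) in enumerate(zip(row_l, row_r))]
            for i, (row_l, row_r) in enumerate(zip(left, right))]
```

A real formatter would also account for resolution: side-by-side, for instance, typically halves the horizontal resolution of each view to fit both into one frame.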

The processor 330 may control the overall operation of the image display apparatus 100 or of the control unit 170. For example, the processor 330 may control the tuner unit 110 to select the RF broadcast corresponding to a channel selected by the user or a previously stored channel.

In addition, the processor 330 may control the image display apparatus 100 according to a user command input through the user input interface unit 150 or according to an internal program.

In addition, the processor 330 may perform data transfer control with the network interface unit 135 or the external device interface unit 130.

The processor 330 may control operations of the demultiplexing unit 310, the image processing unit 320, the OSD generating unit 340, and the like in the controller 170.

The OSD generation unit 340 generates an OSD signal according to a user input or by itself. For example, based on a user input signal, a signal for displaying various information in a graphic or text form on the screen of the display 180 can be generated. The generated OSD signal may include various data such as a user interface screen of the video display device 100, various menu screens, a widget, and an icon. In addition, the generated OSD signal may include a 2D object or a 3D object.

The OSD generating unit 340 can generate a pointer to be displayed on the display based on a pointing signal input from the remote control apparatus 200. In particular, such a pointer may be generated by a pointing signal processing unit (not shown), which may be included in the OSD generating unit 340 or provided separately from it.

The mixer 345 may mix the OSD signal generated by the OSD generator 340 and the decoded video signal processed by the image processor 320. At this time, the OSD signal and the decoded video signal may include at least one of a 2D signal and a 3D signal. The mixed video signal is supplied to a frame rate converter 350.

A frame rate converter (FRC) 350 can convert the frame rate of an input image. Alternatively, the frame rate converter 350 can output the input image without frame rate conversion.
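Frame rate conversion can be illustrated minimally by frame repetition; this is only a sketch of the rate change itself, since real FRCs typically use motion-compensated interpolation rather than repetition:

```python
def repeat_frames(frames, factor):
    """Naive frame rate conversion: e.g. 60 Hz -> 120 Hz with factor=2
    by repeating each input frame in place, preserving order."""
    return [frame for frame in frames for _ in range(factor)]
```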

The formatter 360 may arrange the left eye image frame and the right eye image frame of the frame-rate-converted 3D image, and may output a synchronization signal (Vsync) for opening the left eye glass and the right eye glass of the 3D viewing apparatus 195.

The formatter 360 receives the mixed signal, i.e., the OSD signal and the decoded video signal, from the mixer 345, and separates the 2D video signal and the 3D video signal.

In the present specification, a 3D video signal means a signal forming a 3D object. Examples of the 3D object include a picture-in-picture (PIP) image (still image or moving picture), an EPG indicating broadcast program information, various menus, widgets, icons, text, objects in an image, people, backgrounds, and web screens (newspapers, magazines, etc.).

On the other hand, the formatter 360 can change the format of the 3D video signal, for example, to any one of the various formats illustrated in FIG. 4. Accordingly, the glasses-type viewing apparatus can operate according to the chosen format, as shown in FIG. 5.

FIG. 5(a) illustrates the operation of the 3D glasses 195, in particular the shutter glasses, when the formatter 360 arranges and outputs frames in the frame sequential format among the formats of FIG. 4.

That is, when the left eye image (L) is displayed on the display 180, the left eye glass of the shutter glasses 195 is opened and the right eye glass is closed; when the right eye image (R) is displayed, the left eye glass is closed and the right eye glass is opened.
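The alternation just described maps each displayed frame to a pair of shutter states, which can be sketched in one line (the 'L'/'R' tags and the tuple layout are my own convention, not part of the patent):

```python
def shutter_schedule(frame_tags):
    """Map a sequence of 'L'/'R' frame tags to (left_open, right_open)
    shutter states: only the shutter of the matching eye is open."""
    return [(tag == 'L', tag == 'R') for tag in frame_tags]
```

In a real system this schedule is driven by the Vsync signal the formatter outputs, so that the shutters stay locked to the display refresh.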

Meanwhile, FIG. 5(b) illustrates the operation of the 3D glasses 195, in particular polarized glasses, when the formatter 360 arranges and outputs frames in the side-by-side format among the formats of FIG. 4. The 3D glasses applied in FIG. 5(b) may instead be shutter glasses; in that case, keeping both the left eye glass and the right eye glass open allows the shutter glasses to operate like polarized glasses.

Meanwhile, the formatter 360 may convert a 2D video signal into a 3D video signal. For example, according to a 3D image generation algorithm, an edge or a selectable object may be detected in the 2D image signal, and the object according to the detected edge, or the selectable object, may be separated to generate a 3D image signal. The generated 3D image signal may then be separated into a left eye image signal (L) and a right eye image signal (R), as described above.
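The idea of synthesizing a second view can be sketched with a depth-image-based-rendering toy. Unlike the edge-based algorithm mentioned above (whose details the text does not give), this sketch assumes a per-pixel depth map is already available, shifts pixels horizontally in proportion to depth, and leaves the resulting disocclusion holes unfilled; real converters would also inpaint those holes.

```python
def synthesize_right_view(image, depth, gain=1):
    """image, depth: 2-D lists of equal shape; larger depth -> larger shift.
    Returns the synthesized right view, with None marking disocclusion holes."""
    h, w = len(image), len(image[0])
    right = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            nx = x - gain * depth[y][x]      # near pixels shift further left
            if 0 <= nx < w:
                right[y][nx] = image[y][x]   # later (nearer) writes may overwrite
    return right
```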

Although not shown in the drawing, a 3D processor (not shown) for three-dimensional effect signal processing may be further disposed after the formatter 360. The 3D processor (not shown) can process the brightness, tint, and color of the image signal to improve the 3D effect; for example, it can sharpen the near field and blur the far field. The functions of such a 3D processor can be merged into the formatter 360 or into the image processing unit 320, as will be described later with reference to FIG. 6.

Meanwhile, the audio processing unit (not shown) in the control unit 170 can perform the audio processing of the demultiplexed audio signal. To this end, the audio processing unit (not shown) may include various decoders.

In addition, the audio processing unit (not shown) in the control unit 170 can perform bass, treble, and volume adjustment, and the like.

The data processing unit (not shown) in the control unit 170 can perform data processing of the demultiplexed data signal. For example, if the demultiplexed data signal is a coded data signal, it can be decoded. The encoded data signal may be EPG (Electronic Program Guide) information including broadcast information such as a start time and an end time of a broadcast program broadcasted on each channel.

In FIG. 3, the signals from the OSD generating unit 340 and the image processing unit 320 are mixed in the mixer 345 and then 3D-processed in the formatter 360. However, the present invention is not limited thereto; the mixer may be located behind the formatter. That is, the output of the image processing unit 320 may be 3D-processed by the formatter 360, the OSD generating unit 340 may perform 3D processing together with OSD generation, and the processed 3D signals may then be mixed by the mixer 345.

Meanwhile, the block diagram of the controller 170 shown in FIG. 3 is a block diagram for an embodiment of the present invention. Each component of the block diagram can be integrated, added, or omitted according to the specifications of the control unit 170 actually implemented.

In particular, the frame rate converter 350 and the formatter 360 may not be provided in the control unit 170, but may each be provided separately.

FIG. 6 is a diagram illustrating various scaling methods of a 3D image signal according to an embodiment of the present invention.

Referring to the drawing, in order to increase the 3-dimensional effect, the controller 170 may perform 3D effect signal processing. In particular, it is possible to perform a size adjustment or a tilt adjustment of a 3D object in a 3D image.

The 3D image signal, or the 3D object 510 in the 3D image signal, can be enlarged or reduced as a whole at a certain ratio (512), as shown in FIG. 6(a). As shown in FIGS. 6(b) and 6(c), the 3D object may be partially enlarged or reduced (trapezoidal shapes 514 and 516), and as shown in FIG. 6(d), at least a portion of the 3D object may be rotated (parallelogram shape 518). Such scaling or skew adjustment can emphasize the stereoscopic effect of the 3D object in the 3D image, that is, the 3D effect.

On the other hand, as shown in FIG. 6(b) or 6(c), the larger the slope, the larger the difference in length between the parallel sides of the trapezoidal shapes 514 and 516 becomes, and the more the 3D effect is emphasized.

The size adjustment or tilt adjustment may be performed after the 3D image signal is arranged in a predetermined format in the formatter 360, or may be performed in the scaler 335 in the image processing unit 320. On the other hand, the OSD generating unit 340 may generate objects in the shapes illustrated in FIG. 6 to enhance the 3D effect of the generated OSD.

Although not shown in the drawing, signal processing for the 3D effect may also be performed by adjusting the brightness, tint, and color of the image signal or object. For example, it is possible to perform signal processing such as making a near area clear and a far area blurred. The signal processing for the 3D effect may be performed in the control unit 170, or may be performed through a separate 3D processor. In particular, when it is performed in the control unit 170, it can be performed in the formatter 360 or in the image processing unit 320, together with the above-described size adjustment or tilt adjustment.

FIG. 7 is a view for explaining how images are formed by a left eye image and a right eye image, and FIG. 8 is a view for explaining depths of a 3D image according to an interval between a left eye image and a right eye image.

First, referring to FIG. 7, a plurality of images or a plurality of objects 615, 625, 635, and 645 are illustrated.

First, the first object 615 includes a first left eye image 611 (L) based on a first left eye image signal and a first right eye image 613 (R) based on a first right eye image signal, and the interval between the first left eye image 611 and the first right eye image 613 on the display 180 is exemplified as d1. At this time, the user perceives an image as being formed at the intersection of an extension line connecting the left eye 601 and the first left eye image 611 and an extension line connecting the right eye 603 and the first right eye image 613. Accordingly, the user perceives the first object 615 as being positioned behind the display 180.

Next, since the second object 625 includes a second left eye image 621 (L) and a second right eye image 623 (R) that are displayed overlapping each other on the display 180, the interval between them is 0. Accordingly, the user perceives the second object 625 as being located on the display 180.

Next, the third object 635 includes a third left eye image 631 (L) and a third right eye image 633 (R), and the fourth object 645 includes a fourth left eye image 641 (L) and a fourth right eye image 643 (R), with intervals of d3 and d4, respectively.

According to the above-described method, the user perceives the third object 635 and the fourth object 645 at the respective positions where the images are formed, that is, as being located in front of the display 180 in the drawing.

At this time, the fourth object 645 is perceived as protruding further toward the user than the third object 635. This is because the interval d4 between the fourth left eye image 641 and the fourth right eye image 643 is larger than the interval d3 between the third left eye image 631 and the third right eye image 633.

Meanwhile, in the embodiment of the present invention, the distance between the display 180 and the objects 615, 625, 635, and 645 as perceived by the user is expressed as a depth. The depth is assumed to have a negative value (−) when an object is perceived as being positioned behind the display 180, and a positive value (+) when an object is perceived as being positioned in front of the display 180. That is, the greater the degree of protrusion toward the user, the greater the depth.

Referring to FIG. 8, the interval a between the left eye image 701 and the right eye image 702 in FIG. 8(a) is smaller than the interval b between the left eye image 701 and the right eye image 702 in FIG. 8(b); accordingly, the depth a′ of the 3D object in FIG. 8(a) is smaller than the depth b′ of the 3D object in FIG. 8(b).

In this way, when a 3D image is composed of a left eye image and a right eye image, the position at which the image is perceived to be formed differs according to the interval between the left eye image and the right eye image. Accordingly, by adjusting the display interval between the left eye image and the right eye image, the depth of a 3D image or a 3D object composed of a left eye image and a right eye image can be adjusted.
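For reference, the geometry behind this relationship can be sketched with similar triangles. This derivation is an illustrative addition rather than part of the disclosure; e denotes the interocular distance, D the viewing distance, d the on-screen interval between the left eye image and the right eye image, and z the perceived depth in front of the screen:

```latex
% Rays from the two eyes through the perceived point meet the screen
% at points separated by d; similar triangles give
\frac{d}{e} = \frac{z}{D - z}
\qquad\Longrightarrow\qquad
z = \frac{D\,d}{e + d}
```

The perceived depth z thus increases monotonically with the interval d, consistent with a′ < b′ in FIG. 8.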

FIG. 9 is a view illustrating a 3D image display of a polarization method and a 3D image display of a shutter method.

First, FIG. 9(a) illustrates 3D image display of the polarization method. The left eye image and the right eye image can be displayed simultaneously on the image display apparatus 100, and the left eye image and the right eye image are output to the outside through different polarization filters, for example, an FPR (Film Patterned Retarder).

The user wears the polarizing glass 195 to watch the left eye image with the left eye glass 195a and the right eye image with the right eye glass 195b.

In the drawing, it is exemplified that the left eye image L1 and the right eye image R1 are displayed at T1, the left eye image L2 and the right eye image R2 at T2, the left eye image L3 and the right eye image R3 at T3, and the left eye image L4 and the right eye image R4 at T4.

Next, FIG. 9 (b) illustrates 3D-image display of the shutter system. In the image display apparatus 100, the left eye image and the right eye image are sequentially displayed.

The user wears the shutter glass 195 and sequentially watches the left eye image and the right eye image by sequentially opening and closing the left eye glass 195a and the right eye glass 195b.

In the drawing, it is exemplified that the left eye image L1 is displayed at T1, the right eye image R1 at T2, the left eye image L2 at T3, and the right eye image R2 at T4.

FIG. 10 is a diagram illustrating the reversal of the left eye image and the right eye image in the polarization method and the shutter method.

FIG. 10A illustrates that, during polarization-method 3D image display, the left eye image and the right eye image are displayed properly during Ta, Tc, and Td, but are reversed during Tb. In such a case, the user may feel dizzy, or the stereoscopic effect may be degraded, when viewing the displayed image.

FIG. 10B illustrates that, during shutter-method 3D image display, the left eye image and the right eye image are displayed properly during Ta and Tb, but are reversed during Tc and Td. In such a case, the user may feel dizzy, or the stereoscopic effect may be degraded, when viewing the displayed image.

In particular, when watching 3D images, since no guide is presented, the user may ignore the slight discomfort and continue watching; if this state continues, fatigue sets in easily. In addition, checking the images by swapping left and right whenever the user feels uncomfortable interferes with smooth viewing and is inefficient.

In the embodiment of the present invention, when the left eye image and the right eye image are reversed in the polarization method, the shutter method, or the like, this is determined and an object indicating the determination result is displayed. This will be described in detail with reference to FIG. 11 and the following figures.

FIG. 11 is a flowchart illustrating an operation method of an image display apparatus according to an embodiment of the present invention, and FIGS. 12 to 24B are referred to for explaining various examples of an operation method of the image display apparatus of FIG.

First, the control unit 170 of the image display apparatus 100 receives an image. The received image may be a broadcast image received through the broadcast receiving unit 105, an external input image received through the external device interface unit 130 or the network interface unit 135, or a stored image received through the storage unit 140.

Next, the image extracting unit 1210 of the image display apparatus 100 extracts the left eye image and the right eye image from the input image (S1110). For example, the image extracting unit 1210 may be provided in the formatter 360. As shown in FIG. 3, the formatter 360 may be provided in the control unit 170, or may be provided separately from the control unit 170. In the following, it is assumed that the formatter 360 is provided in the control unit 170.

The image extracting unit 1210 can receive an image, particularly a 3D image. The 3D image may be divided into a left eye image and a right eye image, or a color difference image and a depth image.

The image extracting unit 1210 can extract the left eye image and the right eye image from the received multiple view image. Alternatively, the image extracting unit 1210 can extract the left eye image and the right eye image using the inputted color difference image and the depth image.

On the other hand, if the input image is a 2D image, the image extracting unit 1210 can convert the input image into a 3D image. The image extracting unit 1210 can extract the left eye image and the right eye image from the converted 3D image.

FIG. 13 illustrates the left eye image 1310 and the right eye image 1315 extracted by the image extracting unit 1210.

The left eye image 1310 and the right eye image 1315 may have a common background image, and may include objects 1312 and 1317 whose positions differ from each other.

Next, the image display apparatus 100 arranges and displays the extracted left eye image and right eye image in a predetermined format (S1115).

For example, the image arrangement unit 1260 can arrange the extracted left eye image and right eye image in the interlace format of FIG. 4 (d) in the case of the polarization method. Then, the display 180 can simultaneously display the left eye image and the right eye image as shown in Fig. 9 (a).

As another example, in the case of the shutter method, the image arrangement unit 1260 can arrange the extracted left eye image and right eye image in the frame sequential format of FIG. 4(c). Then, the display 180 can sequentially display the left eye image and the right eye image, as shown in FIG. 9(b).

Thus, the user can perceive a stereoscopic effect while watching the left eye image and the right eye image.

Alternatively, step 1115 (S1115) may be performed between step 1130 (S1130) and step 1135 (S1135). That is, after determining whether the arrangement of the left eye image and the right eye image is correct, it is possible to display the left eye image and the right eye image.

Next, the image display apparatus 100 calculates parallax information using the extracted left eye image and right eye image (S1120).

The parallax information calculation unit 1220 receives the left eye image and the right eye image from the image extraction unit 1210, and calculates parallax information (disparity) between the left eye image and the right eye image.

For example, assuming that there is no vertical difference between the left eye image and the right eye image, the parallax information calculation unit 1220 can calculate the horizontal distance value of a corresponding pixel or block between the left eye image and the right eye image as the parallax information. Here, the closer the parallax information is to −, the farther behind the object is located, and the closer it is to +, the farther in front the object is located.

As described above, the parallax information calculation between the left eye image and the right eye image can be referred to as stereo matching.
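As an illustrative sketch of such stereo matching (not the patent's exact implementation; the block size, search range, and SAD cost are assumptions), a per-block horizontal disparity can be computed as follows:

```python
import numpy as np

def block_disparity(left, right, block=8, max_disp=32):
    """Per-block horizontal disparity between two grayscale images via
    SAD block matching: for each block of the left image, find the
    horizontal shift of the right image that minimizes the luminance
    difference (a sketch, not the patent's exact implementation)."""
    h, w = left.shape
    disp = np.zeros((h // block, w // block), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = left[y:y + block, x:x + block].astype(int)
            best_sad, best_d = None, 0
            for d in range(-max_disp, max_disp + 1):
                if x + d < 0 or x + d + block > w:
                    continue  # candidate block would fall outside the image
                cand = right[y:y + block, x + d:x + d + block].astype(int)
                sad = np.abs(ref - cand).sum()  # sum of absolute differences
                if best_sad is None or sad < best_sad:
                    best_sad, best_d = sad, d
            disp[by, bx] = best_d
    return disp
```

The sign convention, and its mapping to the −/+ depth levels described above, is a design choice rather than something fixed by the disclosure.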

The parallax information calculation unit 1220 can generate a depth map of the left eye image and the right eye image based on the calculated parallax information.

The depth map is derived from the per-object parallax information, and can be generated using only luminance information, without color information. That is, the higher the luminance level, the more the object protrudes, and the lower the luminance level, the more the object recedes.
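A minimal sketch of rendering such a depth map as luminance only (assuming the −127 to 128 disparity levels mentioned later in this description):

```python
import numpy as np

def disparity_to_depth_map(disp):
    """Render a disparity map as a luminance-only depth map, assuming
    disparity levels in -127..128: brighter pixels protrude more,
    darker pixels recede more."""
    d = np.clip(np.asarray(disp, dtype=int), -127, 128)
    return (d + 127).astype(np.uint8)  # 0 = farthest, 255 = nearest
```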

FIG. 14 illustrates a depth map 1320 generated using the parallax information between the left eye image 1310 and the right eye image 1315 of FIG. 13.

On the other hand, based on the calculated parallax information, the parallax information calculation unit 1220 can separate objects whose parallax information is equal to or larger than a reference value into the foreground, and objects whose parallax information is less than the reference value into the background.

For example, when the parallax information is divided into levels from −127 to 128, the closer to +, the more the object protrudes; thus, level 10 can be set as the reference value. Objects exceeding level 10 can be separated into the foreground, and objects at level 10 or below into the background. On the other hand, the reference value can be set at various levels; for example, it can be set to level 0.
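The foreground/background separation described above can be sketched as follows (the level-10 reference value is taken from the example; everything else is an assumption):

```python
import numpy as np

def split_foreground_background(depth_map, ref_level=10):
    """Separate a parallax/depth map into foreground and background masks.
    Objects above the reference level are foreground, objects at or below
    it are background (level 10 as in the example; any level such as 0
    could be used instead)."""
    depth_map = np.asarray(depth_map)
    foreground = depth_map > ref_level
    return foreground, ~foreground
```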

FIG. 15 illustrates the separation of the foreground and the background.

FIG. 15(a) illustrates the extraction of the background 1510, that is, the area excluding the automobile object, from either the left eye image 1310 or the right eye image 1315 of FIG. 13, and FIG. 15(b) illustrates the extraction of the automobile object as the foreground 1520 from either the left eye image 1310 or the right eye image 1315 of FIG. 13.

Next, the image display apparatus 100 calculates motion information for either the left eye image or the right eye image (S1125).

The motion information calculation unit 1230 receives the left eye image and the right eye image from the image extraction unit 1210, and calculates motion information on any one of the left eye image and the right eye image. Here, the motion information may include, for each image frame, a motion vector.

The motion information calculation unit 1230 can estimate the position of a unit block across temporally preceding and succeeding frames. In particular, the motion information calculation unit 1230 can search for the block that minimizes the luminance difference between a specific block of the current frame and each candidate block within a constant search area of the previous frame.
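A sketch of this block-based motion search (the block size, search window, and SAD cost are illustrative assumptions, not the patent's exact parameters):

```python
import numpy as np

def motion_vector(prev, curr, y, x, block=8, search=4):
    """Motion vector of the block at (y, x) in the current frame, found by
    searching a (2*search+1)^2 window of the previous frame for the block
    that minimizes the luminance difference (SAD)."""
    h, w = prev.shape
    ref = curr[y:y + block, x:x + block].astype(int)
    best_sad, best_off = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            py, px = y + dy, x + dx
            if py < 0 or px < 0 or py + block > h or px + block > w:
                continue  # candidate block outside the previous frame
            cand = prev[py:py + block, px:px + block].astype(int)
            sad = np.abs(ref - cand).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_off = sad, (dy, dx)
    dy, dx = best_off
    return (-dy, -dx)  # displacement of the content from prev to curr
```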

FIG. 16 illustrates an example of motion information extraction by the motion information calculation unit 1230.

FIG. 16(a) shows a first image frame 1610 at time T1, and FIG. 16(b) shows a second image frame 1620 at time T2.

The motion information calculation unit 1230 can calculate the motion vector MV based on the positional difference between the first image frame 1610 and the second image frame 1620 of the object 1615 representing the automobile.

Next, the image display apparatus 100 determines the arrangement of the left eye image and the right eye image based on the motion information and the parallax information (S1130).

The image arrangement determination unit 1240 receives the left eye image and the right eye image from the image arrangement unit 1260 and determines whether the arrangement of the left eye image and the right eye image is correct.

To this end, the image arrangement determination unit 1240 can receive the motion information from the motion information calculation unit 1230 and the parallax information from the parallax information calculation unit 1220.

According to the embodiment of the present invention, the criterion for determining whether the arrangement of the left eye image and the right eye image is correct is based on the boundary area around the foreground and the boundary area around the background.

Specifically, the image arrangement determination unit 1240 determines, based on the motion information, that the arrangement of the left eye image and the right eye image is reversed when a part of the boundary area around the foreground is an occlusion area covered by the background area, or when a part of the boundary area around the background covers the foreground area.

As shown in FIG. 17(a), in a state in which the foreground 1715 and the background 1718 are separated in the depth map 1710, when the direction of the motion vector MV is from the foreground 1715 toward the background 1718, the boundary area of the background 1718 is obscured. The obscured area can be called an occlusion area.

If the arrangement of the left eye image and the right eye image is correct, the boundary area of the background 1718 is covered by the foreground 1715. If, however, the arrangement of the left eye image and the right eye image is reversed, a phenomenon occurs in which the boundary area of the foreground 1715 is covered by the background 1718.

In the embodiment of the present invention, in consideration of the foreground, the background, and the motion vector direction, it is determined whether the foreground or the background is obscured, and thereby whether the arrangement of the left eye image and the right eye image is reversed.

FIGS. 18 and 19 are drawings referred to for determining the arrangement of the left eye image and the right eye image.

FIG. 18 illustrates a case where the arrangement of the left eye image and the right eye image is correct.

FIG. 18(a) illustrates a case where, of the background 1810 and the foreground 1815, only the foreground 1815 moves. FIG. 18(b) illustrates that, when the motion vector is in the background direction from the foreground, the block B displayed in the background 1810 is covered by the movement of the foreground 1815.

FIG. 18(c) illustrates a case where, of the background 1820 and the foreground 1825, only the background 1820 moves. FIG. 18(d) illustrates that, when the motion vector is in the foreground direction from the background, the block C displayed in the background 1820 is covered by the movement of the background 1820.

FIG. 19 illustrates a case where the arrangement of the left eye image and the right eye image is reversed; thus, the foreground is shown behind and the background is shown in front.

FIG. 19(a) illustrates a case where, of the background 1910 and the foreground 1915, only the background 1910 moves. FIG. 19(b) illustrates that, when the motion vector is in the foreground direction from the background, the block E displayed in the foreground 1915 is covered by the movement of the background 1910.

FIG. 19(c) illustrates a case where, of the background 1925 and the foreground 1920, only the foreground 1920 moves. FIG. 19(d) illustrates that, when the motion vector is in the background direction from the foreground, the block F displayed in the foreground 1920 is covered by the movement of the foreground 1920.

That is, in FIG. 19, the block E or the block F displayed in the foreground 1915 or 1920 is obscured by the movement of the background 1910 or the foreground 1920, respectively.

Using the property that, in a correct arrangement, the background can be obscured but the foreground cannot, the image arrangement determination unit 1240 determines that the arrangement of the left eye image and the right eye image is reversed when the foreground is obscured.
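The decision rule above can be sketched crudely as follows, assuming segmented foreground masks for two consecutive frames and a single global motion vector for the foreground (both assumptions; the embodiment itself analyzes boundary areas of the depth map):

```python
import numpy as np

def foreground_occlusion_count(fg_prev, fg_curr, mv):
    """Count pixels where the foreground was obscured between two frames.
    Given the foreground mask of the previous frame and the foreground's
    motion vector mv = (dy, dx), predict where the foreground should appear
    in the current frame; predicted-foreground pixels that turn out to be
    background mean the foreground was covered -- which, per the rule
    above, is evidence that the left/right arrangement is reversed."""
    dy, dx = mv
    predicted = np.roll(np.asarray(fg_prev, dtype=bool), (dy, dx), axis=(0, 1))
    obscured = predicted & ~np.asarray(fg_curr, dtype=bool)
    return int(obscured.sum())
```

A count of zero is consistent with a correct arrangement; a large count suggests the arrangement is reversed.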

Next, the image display apparatus 100 displays an object indicating the determination result (S1135).

The image arrangement determination unit 1240 can quantify the degree to which the arrangement of the left eye image and the right eye image is reversed.

For example, the image arrangement determination unit 1240 may quantify the proportion of the boundary area in the depth map that is an occlusion area covered by another area, or that is an area covering another area.

That is, the ratio of the area of occlusion areas covered by another area, plus the area of areas covering another area, to the area of the entire boundary area in the depth map can be calculated as the left eye/right eye change probability.

The image arrangement determination unit 1240 can determine that the arrangement of the left eye image and the right eye image is reversed when the computed numerical value is equal to or larger than the reference value.

For example, if the left eye/right eye change probability is 30% or more, it can be determined that the arrangement of the left eye image and the right eye image is reversed.

Here, the reference value may vary according to at least one of the parallax information and the motion information. For example, the larger the parallax information, the larger the motion information, or a combination thereof, the larger the reference value may become. In other words, when the depth is large, or when there is a large amount of motion, the boundary area of the foreground or the background may be covered or uncovered unintentionally even with a correct arrangement; the reference value can therefore be raised in consideration of this.

On the other hand, the image arrangement determination unit 1240 can quantify the reversal of the left eye image and the right eye image over a plurality of frames. That is, it is also possible to calculate the left eye/right eye change probability value over a predetermined period, average the values, and determine whether the left eye image and the right eye image are reversed based on the averaged value.
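Putting the quantification, the variable reference value, and the temporal averaging together in one sketch (the percentage form, the linear dependence of the reference value on the disparity/motion magnitudes, and the factor k are all assumptions; the disclosure only states that the reference value may grow with larger parallax or motion information):

```python
def change_probability(obscured_area, covering_area, boundary_area):
    """Left eye/right eye change probability (in percent): the ratio of the
    boundary area that is covered by, or covers, another region."""
    return 100.0 * (obscured_area + covering_area) / boundary_area

def is_reversed(probabilities, base_ref=30.0, disparity_mag=0.0,
                motion_mag=0.0, k=0.1):
    """Average the per-frame probabilities over a period and compare them
    with a reference value that grows with the disparity and motion
    magnitudes (linear form and factor k are illustrative assumptions)."""
    ref = base_ref + k * (disparity_mag + motion_mag)
    return sum(probabilities) / len(probabilities) >= ref
```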

The image arrangement determination unit 1240 can output an object representing the numerical value to the display 180 so that the numerical value is displayed. Then, the display 180 can display such an object together with the 3D image.

Such an object may include at least one of a probability value, a color, a graphic image, and a pop-up window.

FIG. 20(a) illustrates the display of a 3D image 2010 in which the left eye image and right eye image arrangement is correct, together with a left eye/right eye change probability value 2015. The 3D image 2010 includes a foreground object 2014 and a background object 2012. In the figure, the left eye/right eye change probability value 2015 is exemplified as 15%. From this probability value 2015, the user can easily see that the arrangement of the left eye image and the right eye image is correct.

Next, FIG. 20(b) illustrates the display of a 3D image 2020 in which the left eye image and right eye image arrangement is reversed, together with a left eye/right eye change probability value 2025. The 3D image 2020 includes a foreground object 2024 and a background object 2022 with the left eye image and the right eye image reversed. In the figure, the left eye/right eye change probability value 2025 is exemplified as 80%. From this probability value 2025, the user can easily see that the arrangement of the left eye image and the right eye image is reversed.

On the other hand, unlike the numerical display shown in FIG. 20, various other displays are possible.

FIG. 21 illustrates the display of a 3D image 2020 in which the left eye image and right eye image arrangement is reversed, together with a bar-type object 2125 indicating the left eye/right eye change. The greater the degree of reversal of the left eye image and right eye image arrangement, the higher the displayed bar.

FIG. 22 illustrates the display of a 3D image 2020 in which the left eye image and right eye image arrangement is reversed, together with a color object 2225 indicating the left eye/right eye change. For example, a red object indicates that the left eye and right eye images are reversed, and a yellow object indicates that the arrangement of the left eye and right eye images is correct.

FIG. 23 illustrates the display of a 3D image 2020 in which the left eye image and right eye image arrangement is reversed, together with a pop-up window 2325 indicating the left eye/right eye change. Such a pop-up window may disappear after being displayed for a certain time.

Next, the image display apparatus 100 determines whether or not there is a replacement input for the left eye and right eye images (S1140). If so, the arrangement positions of the left eye image and the right eye image are replaced and displayed (S1145).

The control unit 170 displays the object indicating that the left eye and right eye images are reversed, and then determines whether there is a replacement input from the user for changing the arrangement of the left eye and right eye images. Such a replacement input can be made using a specific key or pointer of the remote control device, for example, under the 3D image setting menu.

The control unit 170 can replace the arrangement positions of the left eye image and the right eye image of the displayed 3D image according to the replacement input. Specifically, the image arrangement unit 1260 can replace the arrangement positions of the left eye image and the right eye image. Thus, the user can easily replace the arrangement positions of the left eye image and the right eye image.

FIG. 24A shows an example of replacement of the left eye and right eye images.

First, if there is a replacement input, an object 2410 representing the replacement input may be displayed on the display 180, as shown in FIG. 24A. At this time, when the replacement item 2412 is selected, the image arrangement unit 1260 replaces the arrangement positions of the left eye image and the right eye image of the displayed 3D image. That is, the 3D image 2010 with the correct arrangement can be displayed on the display 180, as shown in FIG. 24B.

On the other hand, when the image arrangement determination unit 1240 determines that the left eye image and the right eye image are reversed, it may control the arrangement of the left eye image and the right eye image to be changed automatically. Thereby, the image arrangement unit 1260 replaces the arrangement positions of the left eye image and the right eye image of the displayed 3D image. That is, the 3D image 2010 with the correct arrangement can be displayed on the display 180, as shown in FIG. 24B, without displaying the object indicating the replacement shown in FIG. 24A.

As a result, according to the embodiment of the present invention, it is possible to determine whether the left and right of a stereo image are reversed using only the information of the input image, and to present a guide to the user. Therefore, the user can avoid watching images with the left and right reversed for a long time, and can watch comfortably.

It is to be understood that the present invention is not limited to the configurations and methods of the embodiments described above, and that all or some of the embodiments may be selectively combined so that various modifications may be made.

Meanwhile, the operation method of the image display apparatus of the present invention can be implemented as a code that can be read by a processor on a recording medium readable by a processor included in the image display apparatus. The processor-readable recording medium includes all kinds of recording apparatuses in which data that can be read by the processor is stored. Examples of the recording medium that can be read by the processor include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like, and may also be implemented in the form of a carrier wave such as transmission over the Internet . In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that code readable by the processor in a distributed fashion can be stored and executed.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments, and it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

Claims (19)

Extracting a left eye image and a right eye image from an input signal;
Arranging and displaying the left eye image and the right eye image in a predetermined format;
Calculating parallax information using the left eye image and the right eye image;
Calculating motion information on any one of the left eye image and the right eye image;
Determining whether the arrangement of the left eye image and the right eye image is rearranged based on the motion information and the parallax information; And
And displaying an object indicating the result of the determination.
The method according to claim 1,
Wherein,
Detecting foreground and background in the depth map based on the parallax information; And
Determining that the arrangement of the left eye image and the right eye image is reversed when a partial area of the boundary area around the foreground is an occlusion area covered by the background area, or when a partial area of the boundary area around the background covers the foreground area.
3. The method according to claim 1, further comprising:
quantifying a degree to which the arrangement of the left eye image and the right eye image is reversed,
wherein the displaying of the object comprises displaying the quantified value as at least one of a probability value, a color, a graphic image, and a pop-up window.
4. The method according to claim 1, further comprising:
quantifying a degree to which the arrangement of the left eye image and the right eye image is reversed,
wherein the quantified value is a ratio of occluded areas covered by another area, or areas covering another area, among the border areas in the depth map.
5. The method according to claim 1, wherein the determining comprises:
quantifying a degree to which the arrangement of the left eye image and the right eye image is reversed; and
determining that the arrangement of the left eye image and the right eye image is reversed when the quantified value is equal to or greater than a reference value.
6. The method according to claim 5, wherein the reference value is variable according to at least one of the parallax information and the motion information.
7. The method according to claim 1, further comprising:
quantifying a degree to which the arrangement of the left eye image and the right eye image is reversed,
wherein the quantified value is proportional to at least one of the parallax information and the motion information.
8. The method according to claim 1, further comprising swapping and displaying the arrangement positions of the left eye image and the right eye image when a swap input for the left eye image and the right eye image is received.
9. The method according to claim 1, further comprising swapping and displaying the arrangement positions of the left eye image and the right eye image when it is determined that the arrangement of the left eye image and the right eye image is reversed.
10. A method of operating an image display apparatus, the method comprising:
extracting a left eye image and a right eye image from an input signal;
calculating parallax information using the left eye image and the right eye image;
calculating motion information on one of the left eye image and the right eye image;
determining, based on the motion information and the parallax information, whether the arrangement of the left eye image and the right eye image is reversed; and
swapping and displaying the arrangement positions of the left eye image and the right eye image when the arrangement is reversed.
11. An image display apparatus comprising:
an image extracting unit configured to extract a left eye image and a right eye image from an input signal;
an image arrangement unit configured to arrange the left eye image and the right eye image in a predetermined format;
a display configured to display the left eye image and the right eye image arranged in the format;
a parallax information calculation unit configured to calculate parallax information using the left eye image and the right eye image;
a motion information calculation unit configured to calculate motion information on one of the left eye image and the right eye image; and
an image arrangement determination unit configured to determine, based on the motion information and the parallax information, whether the arrangement of the left eye image and the right eye image is reversed,
wherein the display displays a determination result of the image arrangement determination unit.
12. The image display apparatus of claim 11, wherein the image arrangement determination unit detects a foreground and a background in a depth map based on the parallax information, and determines that the arrangement of the left eye image and the right eye image is reversed when a part of a border area around the foreground is an occlusion area covered by the background, or when a part of a border area around the background is located in the foreground area.
13. The image display apparatus of claim 11, wherein the image arrangement determination unit quantifies a degree to which the arrangement of the left eye image and the right eye image is reversed, and
the display displays the quantified value as the object, as at least one of a probability value, a color, a graphic image, and a pop-up window.
14. The image display apparatus of claim 11, wherein the image arrangement determination unit quantifies a degree to which the arrangement of the left eye image and the right eye image is reversed, and
the quantified value is a ratio of occluded regions covered by another region, or regions covering another region, among the border regions in the depth map.
15. The image display apparatus of claim 11, wherein the image arrangement determination unit quantifies a degree to which the arrangement of the left eye image and the right eye image is reversed, and determines that the arrangement of the left eye image and the right eye image is reversed when the quantified value is equal to or greater than a reference value.
16. The image display apparatus of claim 15, wherein the reference value is variable according to at least one of the parallax information and the motion information.
17. The image display apparatus of claim 11, wherein the image arrangement determination unit quantifies a degree to which the arrangement of the left eye image and the right eye image is reversed, and the quantified value is proportional to at least one of the parallax information and the motion information.
18. The image display apparatus of claim 11, wherein the display swaps and displays the arrangement positions of the left eye image and the right eye image when a swap input for the left eye image and the right eye image is received.
19. The image display apparatus of claim 11, wherein the display swaps and displays the arrangement positions of the left eye image and the right eye image when the arrangement of the left eye image and the right eye image is reversed.
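The detection logic the claims describe — find foreground and background in a depth map derived from the parallax information, check whether the occlusion relation at the foreground border is inverted, and compare a quantified score against a reference value that varies with the motion information — can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the sign convention (larger disparity = nearer), the `fg_thresh` segmentation, and the reference-value formula are all assumptions.

```python
import numpy as np

def reversal_score(disparity, fg_thresh=4.0):
    """Quantify how 'reversed' a stereo pair looks from its disparity map.

    Rough reading of claims 2 and 12: detect the foreground (large absolute
    disparity) and the background, then measure the fraction of foreground
    border pixels that appear occluded by the background, i.e. read as
    farther (smaller disparity) than an adjacent background pixel -- an
    ordering that should not occur when the left eye image and the right
    eye image are arranged correctly.
    """
    fg = np.abs(disparity) > fg_thresh
    h, w = disparity.shape
    border = occluded = 0
    for y in range(h):
        for x in range(w):
            if not fg[y, x]:
                continue
            for dy, dx in ((0, 1), (0, -1), (1, 0), (-1, 0)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and not fg[ny, nx]:
                    border += 1
                    if disparity[y, x] < disparity[ny, nx]:
                        occluded += 1  # foreground reads farther than background
                    break  # count each border pixel once
    return occluded / border if border else 0.0

def is_reversed(disparity, motion_mag, base_ref=0.5):
    """Claims 5 and 6 sketch: compare the score against a reference value
    that varies with the motion information. The exact dependence is an
    assumption -- here, stronger motion lowers the threshold slightly."""
    ref = base_ref / (1.0 + 0.1 * motion_mag)
    return reversal_score(disparity) >= ref
```

With a near object at disparity +8 on a +1 background the score is 0 and the pair passes; negating the map, which roughly simulates a swapped pair, drives the score to 1 and trips the threshold. A real implementation would first estimate the disparity map by stereo matching between the extracted left and right eye images.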
KR1020120123702A 2012-11-02 2012-11-02 Image display apparatus, and method for operating the same KR20140057058A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120123702A KR20140057058A (en) 2012-11-02 2012-11-02 Image display apparatus, and method for operating the same


Publications (1)

Publication Number Publication Date
KR20140057058A true KR20140057058A (en) 2014-05-12

Family

ID=50888094


Country Status (1)

Country Link
KR (1) KR20140057058A (en)


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination