KR20130120255A - Image display apparatus, and method for operating the same - Google Patents


Info

Publication number
KR20130120255A
Authority
KR
South Korea
Prior art keywords
viewer
image
display mode
display
displaying
Prior art date
Application number
KR1020120043363A
Other languages
Korean (ko)
Inventor
이용욱
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020120043363A
Publication of KR20130120255A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N 13/305 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H04N 13/361 Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
    • H04N 13/398 Synchronisation thereof; Control thereof

Abstract

The present invention relates to an image display apparatus and a method of operating the same. According to an embodiment of the present invention, an image display apparatus includes a display for displaying a plurality of viewpoint images, a lens unit disposed on a front surface of the display to separate the plurality of viewpoint images by direction, and a controller. In a mixed image display mode for displaying 2D images and 3D images together, when a first viewer is set to the 2D image display mode, the controller controls the display to present a 2D image based on location information of the first viewer, and when a second viewer is set to the 3D image display mode, the controller controls the display to present a 3D image based on location information of the second viewer. Accordingly, user convenience can be improved when displaying stereoscopic images by the autostereoscopic (glasses-free) method.

Description

[0001] The present invention relates to an image display apparatus and a method of operating the same.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an image display apparatus and an operation method thereof, and more particularly, to an image display apparatus and an operation method thereof that can improve user convenience when displaying a stereoscopic image by an autostereoscopic (glasses-free) method.

A video display device is a device having a function of displaying an image that a user can view. The user can view the broadcast through the video display device. A video display device displays a broadcast selected by a user among broadcast signals transmitted from a broadcast station on a display. Currently, broadcasting is changing from analog broadcasting to digital broadcasting around the world.

Digital broadcasting refers to broadcasting in which digital video and audio signals are transmitted. Digital broadcasting is more resistant to external noise than analog broadcasting, so it has less data loss, is advantageous for error correction, has a higher resolution, and provides a clearer picture. Also, unlike analog broadcasting, digital broadcasting is capable of bidirectional service.

It is an object of the present invention to provide an image display apparatus and an operation method thereof that can improve user convenience in stereoscopic image display by the autostereoscopic method.

Another object of the present invention is to provide an image display apparatus and a method of operating the same that can display 2D video or 3D video according to the selections of a plurality of viewers watching together in the autostereoscopic method.

An image display apparatus according to an embodiment of the present invention for achieving the above objects includes a display for displaying a multi-view image, a lens unit disposed on the front of the display to separate the multi-view image by direction, and a controller. In a mixed image display mode for displaying a 2D image and a 3D image together, when a first viewer is set to the 2D image display mode, the controller controls the display to present the 2D image based on location information of the first viewer, and when a second viewer is set to the 3D image display mode, the controller controls the display to present the 3D image based on location information of the second viewer.

In addition, a method of operating an image display apparatus according to an embodiment of the present invention for achieving the above objects includes: entering a mixed image display mode for displaying a 2D image and a 3D image together; receiving selection inputs for the image display modes of a first viewer and a second viewer; when the first viewer selects the 2D image display mode, displaying a 2D image to the first viewer using location information of the first viewer; and when the second viewer selects the 3D image display mode, displaying a 3D image to the second viewer using location information of the second viewer.
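The per-viewer steps above can be sketched as a small dispatch routine. This is an illustrative sketch, not the patent's implementation; the data layout (`mode`, `position` fields) and every name here are invented for illustration.

```python
def plan_mixed_display(viewers):
    """viewers: list of dicts with 'id', 'mode' ('2D' or '3D'), and
    'position' (x, y, z). Returns one render entry per viewer, steered
    toward that viewer's tracked position."""
    plan = []
    for v in viewers:
        if v["mode"] == "2D":
            # The same image is emitted toward this viewer's direction.
            plan.append({"viewer": v["id"], "render": "2D", "target": v["position"]})
        elif v["mode"] == "3D":
            # Left/right viewpoint images are steered to this viewer's eyes.
            plan.append({"viewer": v["id"], "render": "3D", "target": v["position"]})
        else:
            raise ValueError("mode must be '2D' or '3D'")
    return plan

viewers = [
    {"id": 1, "mode": "2D", "position": (-0.4, 0.0, 2.0)},
    {"id": 2, "mode": "3D", "position": (0.5, 0.0, 2.5)},
]
plan = plan_mixed_display(viewers)
print(plan)
```

A real controller would drive the lens unit and pixel arrangement from such a plan rather than returning dictionaries.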

According to an embodiment of the present invention, a 2D image or a 3D image can be displayed according to the selection of each of a plurality of viewers, thereby improving usability for the plurality of viewers.

In addition, even when the positions of a plurality of viewers overlap, stable 2D or 3D video can be displayed by switching to the same image display mode or by displaying a notification message.

FIG. 1 is a view showing the appearance of an image display apparatus according to an embodiment of the present invention.
FIG. 2 is a diagram showing the lens unit of the image display apparatus of FIG. 1 separated from the display.
FIG. 3 is an internal block diagram of an image display apparatus according to an embodiment of the present invention.
FIG. 4 is an internal block diagram of the control unit of FIG. 3.
FIG. 5 is a diagram showing a control method of the remote control apparatus of FIG. 3.
FIG. 6 is an internal block diagram of the remote control apparatus of FIG. 3.
FIG. 7 is a diagram illustrating an image formed by a left-eye image and a right-eye image.
FIG. 8 is a view for explaining the depth of a 3D image according to the interval between a left-eye image and a right-eye image.
FIG. 9 is a diagram referred to for explaining the principle of a glasses-free (autostereoscopic) image display apparatus.
FIGS. 10 to 14 are views referred to for explaining the principle of an image display apparatus that displays a plurality of viewpoint images.
FIG. 15 is a flowchart illustrating a method of operating an image display apparatus according to an exemplary embodiment.
FIGS. 16 to 22 are views for explaining the operating method of FIG. 15.
FIG. 23 is a flowchart illustrating a method of operating an image display apparatus according to an exemplary embodiment.
FIGS. 24 to 28 are views for explaining the operating method of FIG. 23.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffixes "module" and "part" for the components used in the following description are given merely for convenience of description, and have no special significance or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 is a view showing the appearance of an image display apparatus according to an embodiment of the present invention, and FIG. 2 is a diagram showing the lens unit of the image display apparatus of FIG. 1 separated from the display.

Referring to the drawings, an image display apparatus according to an embodiment of the present invention is capable of displaying a stereoscopic image, that is, a 3D image. The embodiments of the present invention exemplify an image display apparatus capable of glasses-free (autostereoscopic) 3D image display.

To this end, the image display apparatus 100 includes a display 180 and a lens unit 195.

The display 180 may display an input image, and in particular may display a plurality of viewpoint images according to an embodiment of the present invention. Specifically, the subpixels constituting the plurality of viewpoint images can be arranged and displayed in a predetermined pattern.

The lens unit 195 may be spaced apart from the display 180 and disposed toward the user. FIG. 2 illustrates the separation between the display 180 and the lens unit 195.

The lens unit 195 may vary the traveling direction of light according to an applied power source. For example, when a plurality of viewers watch a 2D image, a first power may be applied to the lens unit 195 so that light is emitted in the same direction as the light from the display 180. Accordingly, the image display apparatus 100 can provide the 2D image to the plurality of viewers.

On the other hand, when a plurality of viewers watch a 3D image, a second power may be applied to the lens unit 195 so that the light emitted from the display 180 is scattered by direction, and the 3D image can thereby be provided to the viewers.
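The two-state lens drive just described can be sketched in a few lines. This is a toy illustration, not from the patent: the voltage values and function names are invented, and a real lens unit is driven by the power supply hardware, not a lookup.

```python
FIRST_POWER_V = 0.0   # hypothetical: lens inactive, light passes straight through (2D)
SECOND_POWER_V = 5.0  # hypothetical: lens active, light separated by direction (3D)

def drive_lens_unit(mode):
    """Return the assumed drive voltage and resulting light behavior for a mode."""
    if mode == "2D":
        return {"voltage": FIRST_POWER_V, "light": "same direction as display"}
    if mode == "3D":
        return {"voltage": SECOND_POWER_V, "light": "separated per viewpoint"}
    raise ValueError("mode must be '2D' or '3D'")

state = drive_lens_unit("3D")
print(state)
```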

The lens unit 195 may use a lenticular method using a lenticular lens, a parallax method using a slit array, a method using a microlens array, or the like. The embodiments of the present invention mainly describe the lenticular method.

FIG. 3 is an internal block diagram of an image display apparatus according to an embodiment of the present invention.

Referring to FIG. 3, an image display apparatus 100 according to an embodiment of the present invention includes a broadcast receiving unit 105, an external device interface unit 130, a storage unit 140, a user input interface unit 150, a photographing unit 155, a sensor unit (not shown), a controller 170, a display 180, an audio output unit 185, a power supply unit 190, and a lens unit 195.

The broadcast receiving unit 105 may include a tuner unit 110, a demodulation unit 120, and a network interface unit 135. Of course, as necessary, it may be designed to include the tuner unit 110 and the demodulation unit 120 without the network interface unit 135, or conversely to include the network interface unit 135 without the tuner unit 110 and the demodulation unit 120.

The tuner unit 110 selects an RF broadcast signal corresponding to a channel selected by the user, or all pre-stored channels, from among the RF (Radio Frequency) broadcast signals received through the antenna. The selected RF broadcast signal is converted into an intermediate frequency signal, a baseband video signal, or an audio signal.

For example, if the selected RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF). If the selected RF broadcast signal is an analog broadcast signal, it is converted into an analog baseband image or voice signal (CVBS / SIF). That is, the tuner unit 110 can process a digital broadcast signal or an analog broadcast signal. The analog baseband video or audio signal (CVBS / SIF) output from the tuner unit 110 can be directly input to the controller 170.

The tuner unit 110 may receive an RF broadcast signal of a single carrier according to an Advanced Television System Committee (ATSC) scheme or an RF broadcast signal of a plurality of carriers according to a DVB (Digital Video Broadcasting) scheme.

Meanwhile, the tuner unit 110 may sequentially select the RF broadcast signals of all broadcast channels stored through a channel memory function from among the RF broadcast signals received through the antenna, and convert them into intermediate frequency signals, baseband video, or audio signals.

On the other hand, the tuner unit 110 may be provided with a plurality of tuners in order to receive broadcast signals of a plurality of channels. Alternatively, a single tuner that simultaneously receives broadcast signals of a plurality of channels is also possible.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

The demodulation unit 120 may perform demodulation and channel decoding, and then output a stream signal TS. In this case, the stream signal may be a signal multiplexed with a video signal, an audio signal, or a data signal.

The stream signal output from the demodulator 120 may be input to the controller 170. The control unit 170 performs demultiplexing, video / audio signal processing, and the like, and then outputs an image to the display 180 and outputs audio to the audio output unit 185.

The external device interface unit 130 can transmit or receive data to or from a connected external device 190. To this end, the external device interface unit 130 may include an A/V input/output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 130 can be connected, by wire or wirelessly, to an external device such as a DVD (Digital Versatile Disc) player, a Blu-ray player, a game device, a camera, a camcorder, or a computer, and may perform input/output operations with the external device.

The A / V input / output unit can receive video and audio signals from an external device. Meanwhile, the wireless communication unit can perform short-range wireless communication with other electronic devices.

The network interface unit 135 provides an interface for connecting the video display device 100 to a wired / wireless network including the Internet network. For example, the network interface unit 135 can receive, via the network, content or data provided by the Internet or a content provider or a network operator.

The storage unit 140 may store a program for each signal processing and control in the control unit 170 or may store the processed video, audio, or data signals.

In addition, the storage unit 140 may temporarily store video, audio, or data signals input to the external device interface unit 130. The storage unit 140 may also store information on predetermined broadcast channels through a channel memory function such as a channel map.

Although the storage unit 140 of FIG. 3 is separately provided from the controller 170, the scope of the present invention is not limited thereto. The storage unit 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the control unit 170 or a signal from the control unit 170 to the user.

For example, the user input interface unit 150 may receive user input signals such as power on/off, channel selection, and screen setting from the remote control apparatus 200, transfer user input signals from local keys (not shown), such as a power key, a channel key, a volume key, and a setting key, to the control unit 170, transfer user input signals from a sensor unit (not shown) that senses a user's gesture to the control unit 170, or transmit signals from the control unit 170 to the sensor unit (not shown).

The control unit 170 may demultiplex the stream input through the tuner unit 110, the demodulation unit 120, or the external device interface unit 130, or process the demultiplexed signals, to generate and output signals for video or audio output.

The video signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the video signal. Also, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

The audio signal processed by the control unit 170 may be output as sound through the audio output unit 185. The audio signal processed by the control unit 170 may also be input to an external output device through the external device interface unit 130.

Although not shown in FIG. 3, the controller 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 4.

In addition, the control unit 170 can control the overall operation of the video display device 100. For example, the control unit 170 may control the tuner unit 110 to tune to the RF broadcast corresponding to the channel selected by the user or to a previously stored channel.

In addition, the controller 170 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

Meanwhile, the control unit 170 may control the display 180 to display an image. At this time, the image displayed on the display 180 may be a still image or a moving image, and may be a 3D image.

Meanwhile, the controller 170 may generate a 3D object for a predetermined object among the images displayed on the display 180, and display the 3D object. For example, the object may be at least one of a connected web screen (newspaper, magazine, etc.), EPG (Electronic Program Guide), various menus, widgets, icons, still images, moving images, and text.

Such a 3D object may be processed to have a depth different from that of the image displayed on the display 180. Preferably, the 3D object may be processed to appear protruding from the image displayed on the display 180.

Meanwhile, the control unit 170 can recognize the position of the user based on an image photographed by the photographing unit 155. For example, the distance (z-axis coordinate) between the user and the image display apparatus 100 can be determined, as can the x-axis and y-axis coordinates in the display 180 corresponding to the user's position.
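One rough way to recover such (x, y, z) coordinates from a camera image, shown only as an illustration and not as the patent's method, is to take x/y from the center of a detected face box and estimate z from the apparent face size using a pinhole-camera approximation. The focal length and head-height constants below are assumptions.

```python
FOCAL_PX = 1000.0     # assumed camera focal length, in pixels
FACE_HEIGHT_M = 0.24  # assumed real-world head height, in meters

def viewer_position(face_box, image_size):
    """face_box: (left, top, width, height) in pixels; image_size: (w, h).
    Returns (x, y, z) in meters relative to the camera axis."""
    left, top, w, h = face_box
    img_w, img_h = image_size
    z = FOCAL_PX * FACE_HEIGHT_M / h   # similar triangles: distance from camera
    cx = left + w / 2 - img_w / 2      # pixel offset of face center from image center
    cy = top + h / 2 - img_h / 2
    x = cx * z / FOCAL_PX              # back-project pixel offsets to meters
    y = cy * z / FOCAL_PX
    return x, y, z

x, y, z = viewer_position((910, 490, 100, 120), (1920, 1080))
```

A face centered in the frame yields x near zero; a larger face box yields a smaller z.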

Although not shown in the drawing, a channel browsing processing unit for generating a thumbnail image corresponding to a channel signal or an external input signal may be further provided. The channel browsing processing unit may receive the stream signal TS output from the demodulation unit 120 or the stream signal output from the external device interface unit 130, and extract an image from the input stream signal to generate a thumbnail image. The generated thumbnail image may be input to the controller 170 as it is or after being encoded. The control unit 170 may display a thumbnail list including a plurality of thumbnail images on the display 180 using the input thumbnail images.

At this time, the thumbnail list may be displayed in a simple view mode, shown in a partial area while a predetermined image is displayed on the display 180, or in a full view mode, shown over most of the display 180. The thumbnail images in the thumbnail list can be updated sequentially.

The display 180 converts the video signal, data signal, or OSD signal processed by the controller 170, or the video signal, data signal, or control signal received from the external device interface unit 130, to generate a driving signal.

The display 180 may be a PDP, an LCD, an OLED, a flexible display, or the like, and may also be capable of 3D display.

As described above, the display 180 according to the embodiment of the present invention is a glasses-free 3D display that requires no separate glasses. For this purpose, the lenticular lens unit 195 is provided.

The power supply unit 190 supplies the overall power within the video display device 100. Accordingly, each module or unit in the video display device 100 can be operated.

Also, the display 180 may be configured to include a 2D image area and a 3D image area. In this case, the power supply unit 190 may supply a first power and a second power, different from each other, to the lens unit 195. The first power and the second power may be supplied under the control of the controller 170.

The lens unit 195 varies the traveling direction of the light according to the applied power source.

The first power may be applied to a first region of the lens unit corresponding to the 2D image area of the display 180, so that light is emitted in the same direction as the light from the 2D image area of the display 180. Accordingly, the user perceives the displayed 2D image as a 2D image.

As another example, the second power may be applied to a second region of the lens unit corresponding to the 3D image area of the display 180, so that the light emitted from the 3D image area of the display 180 is scattered. As a result, a 3D effect is produced, and the user perceives the displayed 3D image as a stereoscopic image without wearing separate glasses.

Meanwhile, the lens unit 195 may be disposed toward the user, spaced apart from the display 180. In particular, the lens unit 195 may be disposed parallel to the display 180 at a distance from it, inclined at a predetermined angle, or concave or convex. The lens unit 195 may also be arranged in sheet form; accordingly, the lens unit 195 according to the embodiment of the present invention may be called a lens sheet.

 Meanwhile, the display 180 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 185 receives the signal processed by the control unit 170 and outputs it as sound.

The photographing unit 155 photographs the user. The photographing unit 155 may be implemented by a single camera, but the present invention is not limited thereto, and it may be implemented by a plurality of cameras. Meanwhile, the photographing unit 155 may be embedded in the image display device 100 above the display 180, or may be disposed separately. The image information photographed by the photographing unit 155 may be input to the control unit 170.

The control unit 170 can sense the gesture of the user based on each of the images photographed by the photographing unit 155 or sensed signals from the sensor unit (not shown) or a combination thereof.

The remote control apparatus 200 transmits user input to the user input interface unit 150. To this end, the remote control apparatus 200 may use Bluetooth, RF (radio frequency) communication, infrared (IR) communication, UWB (Ultra Wideband), ZigBee, or the like. The remote control apparatus 200 may also receive the video, audio, or data signal output from the user input interface unit 150 and display it or output it as sound.

Meanwhile, the video display device 100 may be a digital broadcast receiver capable of receiving a fixed or mobile digital broadcast.

Meanwhile, the video display device described in this specification may include a TV set, a monitor, a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a PDA (personal digital assistant), a PMP (portable multimedia player), and the like.

Meanwhile, a block diagram of the image display apparatus 100 shown in FIG. 3 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 actually implemented. That is, two or more constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.

Unlike the configuration shown in FIG. 3, the video display apparatus 100 may not include the tuner unit 110 and the demodulation unit 120, and may instead receive and play back video content through the network interface unit 135 or the external device interface unit 130.

The image display apparatus 100 is an example of an image signal processing apparatus that performs signal processing on a stored or input image. Other examples of the image signal processing apparatus include a set-top box excluding the display 180 and the audio output unit 185 illustrated in FIG. 3, as well as the above-described DVD player, Blu-ray player, game device, computer, and the like.

FIG. 4 is an internal block diagram of the control unit of FIG. 3.

The control unit 170 includes a demultiplexing unit 310, an image processing unit 320, a processor 330, an OSD generating unit 340, a mixer 345, a frame rate conversion unit 350, and a formatter 360. It may further include an audio processing unit (not shown) and a data processing unit (not shown).

The demultiplexer 310 demultiplexes the input stream. For example, when an MPEG-2 TS is input, it can be demultiplexed into video, audio, and data signals, respectively. The stream signal input to the demultiplexer 310 may be a stream signal output from the tuner 110 or the demodulator 120 or the external device interface 130.
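The splitting of a multiplexed stream into its elementary signals can be sketched as follows. This is a simplified illustration, not the unit's actual implementation: real MPEG-2 TS parsing reads fixed 188-byte packets and a program map table, whereas here packets are plain `(pid, payload)` tuples and the PID-to-stream mapping is invented.

```python
# Hypothetical PIDs for this example only.
PID_VIDEO, PID_AUDIO, PID_DATA = 0x100, 0x101, 0x102

def demultiplex(packets):
    """Split (pid, payload) packets into video, audio, and data streams."""
    streams = {"video": [], "audio": [], "data": []}
    table = {PID_VIDEO: "video", PID_AUDIO: "audio", PID_DATA: "data"}
    for pid, payload in packets:
        kind = table.get(pid)
        if kind is not None:   # packets with unknown PIDs (e.g. null packets) are dropped
            streams[kind].append(payload)
    return streams

ts = [(0x100, b"v0"), (0x101, b"a0"), (0x100, b"v1"), (0x1FFF, b"null")]
out = demultiplex(ts)
```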

The image processing unit 320 may perform image processing of the demultiplexed video signal. To this end, the image processing unit 320 may include a video decoder 225 and a scaler 235.

The video decoder 225 decodes the demultiplexed video signal, and the scaler 235 performs scaling so that the decoded video signal has a resolution that can be output on the display 180.

The video decoder 225 may include a decoder of various standards.

On the other hand, the image signal decoded by the image processing unit 320 can be divided into a case where there is only a 2D image signal, a case where a 2D image signal and a 3D image signal are mixed, and a case where there is only a 3D image signal.

For example, an external video signal input from the external device 190, or a broadcast video signal received from the tuner unit 110, may contain only a 2D video signal, a mixture of a 2D video signal and a 3D video signal, or only a 3D video signal. Accordingly, the controller 170, and in particular the image processing unit 320, may process the signal so that a 2D video signal, a mixed signal of 2D and 3D video signals, or a 3D video signal is output.

Meanwhile, the image signal decoded by the image processing unit 320 may be a 3D image signal in various formats. For example, it may be a 3D image signal composed of a color image and a depth image, or a 3D image signal composed of a plurality of viewpoint image signals. The plurality of viewpoint image signals may include, for example, a left-eye image signal and a right-eye image signal.

Here, the formats of the 3D video signal include a side-by-side format in which the left-eye image signal L and the right-eye image signal R are arranged left and right, a top-down format in which they are arranged up and down, an interlaced format in which the left-eye image signal and the right-eye image signal are mixed line by line, a checker box format in which the left-eye image signal and the right-eye image signal are mixed box by box, and the like.
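Unpacking a few of these formats can be sketched directly. This is an illustration only: frames are represented as nested lists (rows of pixels) rather than decoded hardware buffers, and the function names are invented.

```python
def split_side_by_side(frame):
    """Left half of each row is the left-eye image, right half the right-eye image."""
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    return left, right

def split_top_down(frame):
    """Top half of the rows is the left-eye image, bottom half the right-eye image."""
    half = len(frame) // 2
    return frame[:half], frame[half:]

def split_interlaced(frame):
    """Left-eye and right-eye lines alternate row by row."""
    return frame[0::2], frame[1::2]

frame = [["L0", "L1", "R0", "R1"],
         ["L2", "L3", "R2", "R3"]]
left, right = split_side_by_side(frame)
```

Each split halves the per-eye resolution along one axis, which is why these packed formats trade resolution for compatibility with 2D transmission paths.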

The processor 330 may control the overall operation of the image display apparatus 100 or of the control unit 170. For example, the processor 330 may control the tuner unit 110 to tune to the RF broadcast corresponding to the channel selected by the user or to a previously stored channel.

In addition, the processor 330 may control the image display apparatus 100 by a user command input through the user input interface unit 150 or by an internal program.

In addition, the processor 330 may perform data transfer control with the network interface unit 135 or the external device interface unit 130.

The processor 330 may control operations of the demultiplexing unit 310, the image processing unit 320, the OSD generating unit 340, and the like in the controller 170.

The OSD generating unit 340 generates an OSD signal according to a user input or by itself. For example, based on a user input signal, it can generate a signal for displaying various information in graphic or text form on the screen of the display 180. The generated OSD signal may include various data such as a user interface screen of the video display device 100, various menu screens, widgets, and icons. The generated OSD signal may also include a 2D object or a 3D object.

The OSD generating unit 340 can generate a pointer that can be displayed on the display based on a pointing signal input from the remote control apparatus 200. In particular, such a pointer may be generated by a pointing signal processing unit, and the OSD generating unit 340 may include such a pointing signal processing unit (not shown). Of course, the pointing signal processing unit (not shown) may also be provided separately from the OSD generating unit 340.

The mixer 345 may mix the OSD signal generated by the OSD generator 340 and the decoded video signal processed by the image processor 320. At this time, the OSD signal and the decoded video signal may include at least one of a 2D signal and a 3D signal. The mixed video signal is supplied to a frame rate converter 350.

A frame rate converter (FRC) 350 can convert the frame rate of an input image. The frame rate converter 350 can also output the input image without frame rate conversion.
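The simplest form of such conversion can be sketched as frame repetition. This is a minimal illustration under an assumed constraint (output rate an integer multiple of the input rate); real FRC hardware typically interpolates motion between frames rather than repeating them.

```python
def convert_frame_rate(frames, in_hz, out_hz):
    """Repeat each frame when out_hz is an integer multiple of in_hz;
    otherwise pass the input through unchanged, as the text allows."""
    if out_hz % in_hz == 0:
        repeat = out_hz // in_hz
        return [f for f in frames for _ in range(repeat)]
    return list(frames)  # no conversion

doubled = convert_frame_rate(["f0", "f1"], 60, 120)
```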

The formatter 360 can arrange the frame rate converted 3D image.

The formatter 360 receives the mixed signal, i.e., the OSD signal and the decoded video signal, from the mixer 345, and separates the 2D video signal and the 3D video signal.

In this specification, a 3D video signal means a 3D object. Examples of the 3D object include a picture-in-picture (PIP) image (still image or moving picture), an EPG indicating broadcast program information, various menus, widgets, icons, text, objects in an image, people, backgrounds, and web screens (newspapers, magazines, etc.).

On the other hand, the formatter 360 can change the format of the 3D video signal. For example, when a 3D image is input in one of the various formats described above, it can be changed to a multi-view image; in particular, the multiple viewpoint images can be arranged so as to repeat. Thereby, a glasses-free 3D image can be displayed.
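The repeating multi-view arrangement can be sketched as a modulo interleave of N viewpoint images. This is an assumption-laden illustration: the column-wise, pixel-level interleave below is invented for clarity, whereas real lenticular panels interleave at subpixel granularity, often along a slanted pattern.

```python
def interleave_views(views):
    """views: list of N frames (each a list of rows of equal width).
    Column c of the output takes its pixel from view (c mod N), so the
    N viewpoints repeat across the panel."""
    n = len(views)
    height, width = len(views[0]), len(views[0][0])
    return [[views[c % n][r][c] for c in range(width)] for r in range(height)]

v0 = [["a0", "a1", "a2", "a3"]]  # viewpoint 0
v1 = [["b0", "b1", "b2", "b3"]]  # viewpoint 1
out = interleave_views([v0, v1])
```

The lens unit then steers each column group toward a different viewing direction, which is what makes the repetition necessary.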

Meanwhile, the formatter 360 may convert a 2D video signal into a 3D video signal. For example, according to a 3D image generation algorithm, an edge or a selectable object may be detected in the 2D video signal, and the object according to the detected edge, or the selectable object, may be separated to generate a 3D video signal. At this time, the generated 3D video signal may be a multiple viewpoint image signal as described above.
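The core idea, giving a detected object horizontal disparity, can be shown with a very rough sketch. Everything here is invented for illustration (the object is given as a column range rather than detected, the background fill ignores hole filling, and the shift amount is arbitrary); it is not the patent's algorithm.

```python
def make_stereo_pair(image, object_cols, shift, background="."):
    """image: list of rows (lists of pixels); object_cols: (start, end)
    columns occupied by the object. The right-eye copy shifts the object
    left by `shift` pixels, creating disparity against the left-eye copy."""
    left = [row[:] for row in image]
    right = [row[:] for row in image]
    s, e = object_cols
    for row in right:
        obj = row[s:e]
        row[s:e] = [background] * (e - s)  # erase object (hole filling omitted)
        row[s - shift:e - shift] = obj     # paste it shifted: horizontal disparity
    return left, right

img = [list("....XX..")]
left, right = make_stereo_pair(img, (4, 6), 1)
```

Viewed stereoscopically, the shifted object appears at a different depth than the unshifted background.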

Although not shown in the drawing, it is also possible that a 3D processor (not shown) for 3-dimensional effect signal processing is further disposed after the formatter 360. The 3D processor (not shown) can process the brightness, tint, and color of the image signal to improve the 3D effect.

Meanwhile, the audio processing unit (not shown) in the control unit 170 can perform the audio processing of the demultiplexed audio signal. To this end, the audio processing unit (not shown) may include various decoders.

In addition, the audio processing unit (not shown) in the control unit 170 can process bass, treble, volume control, and the like.

The data processing unit (not shown) in the control unit 170 can perform data processing of the demultiplexed data signal. For example, if the demultiplexed data signal is an encoded data signal, it can be decoded. The encoded data signal may be electronic program guide (EPG) information including broadcast information such as the start time and end time of programs broadcast on each channel.

FIG. 4 shows that the signals from the OSD generating unit 340 and the image processing unit 320 are mixed in the mixer 345 and then 3D-processed in the formatter 360; however, the mixer may instead be located after the formatter. That is, the output of the image processing unit 320 may be 3D-processed by the formatter 360, the OSD generating unit 340 may perform 3D processing together with OSD generation, and the processed 3D signals may then be mixed by the mixer 345.

Meanwhile, the block diagram of the controller 170 shown in FIG. 4 is a block diagram for an embodiment of the present invention. Each component of the block diagram can be integrated, added, or omitted according to the specifications of the control unit 170 actually implemented.

In particular, the frame rate converter 350 and the formatter 360 may not be provided in the control unit 170 but may instead be provided separately.

FIG. 5 is a diagram showing a method of controlling the remote control device of FIG. 3.

FIG. 5(a) illustrates that a pointer 205 corresponding to the remote control device 200 is displayed on the display 180.

The user can move or rotate the remote control device 200 up and down, left and right (FIG. 5(b)), and back and forth (FIG. 5(c)). The pointer 205 displayed on the display 180 of the image display device moves in correspondence with the movement of the remote control device 200. Because the pointer 205 is moved and displayed according to movement in 3D space, as shown in the figure, the remote control device 200 may be referred to as a spatial remote controller.

FIG. 5(b) illustrates that when the user moves the remote control device 200 to the left, the pointer 205 displayed on the display 180 of the image display device also shifts to the left correspondingly.

Information on the motion of the remote control device 200, sensed through the sensor of the remote control device 200, is transmitted to the image display device. The image display device can calculate the coordinates of the pointer 205 from the information on the motion of the remote control device 200 and display the pointer 205 at the calculated coordinates.
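The coordinate calculation itself is not detailed here. A minimal sketch of turning sensed angular rates into clamped screen coordinates might look as follows; the function name, the unit conventions, and the `gain` constant are all assumptions of this sketch, not values from the patent.

```python
def update_pointer(x, y, yaw_rate, pitch_rate, dt,
                   width=1920, height=1080, gain=600.0):
    """Integrate sensed angular rates (hypothetical rad/s) over dt seconds
    into pointer pixel coordinates, clamped to the screen bounds.
    'gain' is an assumed angle-to-pixels tuning constant."""
    x += yaw_rate * dt * gain
    y -= pitch_rate * dt * gain   # pitching the remote up moves the pointer up
    x = max(0, min(width - 1, x))
    y = max(0, min(height - 1, y))
    return x, y
```

Clamping keeps the pointer on screen no matter how fast the remote control device is swung.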

FIG. 5(c) illustrates a case in which the user moves the remote control device 200 away from the display 180 while pressing a specific button on the remote control device 200. In response, the selected area in the display 180 corresponding to the pointer 205 can be zoomed in and displayed enlarged. Conversely, when the user moves the remote control device 200 closer to the display 180, the selected area corresponding to the pointer 205 can be zoomed out and displayed reduced. Alternatively, the selected area may be zoomed out when the remote control device 200 moves away from the display 180 and zoomed in when it approaches the display 180.

Meanwhile, while the specific button on the remote control device 200 is pressed, recognition of up, down, left, and right movement may be excluded. That is, when the remote control device 200 moves away from or toward the display 180, the up, down, left, and right movements are not recognized, and only the back-and-forth movement is recognized. When the specific button is not pressed, only the pointer 205 moves in accordance with the up, down, left, and right movement of the remote control device 200.

On the other hand, the moving speed and moving direction of the pointer 205 may correspond to the moving speed and moving direction of the remote control device 200.

FIG. 6 is an internal block diagram of the remote control device of FIG. 3.

The remote control device 200 includes a wireless communication unit 420, a user input unit 430, a sensor unit 440, an output unit 450, a power supply unit 460, a storage unit 470, and a control unit 480.

The wireless communication unit 420 transmits and receives signals to and from any of the image display devices according to the embodiments of the present invention described above. Among them, the image display device 100 will be described as an example.

In this embodiment, the remote control apparatus 200 may include an RF module 421 capable of transmitting and receiving signals with the image display apparatus 100 according to the RF communication standard. In addition, the remote control apparatus 200 may include an IR module 423 capable of transmitting and receiving signals to and from the image display apparatus 100 according to the IR communication standard.

In the present embodiment, the remote control device 200 transmits a signal containing information on the motion and the like of the remote control device 200 to the image display device 100 through the RF module 421.

Also, the remote control device 200 can receive the signal transmitted by the image display device 100 through the RF module 421. In addition, the remote control device 200 can transmit commands regarding power on/off, channel change, volume change, and the like to the image display device 100 through the IR module 423 as necessary.

The user input unit 430 may include a keypad, a button, a touchpad, or a touch screen. The user can input a command related to the image display device 100 to the remote control device 200 by operating the user input unit 430. When the user input unit 430 includes a hard key button, the user can input a command related to the image display device 100 to the remote control device 200 by pushing the hard key button. When the user input unit 430 has a touch screen, the user can input a command related to the image display device 100 to the remote control device 200 by touching a soft key on the touch screen. In addition, the user input unit 430 may include various other input means operable by the user, such as a scroll key or a jog key, and these examples do not limit the scope of the present invention.

The sensor unit 440 may include a gyro sensor 441 or an acceleration sensor 443. The gyro sensor 441 may sense information about the movement of the remote controller 200.

For example, the gyro sensor 441 can sense information about the operation of the remote control device 200 based on the x, y, and z axes. The acceleration sensor 443 may sense information about a moving speed of the remote controller 200. On the other hand, a distance measuring sensor can be further provided, whereby the distance to the display 180 can be sensed.

The output unit 450 may output an image or an audio signal corresponding to the operation of the user input unit 430 or corresponding to a signal transmitted from the image display device 100. Through the output unit 450, the user can recognize whether the user input unit 430 has been operated or whether the image display device 100 has been controlled.

For example, the output unit 450 may include an LED module 451 that lights up when the user input unit 430 is manipulated or a signal is transmitted or received through the wireless communication unit 420, a vibration module 453 for generating vibration, a sound output module 455 for outputting sound, or a display module 457 for outputting an image.

The power supply unit 460 supplies power to the remote control device 200. The power supply unit 460 may reduce power waste by stopping the power supply when the remote controller 200 does not move for a predetermined time. The power supply unit 460 may resume power supply when a predetermined key provided in the remote control device 200 is operated.

The storage unit 470 may store various programs, application data, and the like necessary for the control or operation of the remote control device 200. When the remote control device 200 transmits and receives signals wirelessly to and from the image display device 100 through the RF module 421, the remote control device 200 and the image display device 100 exchange signals over a predetermined frequency band. The control unit 480 of the remote control device 200 may store, in the storage unit 470, information on the frequency band over which signals can be wirelessly exchanged with the paired image display device 100, and may refer to this information.

The control unit 480 controls various matters related to the control of the remote control device 200. The control unit 480 transmits, through the wireless communication unit 420, a signal corresponding to a predetermined key operation of the user input unit 430 or a signal corresponding to the motion of the remote control device 200 sensed by the sensor unit 440 to the image display device 100.

The user input interface unit 150 of the image display device 100 may include a wireless communication unit 411 capable of wirelessly transmitting and receiving signals to and from the remote control device 200, and a coordinate value calculation unit 415 capable of calculating the coordinate value of the pointer corresponding to the operation of the remote control device 200.

The user input interface unit 150 can wirelessly transmit and receive signals to and from the remote control device 200 through the RF module 412. It can also receive, through the IR module 413, a signal transmitted by the remote control device 200 according to the IR communication standard.

The coordinate value calculation unit 415 may correct hand shake or errors in the signal corresponding to the operation of the remote control device 200 received through the wireless communication unit 411, and calculate the coordinate value (x, y) of the pointer 205 to be displayed on the display 180.

The transmission signal of the remote control device 200, input to the image display device 100 through the user input interface unit 150, is transmitted to the control unit 170 of the image display device 100. The control unit 170 can determine information on the operation and key manipulation of the remote control device 200 from the transmitted signal and control the image display device 100 accordingly.

As another example, the remote control device 200 may itself calculate the pointer coordinate value corresponding to its operation and transmit it to the user input interface unit 150 of the image display device 100. In this case, the user input interface unit 150 of the image display device 100 can transmit the received pointer coordinate information to the control unit 170 without a separate hand-shake or error correction process.

As another example, unlike the drawing, the coordinate value calculation unit 415 may be provided in the control unit 170 instead of the user input interface unit 150.

FIG. 7 is a view for explaining how images are formed by a left eye image and a right eye image, and FIG. 8 is a view for explaining depths of a 3D image according to an interval between a left eye image and a right eye image.

First, referring to FIG. 7, a plurality of images or a plurality of objects 515, 525, 535, 545 are illustrated.

First, the first object 515 includes a first left-eye image 511 (L) based on a first left-eye image signal and a first right-eye image 513 (R) based on a first right-eye image signal, and the interval between the first left-eye image 511 (L) and the first right-eye image 513 (R) on the display 180 is d1. At this time, the user perceives that an image is formed at the intersection of the extension line connecting the left eye 501 and the first left-eye image 511 and the extension line connecting the right eye 503 and the first right-eye image 513. Accordingly, the user perceives the first object 515 as being located behind the display 180.

Next, the second object 525 includes the second left-eye image 521 (L) and the second right-eye image 523 (R), which are displayed overlapping each other on the display 180, so their interval is 0. Accordingly, the user perceives the second object 525 as being located on the display 180.

Next, the third object 535 includes the third left-eye image 531 (L) and the third right-eye image 533 (R), and the fourth object 545 includes the fourth left-eye image 541 (L) and the fourth right-eye image 543 (R); their intervals are d3 and d4, respectively.

According to the method described above, the user perceives the third object 535 and the fourth object 545 as located at the positions where their images are formed, which in the drawing are in front of the display 180.

At this time, the fourth object 545 is perceived as protruding further forward than the third object 535. This is because the interval d4 between the fourth left-eye image 541 (L) and the fourth right-eye image 543 (R) is larger than the interval d3 between the third left-eye image 531 (L) and the third right-eye image 533 (R).

Meanwhile, in the embodiment of the present invention, the distance between the display 180 and the objects 515, 525, 535, and 545 as perceived by the user is expressed as a depth. Here, it is assumed that the depth is negative (-) when the object is perceived as being located behind the display 180 and positive (+) when it is perceived as being located in front of the display 180. That is, the greater the degree of protrusion toward the user, the greater the depth value.

Referring to FIG. 8, the interval a between the left-eye image 601 and the right-eye image 602 in FIG. 8(a) is smaller than the interval b between the left-eye image 601 and the right-eye image 602 in FIG. 8(b), and correspondingly the depth a' of the 3D object in FIG. 8(a) is smaller than the depth b' of the 3D object in FIG. 8(b).

In this way, when a 3D image is composed of a left-eye image and a right-eye image, the position at which the image is perceived varies with the interval between the left-eye image and the right-eye image. Accordingly, by adjusting the display interval between the left-eye image and the right-eye image, the depth of a 3D image or 3D object composed of a left-eye image and a right-eye image can be adjusted.
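The geometry of FIGS. 7 and 8 can be worked out with similar triangles. With interocular distance e, viewing distance D, and on-screen interval d, a crossed pair is perceived in front of the screen at depth D·d/(e+d) and an uncrossed pair behind it at depth D·d/(e−d). This model and the parameter values below are an illustrative reconstruction, not figures from the patent.

```python
def perceived_depth(d, e=65.0, D=2000.0, crossed=True):
    """Similar-triangles estimate of perceived depth (mm) for a left/right
    image pair separated by d mm on the screen.

    e: interocular distance, D: viewing distance (assumed values).
    crossed=True  -> images crossed, object appears in front (+ depth);
    crossed=False -> uncrossed, object appears behind (- depth),
    matching the sign convention used above.
    """
    if crossed:
        return D * d / (e + d)    # in front of the screen
    return -D * d / (e - d)       # behind the screen (d < e assumed)
```

Zero interval yields zero depth (the object sits on the screen, like the second object 525), and a larger interval yields a larger depth, matching FIG. 8.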

FIG. 9 is a diagram referred to in explaining the principle of a glasses-free stereoscopic image display device.

As described above, glasses-free stereoscopic display methods include the lenticular method, the parallax method, and a method using a microlens array. Hereinafter, the lenticular method and the parallax method will be described in detail. Also, in the following, the viewpoint images are exemplified as consisting of a left-eye viewpoint image and a right-eye viewpoint image, but this is for convenience of description and the invention is not limited thereto.

FIG. 9(a) shows the lenticular method using a lenticular lens. Referring to FIG. 9(a), blocks 720 (L) constituting the left-eye viewpoint image and blocks 710 (R) constituting the right-eye viewpoint image may be alternately arranged on the display 180. Each block may include a plurality of pixels, but may also consist of a single pixel. Hereinafter, the case where each block consists of one pixel will mainly be described.

In the lenticular method, a lenticular lens 195a is disposed in the lens unit 195, and the lenticular lens 195a disposed in front of the display 180 can change the traveling direction of the light emitted from the pixels 710 and 720. For example, the light emitted from the pixels 720 (L) constituting the left-eye viewpoint image is redirected toward the viewer's left eye 702, and the light emitted from the pixels 710 (R) constituting the right-eye viewpoint image is redirected toward the viewer's right eye 701.

Accordingly, in the left eye 702, the light emitted from the pixels 720 (L) constituting the left-eye viewpoint image merges so that the left-eye viewpoint image is seen, and in the right eye 701, the light emitted from the pixels 710 (R) constituting the right-eye viewpoint image merges so that the right-eye viewpoint image is seen. The viewer thus perceives a stereoscopic image without wearing glasses.

FIG. 9(b) shows the parallax method using a slit array. Referring to FIG. 9(b), as in FIG. 9(a), pixels 720 (L) constituting the left-eye viewpoint image and pixels 710 (R) constituting the right-eye viewpoint image may be alternately arranged on the display 180. In the parallax method, a slit array 195b is disposed in the lens unit 195; the slit array 195b functions as a barrier so that the light emitted from the pixels can travel only in a specific direction. Accordingly, as in the lenticular method, the user sees the left-eye viewpoint image with the left eye 702 and the right-eye viewpoint image with the right eye 701, and the viewer perceives a stereoscopic image without wearing glasses.

FIGS. 10 to 14 are diagrams for explaining the principle of an image display device including a plurality of viewpoint images.

FIG. 10 is a diagram showing a glasses-free image display device 100 in which three viewpoint regions 821, 822, and 823 are formed; in each of the three viewpoint regions 821, 822, and 823, a corresponding viewpoint image can be recognized.

The pixels constituting the three viewpoint images may be rearranged and displayed on the display 180 as shown in FIG. 10, so that the three viewpoint images are recognized in the three viewpoint regions 821, 822, and 823, respectively. Here, rearranging the pixels means changing the value of the pixel displayed on the display 180, not changing the physical position of the pixel.
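The rearrangement just described can be sketched as a simple horizontal interleave, where the value displayed at column x is taken from view x mod n. The real pattern depends on the lens unit 195 (see FIG. 13), so this function is only an illustration; its name and layout are this sketch's own.

```python
def interleave_views(views):
    """Rearrange n viewpoint images (each a list of pixel rows) into one
    display frame by taking pixel column x from view (x mod n).

    Note this changes which *value* each display pixel shows, not any
    physical pixel position, matching the description above."""
    n = len(views)
    h, w = len(views[0]), len(views[0][0])
    return [[views[x % n][y][x] for x in range(w)] for y in range(h)]
```

With three views, every third column comes from the same view, which is why the horizontal resolution of the interleaved 3D frame drops to 1/3 of the 2D resolution while the vertical resolution is unchanged.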

The three viewpoint images may be images of the object 910 taken from different directions, as shown in FIG. 11. For example, FIG. 11(a) may show an image taken in a first direction, FIG. 11(b) an image taken in a second direction, and FIG. 11(c) an image taken in a third direction, where the first, second, and third directions are different from one another.

For example, FIG. 11(a) may be an image taken from the left of the object 910, FIG. 11(b) an image taken from the front of the object 910, and FIG. 11(c) an image taken from the right of the object 910.

The first pixel 811 displayed on the display 180 may be composed of a first subpixel 801, a second subpixel 802, and a third subpixel 803, and each of the subpixels 801, 802, and 803 may represent one of red, green, and blue.

FIG. 10 shows one pattern obtained by rearranging the pixels constituting the three viewpoint images; however, the present invention is not limited thereto, and the pixels may be rearranged and displayed in various patterns depending on the lens unit 195.

In FIG. 10, the subpixels 801, 802, and 803 labeled with the numeral 1 are subpixels constituting the first viewpoint image, the subpixels labeled 2 are subpixels constituting the second viewpoint image, and the subpixels labeled 3 are subpixels constituting the third viewpoint image.

Accordingly, in the first viewpoint region 821, the subpixels labeled 1 combine so that the first viewpoint image is recognized; in the second viewpoint region 822, the subpixels labeled 2 combine so that the second viewpoint image is recognized; and in the third viewpoint region 823, the subpixels labeled 3 combine so that the third viewpoint image is recognized.

That is, the first viewpoint image 901, the second viewpoint image 902, and the third viewpoint image 903 shown in FIG. 11 represent the images displayed along the respective viewpoint directions. The first viewpoint image 901 may be an image taken in the first viewpoint direction, the second viewpoint image 902 an image taken in the second viewpoint direction, and the third viewpoint image 903 an image taken in the third viewpoint direction.

Therefore, when the viewer's left eye 922 is located in the third viewpoint region 823 and the right eye 921 is located in the second viewpoint region 822 as shown in FIG. 12(a), the left eye 922 sees the third viewpoint image 903 and the right eye 921 sees the second viewpoint image 902.

At this time, the third viewpoint image 903 serves as the left-eye image and the second viewpoint image 902 as the right-eye image. Accordingly, as shown in FIG. 12(b), the viewer perceives the object 910 as being located in front of the display 180 and thus perceives a stereoscopic image without wearing glasses.

Also, when the viewer's left eye 922 is located in the second viewpoint region 822 and the right eye 921 in the first viewpoint region 821, a stereoscopic image (3D image) can likewise be recognized.

Meanwhile, when the pixels of the plural viewpoint images are rearranged only in the horizontal direction as shown in FIG. 10, the horizontal resolution is reduced to 1/n (n being the number of viewpoint images) of that of a 2D image. For example, the horizontal resolution of the stereoscopic image (3D image) of FIG. 10 is reduced to 1/3 of that of a 2D image. The vertical resolution, on the other hand, remains the same as that of the multi-view images 901, 902, and 903 before rearrangement.

When the number of viewpoint images per direction is large (the reason for increasing the number of viewpoint images will be described later with reference to FIG. 14), only the horizontal resolution is reduced relative to the vertical resolution, so there is a problem that the image quality may be unevenly degraded.

Accordingly, as shown in FIG. 13, the lens unit 195 may be disposed on the front surface of the display 180, inclined at a predetermined angle α with respect to the longitudinal axis 185 of the display, and the subpixels constituting the multi-view images may be rearranged and displayed in various patterns to correspond to the inclined lens unit 195. FIG. 13 shows an image display device including viewpoint images in 25 directions according to an embodiment of the present invention. In this case, the lens unit 195 may be a lenticular lens or a slit array.

Referring to FIG. 13, the red subpixels constituting, for example, the sixth viewpoint image appear once every five pixels in both the horizontal and vertical directions; accordingly, the horizontal and vertical resolutions of the stereoscopic image (3D image) are each reduced to 1/5 of those of the per-direction multi-view images before rearrangement. The degradation in resolution can therefore be balanced, compared with the conventional method in which only the horizontal resolution is reduced to 1/25.
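The resolution bookkeeping here can be illustrated with a plain 5×5 tile. The true slanted-lens assignment depends on the angle α and the lens pitch, so the mapping below is a simplified stand-in of this sketch's own design, not the pattern of FIG. 13; it only shows why spreading 25 views over a 5×5 block costs 1/5 of the resolution in each direction instead of 1/25 horizontally.

```python
def view_index(x, y, views_x=5, views_y=5):
    """Which of the 25 viewpoint images (numbered 1..25) drives pixel (x, y),
    using a simple 5x5 tile as an illustrative stand-in for the real
    slanted-lens mapping."""
    return (x % views_x) + views_x * (y % views_y) + 1
```

Under this mapping, each viewpoint image recurs every 5 pixels horizontally and every 5 pixels vertically, and each of the 25 views receives exactly 1/25 of the display pixels.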

FIG. 14 is a diagram for explaining the sweet zone and the dead zone appearing in front of the image display device.

When a stereoscopic image is viewed using the image display device 100 described above, a plurality of viewers can perceive the stereoscopic effect without wearing special stereoscopic glasses, but only within a certain area.

There is an area in which the viewer can view an optimal image, which can be defined by the optimal viewing distance (OVD) D and the sweet zone 1020. First, the optimal viewing distance D can be determined by the distance between the left and right eyes, the pitch of the lens unit, and the focal length of the lens.
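One way to see how eye separation, lens pitch, and focal length combine is a pinhole-lens model: treating each lens as a pinhole at focal distance f from the pixel plane, a pixel step of one view pitch p projects to an eye-separation step e at distance D, giving D ≈ f·e/p by similar triangles. This model and all parameter values are illustrative assumptions of this sketch, not the patent's derivation.

```python
def optimal_viewing_distance(e_mm=65.0, f_mm=2.0, p_mm=0.1):
    """OVD under a pinhole-lens model: D / e = f / p, so D = f * e / p.
    e: interocular distance, f: lens focal distance, p: per-view pixel
    pitch behind one lens. All values are illustrative."""
    return f_mm * e_mm / p_mm
```

The model at least shows the qualitative dependence: a finer view pitch or a longer focal length pushes the optimal viewing distance further from the display.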

The sweet zone 1020 is an area in which the plurality of viewpoint regions are positioned sequentially, so that the viewer can stably perceive a three-dimensional effect. As shown in FIG. 14, when the viewer is located in the sweet zone 1020, for example with the 12th to 14th viewpoint images recognized in the right eye 1001 and the 17th to 19th viewpoint images in the left eye 1002, the per-direction viewpoint images are recognized sequentially in the left eye 1002 and the right eye 1001. Therefore, as described with reference to FIG. 12, the stereoscopic effect can be perceived through the left-eye image and the right-eye image.

On the other hand, when the viewer leaves the sweet zone 1020 and is located in the dead zone 1015, for example with the 1st to 3rd viewpoint images recognized in the left eye 1003 and the 23rd to 25th viewpoint images recognized in the right eye 1004, the per-direction viewpoint images are not recognized sequentially in the left eye 1003 and the right eye 1004, and an inversion of the left-eye and right-eye images may occur, so that the three-dimensional effect is not perceived. Furthermore, when the 1st viewpoint image and the 25th viewpoint image are recognized simultaneously in the left eye 1003 or the right eye 1004, dizziness may be felt.

The size of the sweet zone 1020 can be defined by the number n of per-direction viewpoint images and the distance corresponding to one viewpoint. Since the distance corresponding to one viewpoint must be smaller than the distance between the viewer's eyes, there is a limit to increasing that distance; therefore, to increase the size of the sweet zone 1020, the number n of viewpoint images must be increased.
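The relationship just described can be captured in a one-line calculation; the function name and the numeric values are illustrative, and the constraint check encodes the stated requirement that the per-view distance stay below the eye separation.

```python
def sweet_zone_width(n_views, view_pitch_mm, eye_separation_mm=65.0):
    """Sweet-zone width = (number of per-direction views) x (distance per
    view). The per-view distance must stay below the eye separation, so
    the only free lever for widening the zone is n_views. Values are
    illustrative."""
    if view_pitch_mm >= eye_separation_mm:
        raise ValueError("per-view distance must be below the eye separation")
    return n_views * view_pitch_mm
```

For example, 25 views at an assumed 32.5 mm per view give a zone of 812.5 mm, and doubling the view count doubles the zone.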

On the other hand, when there are a plurality of viewers in a video display device that provides 3D video, some viewers may want to watch 2D video.

However, in the image display device of FIG. 13, as shown in FIG. 18, the first viewer (a), located in the 19th to 24th viewpoint regions, sequentially recognizes the 19th to 24th viewpoint images in the left eye and the right eye, and the second viewer (b), located in the 12th to 17th viewpoint regions, sequentially recognizes the 12th to 17th viewpoint images in the left eye and the right eye.

Therefore, since the 3D image is displayed to both the first viewer (a) and the second viewer (b), there is a problem in that, when only the first viewer (a) among the plurality of viewers wants to watch a 2D image, the 2D image cannot be displayed to the first viewer (a) alone.

To address this, an embodiment of the present invention provides an image display device capable of displaying a 2D image and a 3D image together. Hereinafter, this image display device will be described in detail.

FIG. 15 is a flowchart illustrating a method of operating the image display device according to an embodiment of the present invention, and FIGS. 16 to 22 are diagrams referred to in explaining the operating method of FIG. 15.

Referring to FIG. 15, the control unit 170 enters a mixed image display mode, in which 2D and 3D images can be displayed simultaneously, through a menu selection or the like (S1110).

For example, as illustrated in FIG. 16, a display mode selection menu 1201 including an object 1210 indicating the mixed image display mode, an object 1220 indicating the 2D image display mode, and an object 1230 indicating the 3D image display mode is displayed on the display 180, and the image display mode may be determined through an input selecting any one of the objects 1210, 1220, and 1230.

Here, the 2D image display mode displays a 2D image to all viewers, and the 3D image display mode displays a 3D image to all viewers; in the mixed image display mode, each of a plurality of viewers selects either 2D image display or 3D image display, and a 2D image or a 3D image is then displayed to each viewer according to the selected display mode.

When the controller 170 receives an input for selecting the object 1210 indicating the mixed image display mode, the controller 170 enters the mixed image display mode.

Upon entering the mixed image display mode, the control unit 170 receives, for each of the plurality of viewers, an input selecting whether to watch in the 2D image display mode or the 3D image display mode (S1120).

For example, as illustrated in FIG. 17A, images 1231 and 1232 representing the plurality of viewers may be displayed on the display 180. In this case, the images 1231 and 1232 representing the plurality of viewers may be images of the viewers captured by the photographing unit.

A viewer may select the image 1231 representing the first viewer by moving the pointer 1270 displayed on the display 180 using the remote control device 200, for example a spatial remote controller, and then drag and drop it onto the object 1251 indicating the 2D image display mode to select that mode.

Alternatively, the object 1251 indicating the 2D image display mode may be selected first and then dragged and dropped onto the image 1231 representing the viewer who wants to watch the 2D image.

When the image display mode for the first viewer is determined as the 2D image display mode through such a selection, an object 1261 indicating that the 2D image display mode has been selected for the first viewer can be displayed on the display 180 together with the image 1231 representing the first viewer, so that the plurality of viewers can easily recognize the selected image display mode.

Referring to FIG. 17B, to determine the image display mode for the second viewer, the image 1232 representing the second viewer may be selected with the remote control device 200 and, through the same drag-and-drop operation as above onto the object 1252 indicating the 3D image display mode, the image display mode of the second viewer may be determined as the 3D image display mode.

When the image display mode for the second viewer is determined, an object 1262 indicating that the 3D image display mode has been selected for the second viewer may be displayed on the display 180 together with the image 1232 representing the second viewer.

When the image display mode has been determined for each of the plurality of viewers, an object 1280 indicating completion of setting is displayed on the display 180 as shown in FIG. 17C, and when the object 1280 is selected, the image display mode of each of the plurality of viewers is set.

Also, the photographing unit 155 may acquire the position information of the first viewer and the position information of the second viewer by using a tracking technique; the tracking technique may be an eye tracking technique or a head tracking technique.

The control unit 170 determines whether the 2D image display mode has been selected for each of the plurality of viewers (S1130). For the first viewer, for whom the 2D image display mode was selected, the 2D image is displayed using the position information of the first viewer (S1140); for the second viewer, for whom the 3D image display mode was selected, the 3D image is displayed using the position information of the second viewer (S1150).

To display the 2D image to the first viewer, the control unit 170 may display the plurality of viewpoint images recognized by the left eye and the right eye of the first viewer as the same viewpoint image, as shown in FIGS. 19 and 20, or may display the plurality of viewpoint images such that the arrangement of the viewpoint images recognized by the left eye of the first viewer is the same as the arrangement recognized by the right eye, as shown in FIGS. 21 and 22.

In addition, the controller 170 may display different view images to the left and right eyes of the second viewer in order to display the 3D image to the second viewer.

For example, as illustrated in FIG. 19, all of the subpixels constituting the 19th to 24th viewpoint images recognized by the left eye and the right eye of the first viewer may be switched to and displayed as the subpixels 1310 constituting the 19th viewpoint image.

When the plurality of viewpoint images are displayed as shown in FIG. 19, the first viewer (a) no longer sequentially recognizes the 19th to 24th viewpoint images in the left eye and the right eye as shown in FIG. 18; instead, as shown in FIG. 20, only the 19th viewpoint image is recognized in the left eye and the right eye, so that the 2D image can be viewed.

In addition, the subpixels constituting the 12th to 17th viewpoint images recognized in the left eye and the right eye of the second viewer (b) remain displayed in the same arrangement as shown in FIG. 13, so that the second viewer (b), independently of the first viewer, recognizes the 12th to 17th viewpoint images in the left eye and the right eye and may view the 3D image.
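The subpixel switching described above can be modeled with a short sketch. This is purely illustrative and not the patent's implementation: the function name and the flat list-of-viewpoint-indices representation of the display are assumptions made for clarity.

```python
# Illustrative model of the remapping in FIGS. 19-20: each subpixel
# carries one viewpoint index; the 2D viewer's viewpoints (19-24) all
# collapse to a single viewpoint (19), while the 3D viewer's
# viewpoints (12-17) are left untouched.

def remap_to_single_view(subpixel_views, viewer_views, target_view):
    """Replace every viewpoint index in viewer_views with target_view."""
    return [target_view if v in viewer_views else v
            for v in subpixel_views]

# Viewpoints 19-24 (first viewer) become viewpoint 19, so both of that
# viewer's eyes receive the same image (2D); viewpoints 12-17 (second
# viewer) keep their parallax, so 3D viewing continues.
views = [12, 13, 14, 15, 16, 17, 19, 20, 21, 22, 23, 24]
print(remap_to_single_view(views, set(range(19, 25)), 19))
# -> [12, 13, 14, 15, 16, 17, 19, 19, 19, 19, 19, 19]
```

Because the second viewer's viewpoint indices pass through unchanged, the same display frame serves a 2D region and a 3D region simultaneously.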

In FIG. 19 and FIG. 20, the subpixels constituting the 19th to 24th viewpoint images are represented as the subpixels 1310 constituting the 19th viewpoint image, but the present invention is not limited thereto. The controller 170 may control the subpixels constituting the 19th to 24th viewpoint images to be displayed in the same manner as the subpixels constituting any one of the 19th to 24th viewpoint images.

In addition, the 2D image may be recognized by making the arrangement of the viewpoint images recognized by the left eye of the first viewer (a) the same as the arrangement of the viewpoint images recognized by the right eye.

For example, as illustrated in FIG. 21, the controller 170 may control the display 180 so that the subpixels constituting the 19th and 22nd viewpoint images are switched to the subpixels 1310 constituting the 19th viewpoint image, the subpixels constituting the 20th and 23rd viewpoint images are switched to the subpixels 1320 constituting the 20th viewpoint image, and the subpixels constituting the 21st and 24th viewpoint images are switched to the subpixels 1330 constituting the 21st viewpoint image.

When the display 180 displays the images as described above, as shown in FIG. 22, the arrangement of the viewpoint images recognized by the left eye of the first viewer (a) is the same as the arrangement of the viewpoint images recognized by the right eye, so the two eyes recognize the same image and the 2D image can be viewed.
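The arrangement-matching alternative of FIGS. 21 and 22 can be sketched in the same style. Note that the offset of three viewpoints (19 with 22, 20 with 23, 21 with 24) is taken from the figure's specific example and is an assumption here, not a general rule of the invention.

```python
def remap_matching_arrangement(subpixel_views):
    """Fold viewpoints 22-24 onto 19-21 (the FIG. 21 example).

    After the fold, the sequence of viewpoints reaching the right eye
    repeats the sequence reaching the left eye, so both eyes see the
    same arrangement of images and the first viewer perceives 2D.
    """
    return [v - 3 if 22 <= v <= 24 else v for v in subpixel_views]

print(remap_matching_arrangement([19, 20, 21, 22, 23, 24]))
# -> [19, 20, 21, 19, 20, 21]
```

Unlike the single-viewpoint approach, this variant keeps three distinct viewpoint images on screen, which is why the left-eye and right-eye arrangements, rather than a single image, must match.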

Accordingly, the left eye and the right eye of the first viewer (a) recognize the same image, so that the 2D image can be viewed, while the left eye and the right eye of the second viewer (b) recognize different viewpoint images, so that the 3D image can be viewed.

FIG. 23 is a flowchart illustrating a method of operating the image display apparatus according to an exemplary embodiment, and FIGS. 24 to 28 are views referred to for describing the method of operating the image display apparatus of FIG. 23.

Referring to FIG. 24, when the positions of the first viewer (a), selected in the 2D image display mode, and the second viewer (b), selected in the 3D image display mode, overlap at least partially, there is a problem that the 2D image and the 3D image cannot be displayed to the first viewer (a) and the second viewer (b), respectively.

For example, if the 19th viewpoint image is made to be recognized by the left and right eyes of the first viewer in order to display the 2D image to the first viewer (a), the second viewer (b) recognizes only the 19th viewpoint image in the left eye while sequentially recognizing the 15th to 17th viewpoint images in the right eye; the 3D image thus appears overlapped or causes dizziness, so that the 3D image cannot be stably viewed.

Therefore, when the positions of the first viewer (a), selected in the 2D image display mode, and the second viewer (b), selected in the 3D image display mode, at least partially overlap, in order to address the problem that a stable image cannot be provided to the first viewer (a) or the second viewer (b), an embodiment of the present invention provides an image display device that either causes both the first viewer and the second viewer to watch one of the 2D image and the 3D image, or displays a notification message to inform the viewers. Hereinafter, the image display device will be described in detail.

Referring to FIG. 23, when the mixed image display mode is entered (S1410), the photographing unit 155 detects the positions of the first viewer and the second viewer. The photographing unit 155 acquires the location information of the first viewer and the second viewer through tracking (S1420) and transmits the location information to the controller 170, and the controller 170 receives a selection input for the image display modes of the first viewer and the second viewer (S1430).

The controller 170 determines whether the image display modes of the first viewer and the second viewer differ from each other (S1440). When the image display modes of the first viewer and the second viewer are the same, the controller 170 displays the 2D image or the 3D image using the location information of the first viewer and the second viewer, so that both viewers can watch the image in the selected image display mode (S1450).

When the image display modes of the first viewer and the second viewer are different, the controller 170 determines whether the positions of the first viewer and the second viewer at least partially overlap, using the location information of the first viewer and the second viewer (S1460).

If the positions of the first viewer and the second viewer do not overlap, the controller 170 displays the 2D image or the 3D image using the location information of the first viewer and the second viewer, so that each viewer can watch the image in the image display mode that the viewer selected (S1450).

When the positions of the first viewer and the second viewer overlap, the image display modes of the first viewer and the second viewer are both switched to either the 2D image display mode or the 3D image display mode, or a notification message is displayed (S1470). In this case, the notification message may be displayed on the display 180 or may be output as a notification sound, but is not limited thereto.
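The decision flow of steps S1440 to S1470 can be summarized in a short sketch. The function and policy names are hypothetical, and the overlap test is reduced to a boolean for brevity.

```python
def resolve_display_modes(mode_a, mode_b, positions_overlap,
                          policy="switch_to_2d"):
    """Return (mode for viewer a, mode for viewer b, notification).

    mode_a, mode_b: "2D" or "3D", the modes each viewer selected.
    positions_overlap: True if the viewers' positions partially overlap.
    """
    # S1440/S1450: same mode, or S1460: no overlap -> display as selected.
    if mode_a == mode_b or not positions_overlap:
        return mode_a, mode_b, None
    # S1470: overlapping positions with differing modes.
    if policy == "switch_to_2d":      # FIG. 25: both viewers get 2D
        return "2D", "2D", None
    if policy == "switch_to_3d":      # FIG. 26: both viewers get 3D
        return "3D", "3D", None
    # Otherwise, keep the modes and ask a viewer to move (FIGS. 27-28).
    return mode_a, mode_b, "Please move to avoid overlapping positions."

print(resolve_display_modes("2D", "3D", True))
# -> ('2D', '2D', None)
```

The three branches correspond one-to-one to the alternatives of FIGS. 25, 26, and 27-28 described below.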

Referring to FIGS. 25 and 26, when the positions of the first viewer and the second viewer at least partially overlap, the image display modes of the first viewer (a) and the second viewer (b) may both be switched to either the 2D image display mode or the 3D image display mode.

As illustrated in FIG. 25, when the positions of the first viewer (a) and the second viewer (b) partially overlap, the controller 170 switches the image display mode of the second viewer (b), who was watching the 3D image, to the 2D image display mode, and displays the 2D image to both the first viewer (a) and the second viewer (b).

For example, as described with reference to FIG. 19, the subpixels constituting the 15th to 24th viewpoint images may be converted into the subpixels constituting the 19th viewpoint image and displayed, so that the 19th viewpoint image is recognized at every position where the 15th to 24th viewpoint images would otherwise be recognized.

With the display as described above, only the 19th viewpoint image, a single viewpoint image, is recognized in the left and right eyes of the first viewer (a) and the second viewer (b), and both viewers watch the 2D image. However, the single viewpoint image is not limited to the 19th viewpoint image; the subpixels constituting the 15th to 24th viewpoint images may be converted into the subpixels constituting any one of the 15th to 24th viewpoint images.

Alternatively, as shown in FIG. 26, the image display mode of the first viewer (a), who was watching in the 2D image display mode, may be switched to the 3D image display mode, and a plurality of viewpoint images, for example the 15th to 24th viewpoint images, may be displayed so as to be recognized by the left and right eyes of the first viewer (a) and the second viewer (b), so that both viewers view the 3D image.

FIGS. 27 and 28 are diagrams illustrating screens on which a notification message is displayed, according to an exemplary embodiment.

Referring to FIG. 27, the notification message 1610 may be a message requesting the first viewer or the second viewer to move. The message may also be displayed to only one of the first viewer and the second viewer. For example, as shown in FIGS. 25 and 26, when the notification message 1610 is displayed in the images recognized in the 15th to 18th viewpoint regions, the first viewer (a) cannot recognize the notification message because the left and right eyes of the first viewer are not located in the 15th to 18th viewpoint regions, whereas the left and right eyes of the second viewer are located in the 15th to 18th viewpoint regions, so the notification message 1610 can be displayed to the second viewer.
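This region-targeted display reduces to a simple membership test: a viewer sees the message only if at least one eye falls in a viewpoint region into which the message is rendered. The function name and the representation of eye positions as viewpoint-region indices are assumptions for illustration; the region numbers follow the example above.

```python
def message_visible(eye_view_regions, message_view_regions):
    """True if either eye of a viewer lies in a region showing the message."""
    return bool(set(eye_view_regions) & set(message_view_regions))

message_regions = range(15, 19)          # message rendered into views 15-18
first_viewer_eyes = [19, 20, 21, 22]     # outside the message regions
second_viewer_eyes = [15, 16, 17, 18]    # inside the message regions

print(message_visible(first_viewer_eyes, message_regions))   # -> False
print(message_visible(second_viewer_eyes, message_regions))  # -> True
```

Rendering the message only into the second viewer's viewpoint regions thus lets the same screen carry a per-viewer prompt without disturbing the other viewer's image.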

In addition, the notification message 1610 may include at least one of a moving direction and a moving distance for the first viewer or the second viewer. The moving distance may be a minimum moving distance or a range of moving distances.

Referring to FIG. 28, the controller 170 photographs the first viewer 1601 and the second viewer 1602 using the photographing unit 155, and displays the notification message on the photographed image 1620 shown on the display 180. In addition, one of the first viewer 1601 and the second viewer 1602 may be selected in the displayed image 1620, and a message requesting that viewer to move may be displayed.

In this case, when either the first viewer 1601 or the second viewer 1602 moves and the positions of the first viewer and the second viewer change, the controller 170 may recognize the movement and control the display of the message requesting the moved viewer to move.

The image display apparatus and the operation method thereof according to the present invention are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operation method of the image display apparatus of the present invention can be implemented as processor-readable code on a processor-readable recording medium included in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices in which data readable by the processor is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that the processor-readable code can be stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments; on the contrary, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

Claims (21)

A display on which a plurality of viewpoint images are displayed;
A lens unit disposed on a front surface of the display, the lens unit separating the plurality of viewpoint images according to directions; And
In a mixed image display mode for displaying a 2D image and a 3D image together, a controller configured to control the display to display the 2D image based on position information of a first viewer when the first viewer is set to the 2D image display mode,
And to display the 3D image based on position information of a second viewer when the second viewer is set to the 3D image display mode.
The method of claim 1,
The control unit,
Controls to display the 2D image on a first area of the display so as to be recognizable to the first viewer but not to the second viewer,
And controls to display the 3D image on a second area of the display so as to be recognizable to the second viewer but not to the first viewer.
The method of claim 1,
And the control unit controls to enter the mixed image display mode when there is a selection input for the object indicating the mixed image display mode.
The method of claim 1,
Further comprising a spatial remote controller configured to output a pointing signal corresponding to its movement,
And the controller is configured to set the 2D image display mode for the first viewer and the 3D image display mode for the second viewer based on the pointing signal.
The method of claim 1,
The control unit,
And controls to display a first object indicating that the 2D image display mode is selected for the first viewer and a second object indicating that the 3D image display mode is selected for the second viewer.
The method of claim 5,
The control unit,
And controlling the first object to be displayed together with the image representing the first viewer and the second object to be displayed together with the image representing the second viewer.
The method of claim 1,
And a capturing unit configured to perform tracking on the first viewer and the second viewer, and obtain location information of the first viewer and the second viewer.
The method of claim 1,
The control unit,
Controls the plurality of viewpoint images recognized by the left eye and the right eye of the first viewer to be displayed as the same viewpoint image, or controls the viewpoint images recognized by the left eye of the first viewer and the viewpoint images recognized by the right eye to have the same arrangement.
The method of claim 1,
The control unit,
And displaying different viewpoint images to the left and right eyes of the second viewer.
The method of claim 1,
The control unit,
And, when the positions of the first viewer and the second viewer overlap at least partially, switches the image display modes of the first viewer and the second viewer to the same mode, either the 2D image display mode or the 3D image display mode.
The method of claim 1,
The control unit,
And displaying a notification message requesting movement if the positions of the first viewer and the second viewer overlap at least partially.
The method of claim 1,
The lens unit includes a lenticular lens,
And the lenticular lens is inclined at a predetermined angle with respect to the display.
In the operation method of an image display device for displaying a multi-view image,
Entering a mixed image display mode for displaying a 2D image and a 3D image together;
Receiving a selection input for an image display mode of a first viewer and a second viewer;
Displaying the 2D image to the first viewer by using the location information of the first viewer when the first viewer is selected as the 2D image display mode; And
And displaying the 3D image to the second viewer by using the location information of the second viewer when the second viewer is selected as the 3D image display mode.
The method of claim 13,
And displaying an object indicating the mixed image display mode, and when there is a selection input for the object, entering the mixed image display mode.
The method of claim 13,
Displaying a first object indicating that the first viewer has selected the 2D image display mode; And
And displaying a second object indicating that the second viewer has selected the 3D image display mode.
The method of claim 15,
And the first object is displayed together with the image representing the first viewer, and the second object is displayed together with the image representing the second viewer.
The method of claim 13,
And receiving location information of the first viewer and the second viewer through tracking the first viewer and the second viewer.
The method of claim 13,
The displaying of the 2D image to the first viewer comprises displaying a plurality of viewpoint images recognized by the left eye and the right eye of the first viewer as the same viewpoint image, or displaying the viewpoint images such that the viewpoint images recognized by the left eye of the first viewer and those recognized by the right eye have the same arrangement.
The method of claim 13,
The displaying of the 3D image to the second viewer may include displaying different view images of the left eye and the right eye of the second viewer.
The method of claim 13,
And, when the positions of the first viewer and the second viewer overlap at least partially, switching the image display modes of the first viewer and the second viewer to the same mode, either the 2D image display mode or the 3D image display mode.
The method of claim 13,
And displaying a notification message for requesting movement when the positions of the first viewer and the second viewer overlap at least partially.
KR1020120043363A 2012-04-25 2012-04-25 Image display apparatus, and method for operating the same KR20130120255A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120043363A KR20130120255A (en) 2012-04-25 2012-04-25 Image display apparatus, and method for operating the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120043363A KR20130120255A (en) 2012-04-25 2012-04-25 Image display apparatus, and method for operating the same

Publications (1)

Publication Number Publication Date
KR20130120255A true KR20130120255A (en) 2013-11-04

Family

ID=49850995

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120043363A KR20130120255A (en) 2012-04-25 2012-04-25 Image display apparatus, and method for operating the same

Country Status (1)

Country Link
KR (1) KR20130120255A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102101771B1 (en) * 2018-12-12 2020-04-17 서울과학기술대학교 산학협력단 Device and method for providing contents


Similar Documents

Publication Publication Date Title
KR101924058B1 (en) Image display apparatus, and method for operating the same
US20140143733A1 (en) Image display apparatus and method for operating the same
KR20130007824A (en) Image display apparatus, and method for operating the same
US20150109426A1 (en) Glassless stereoscopic image display apparatus and method for operating the same
US20140132726A1 (en) Image display apparatus and method for operating the same
KR101855939B1 (en) Method for operating an Image display apparatus
EP2672716A2 (en) Image display apparatus and method for operating the same
KR101832225B1 (en) Image display apparatus, and method for operating the same
EP2566165A2 (en) Image display apparatus and method for operating the same
KR101912635B1 (en) Image display apparatus, and method for operating the same
KR20130120255A (en) Image display apparatus, and method for operating the same
KR101836846B1 (en) Image display apparatus, and method for operating the same
KR20140098512A (en) Image display apparatus, and method for operating the same
KR20150043875A (en) Stereoscopic image display apparatus in glassless mode and method for operating the same
KR101890323B1 (en) Image display apparatus, settop box and method for operating the same
KR101825669B1 (en) Image display apparatus, and method for operating the same
KR20140073231A (en) Image display apparatus, and method for operating the same
KR20140063276A (en) Image display apparatus and method for operating the same
KR20140063275A (en) Image display apparatus and method for operating the same
KR101878808B1 (en) Image display apparatus and method for operating the same
KR101945811B1 (en) Image display apparatus, and method for operating the same
KR20140107923A (en) Image display apparatus
KR101882214B1 (en) Image display apparatus, server and method for operating the same
KR20140008188A (en) Image display apparatus, and method for operating the same
KR20140064115A (en) Image display apparatus and method for operating the same

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination