KR20150001421A - Multiview image generation method and stereoscopic image display device - Google Patents


Info

Publication number: KR20150001421A
Authority: KR (South Korea)
Prior art keywords: image data, value, eye image, data, calculating
Application number: KR1020130074633A
Other languages: Korean (ko)
Other versions: KR102045563B1 (en)
Inventors: 이승용, 허천
Original Assignee: 엘지디스플레이 주식회사 (LG Display Co., Ltd.)
Priority date: The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.
Application filed by 엘지디스플레이 주식회사; priority to KR1020130074633A
Publication of KR20150001421A; application granted; publication of KR102045563B1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/128 Adjusting depth or disparity
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/282 Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays

Abstract

The present invention relates to a multi-view image generation method and a stereoscopic image display apparatus using the same. A multi-view image generation method according to an exemplary embodiment of the present invention includes: a first step of calculating a gain value by analyzing the disparities, left eye image data, and right eye image data of an (N-1)th frame (N is a natural number of 2 or more); a second step of calculating disparities using the gain value and the left eye image data and right eye image data of an Nth frame; and a third step of generating multi-view image data by shifting the left eye image data or the right eye image data of the Nth frame according to the disparities.

Description

TECHNICAL FIELD [0001]

The present invention relates to a multi-view image generation method and a stereoscopic image display apparatus using the same.

Stereoscopic displays are divided into glasses-type (stereoscopic) and glasses-free (autostereoscopic) techniques. Both use the binocular parallax between left and right images, which produces a strong stereoscopic effect, and both are in practical use. Among the glasses-type systems, the pattern retarder system displays the left and right parallax images with different polarization directions on a direct-view display device or projector and realizes a stereoscopic image using polarizing glasses. The shutter glasses system displays the left and right parallax images in a time-division manner on a direct-view display device or projector and realizes a stereoscopic image using liquid crystal shutter glasses. The glasses-free system generally separates the optical axes of the left and right parallax images with an optical plate such as a parallax barrier or a lenticular lens to realize a stereoscopic image.

Because users can view a stereoscopic image without wearing shutter glasses or polarizing glasses, the glasses-free system has recently attracted great attention for small and medium-sized displays such as smartphones, tablets, and notebooks. In the glasses-free mode, a stereoscopic image is realized by displaying a multi-view image comprising k view images (k is a natural number of 3 or more) on k view areas using an optical plate, which reduces 3D crosstalk. 3D crosstalk refers to a plurality of view images reaching the user's eye overlapped, and it degrades the quality of the stereoscopic image.

A multi-view image can be generated by placing k cameras spaced apart by the interocular distance and photographing an object. However, since multi-view video is difficult and expensive to produce as video content, video content produced as multi-view video is in short supply. Therefore, a method of generating a multi-view image from a 3D image comprising a left eye image and a right eye image (or two view images) is widely used.

A method of generating a multi-view image from a 3D image calculates disparities by analyzing the left eye image and the right eye image. A disparity is the value by which the left eye image or the right eye image is shifted to produce a three-dimensional effect.

FIG. 1A shows a disparity image calculated from a left eye image and a right eye image that is brighter than the left eye image, and FIG. 1B shows a disparity image calculated from a left eye image and a right eye image having substantially the same brightness as the left eye image. The disparity images DI shown in FIGS. 1A and 1B are images obtained by normalizing the disparities to 256 gray-scale values.

The conventional disparity calculation method does not account for brightness or color differences between the left eye image LI and the right eye image RI; therefore, when such a difference exists, the disparities are calculated incorrectly, as in FIG. 1A. A brightness or color difference between the left eye image LI and the right eye image RI may occur when the color coordinates change due to the characteristics of the CMOS sensors of the photographing cameras, or due to a difference in exposure conditions during photographing.

The present invention provides a disparity calculation method and a stereoscopic image display apparatus capable of accurately calculating disparities even when there is a difference in brightness or color between a left eye image and a right eye image.

A multi-view image generation method according to an exemplary embodiment of the present invention includes: a first step of calculating a gain value by analyzing the disparities, left eye image data, and right eye image data of an (N-1)th frame (N is a natural number of 2 or more); a second step of calculating disparities using the gain value and the left eye image data and right eye image data of an Nth frame; and a third step of generating multi-view image data by shifting the left eye image data or the right eye image data of the Nth frame according to the disparities.

A stereoscopic image display device according to an embodiment of the present invention includes: a display panel including data lines and gate lines; an image processor including a disparity calculator for calculating disparities from 3D image data comprising left eye image data and right eye image data, and a multi-view image generator for generating multi-view image data by shifting the left eye image data or the right eye image data according to the disparities; a data driving circuit for converting the multi-view image data into data voltages and supplying the data voltages to the data lines; and a gate driving circuit for sequentially supplying gate pulses to the gate lines. The disparity calculator includes a gain value calculator for calculating a gain value by analyzing the disparities, left eye image data, and right eye image data of an (N-1)th frame (N is a natural number of 2 or more), and a disparity calculating unit for calculating disparities using the gain value and the left eye image data and right eye image data of an Nth frame.

The present invention calculates a larger gain value as the brightness or color difference between the left eye image and the right eye image increases, and calculates disparities by reflecting the gain value. As a result, the present invention can accurately calculate disparities even when the brightness or color difference between the left eye image and the right eye image is large.

FIG. 1A is an exemplary view showing a disparity image calculated from a left eye image and a right eye image that is brighter than the left eye image.
FIG. 1B is an exemplary view showing a disparity image calculated from a left eye image and a right eye image having substantially the same brightness as the left eye image.
FIG. 2 is a block diagram schematically showing a stereoscopic image display apparatus according to an embodiment of the present invention.
FIG. 3 is an exemplary view illustrating a stereoscopic image realizing method of a stereoscopic image display apparatus according to an embodiment of the present invention.
FIG. 4 is a detailed block diagram of the image processing unit of FIG. 2.
FIG. 5 is a flowchart illustrating a multi-view image generation method according to an exemplary embodiment of the present invention.
FIG. 6 is an exemplary view showing left eye image data, right eye image data, and view image data.
FIG. 7 is a block diagram showing the disparity calculating unit of FIG. 4 in detail.
FIG. 8 is a flowchart showing a disparity calculating method of the disparity calculating unit in detail.
FIG. 9 is a flowchart showing a gain value calculating method of the gain value calculating unit in detail.
FIG. 10 is a block diagram showing the initial disparity calculating unit in detail.
FIG. 11 is a flowchart showing an initial disparity calculating method of the initial disparity calculating unit in detail.
FIG. 12 is an exemplary view showing the census transformation of the first census window and the census transformation of the second census window.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. Like reference numerals throughout the specification denote substantially identical components. In the following description, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear. The component name used in the following description may be selected in consideration of easiness of specification, and may be different from the actual product name.

FIG. 2 is a block diagram schematically showing a stereoscopic image display apparatus according to an embodiment of the present invention. Referring to FIG. 2, a stereoscopic image display apparatus according to an exemplary embodiment of the present invention includes a display panel 10, an optical plate 30, a gate driving circuit 110, a data driving circuit 120, a timing controller 130, an image processing unit 140, a host system 150, and the like. The display panel 10 of the stereoscopic image display apparatus according to the embodiment of the present invention may be implemented as a flat panel display such as a liquid crystal display (LCD), a field emission display (FED), a plasma display panel (PDP), or an organic light emitting diode (OLED) display. Although the present invention is described in the following embodiments with reference to the case where the display panel 10 is implemented as a liquid crystal display device, it should be noted that the present invention is not limited thereto.

The display panel 10 includes an upper substrate and a lower substrate facing each other with a liquid crystal layer interposed therebetween. On the display panel 10, a pixel array including pixels arranged in a matrix is formed by the crossing structure of the data lines D and the gate lines G (or scan lines). Each pixel of the pixel array displays an image by driving the liquid crystal of the liquid crystal layer with the voltage difference between a pixel electrode, charged with a data voltage through a TFT (Thin Film Transistor), and a common electrode to which a common voltage is applied. A black matrix and color filters are formed on the upper substrate of the display panel 10. The common electrode is formed on the upper substrate in vertical electric field driving modes such as the TN (Twisted Nematic) mode and the VA (Vertical Alignment) mode, and is formed on the lower substrate together with the pixel electrode in horizontal electric field driving modes such as the IPS (In-Plane Switching) mode and the FFS (Fringe Field Switching) mode. The liquid crystal mode of the display panel 10 may be implemented as any liquid crystal mode, including but not limited to the TN, VA, IPS, and FFS modes. Polarizing plates are attached to the upper and lower substrates of the liquid crystal display panel, and alignment films for setting the pre-tilt angle of the liquid crystal are formed thereon. A spacer is formed between the upper substrate and the lower substrate of the display panel 10 to maintain the cell gap of the liquid crystal layer.

The display panel 10 can be implemented in any form such as a transmissive liquid crystal display panel, a transflective liquid crystal display panel, and a reflective liquid crystal display panel. In a transmissive liquid crystal display panel and a transflective liquid crystal display panel, a backlight unit is required. The backlight unit may be implemented as a direct type backlight unit or an edge type backlight unit.

The multi-view image includes first to kth view images (k is a natural number of 3 or more). The multi-view image can be generated by placing k cameras spaced apart by the interocular distance and photographing an object. The optical plate 30 is disposed on the display panel 10 and directs the first to kth view images displayed on the pixels of the display panel 10 to first to kth view areas. The first to kth view images correspond one-to-one to the first to kth view areas. That is, the optical plate 30 directs the tth view image (t is a natural number satisfying 1 ≤ t ≤ k) displayed on the pixels to the tth view area. The optical plate 30 of the stereoscopic image display apparatus according to the exemplary embodiment of the present invention may be implemented in various forms, such as a parallax barrier, a switchable barrier, a lenticular lens, or a switchable lens. When the optical plate 30 is implemented as a switchable barrier or a switchable lens, an optical plate driving circuit for driving the optical plate 30 is required. The optical plate driving circuit can turn the optical separation of the switchable barrier or switchable lens on and off by supplying a driving voltage to the optical plate 30. Hereinafter, a method of implementing a stereoscopic image using the optical plate 30 will be described in detail with reference to FIG. 3.

FIG. 3 is a view illustrating an example of a stereoscopic image realizing method of a stereoscopic image display apparatus according to an embodiment of the present invention. In FIG. 3, the display panel 10 displays four view images V1, V2, V3, and V4, and the optical plate 30 directs the four view images V1, V2, V3, and V4 to four view areas VP1, VP2, VP3, and VP4. It should be noted that the optical plate 30 according to the embodiment of the present invention may be implemented in any form, such as a parallax barrier, a switchable barrier, or a switchable lens.

Referring to FIG. 3, the optical plate 30 directs the first view image V1 displayed on the pixels to the first view area VP1, the second view image V2 displayed on the pixels to the second view area VP2, the third view image V3 displayed on the pixels to the third view area VP3, and the fourth view image V4 displayed on the pixels to the fourth view area VP4. When the user's left eye is located in the tth view area VPt and the right eye is located in the (t-1)th view area VPt-1, the user views the tth view image Vt with the left eye and the (t-1)th view image Vt-1 with the right eye. Therefore, the user can perceive a three-dimensional effect through binocular parallax. For example, when the user's left eye is located in the second view area VP2 and the right eye is located in the first view area VP1 as shown in FIG. 3, the user views the second view image V2 with the left eye and the first view image V1 with the right eye, and thus perceives a three-dimensional effect through binocular parallax.

The data driving circuit 120 includes a plurality of source drive integrated circuits (ICs). The source drive ICs convert the 2D image data RGB2D or the multi-view image data MVD into positive/negative analog data voltages using the positive/negative gamma compensation voltages under the control of the timing controller 130. The positive/negative analog data voltages output from the source drive ICs are supplied to the data lines D of the display panel 10.

The gate driving circuit 110 sequentially supplies gate pulses (or scan pulses) to the gate lines G of the display panel 10 under the control of the timing controller 130. The gate driving circuit 110 may be composed of a plurality of gate drive integrated circuits, each including a shift register and a level shifter for converting the output signal of the shift register into a swing width suitable for driving the TFTs of the liquid crystal cells.

The timing controller 130 receives 2D image data RGB2D or multi-view image data MVD, timing signals, a mode signal MODE, and the like from the image processing unit 140. The timing controller 130 receives the 2D image data RGB2D in the 2D mode and the multi-view image data MVD in the 3D mode. The mode signal MODE indicates the 2D mode or the 3D mode. The timing signals may include a vertical synchronization signal, a horizontal synchronization signal, a data enable signal, and a clock signal.

The timing controller 130 generates a gate control signal GCS for controlling the gate driving circuit 110 and a data control signal DCS for controlling the data driving circuit 120 based on the timing signals. The timing controller 130 supplies the gate control signal GCS to the gate driving circuit 110. The timing controller 130 supplies the 2D image data RGB2D and the data control signal DCS to the data driving circuit 120 in the 2D mode, and supplies the multi-view image data MVD and the data control signal DCS to the data driving circuit 120 in the 3D mode.

The host system 150 may include a System on Chip with an embedded scaler for converting the 2D image data RGB2D or 3D image data RGB3D input from an external video source device into a data format suitable for display on the display panel 10. The host system 150 transmits the 2D image data RGB2D or the 3D image data RGB3D and the timing signals to the image processing unit 140 through an interface such as a Low Voltage Differential Signaling (LVDS) interface or a Transition Minimized Differential Signaling (TMDS) interface. In addition, the host system 150 supplies the mode signal MODE indicating the 2D mode or the 3D mode to the image processing unit 140.

The image processing unit 140 outputs the 2D image data RGB2D to the timing controller 130 without conversion in the 2D mode. In the 3D mode, the image processing unit 140 generates multi-view image data MVD from the 3D image data RGB3D and outputs the multi-view image data MVD to the timing controller 130. The 3D image data RGB3D includes one monocular image data and another monocular image data (or two view image data). Hereinafter, for convenience of explanation, the one monocular image data is assumed to be the left eye image data and the other monocular image data the right eye image data.

As a result, even when 3D image data RGB3D is input, the stereoscopic image display apparatus according to the embodiment of the present invention can generate multi-view image data MVD using the image processing unit 140 and display a multi-view image. Consequently, the stereoscopic image display apparatus according to the embodiment of the present invention can enhance the quality of the stereoscopic image. Hereinafter, the multi-view image data MVD generation method of the image processing unit 140 will be described in detail with reference to FIGS. 4 and 5.

FIG. 4 is a detailed block diagram of the image processing unit of FIG. 2, and FIG. 5 is a flowchart illustrating a multi-view image generation method according to an exemplary embodiment of the present invention. Referring to FIG. 4, the image processing unit 140 includes a disparity calculating unit 200 and a multi-view image generating unit 300. Hereinafter, the multi-view image generation method of the image processing unit 140 will be described in detail with reference to FIGS. 4 and 5.

First, the disparity calculating unit 200 calculates disparities DIS using the left eye image data RGBL and right eye image data RGBR of the 3D image data RGB3D and outputs the disparities DIS. A disparity is the value by which the left eye image data RGBL or the right eye image data RGBR is shifted to produce a three-dimensional effect. The disparity calculating method of the disparity calculating unit 200 will be described in detail later with reference to FIGS. 7 and 8. (S101)

The multi-view image generating unit 300 generates multi-view image data MVD by shifting the left eye image data RGBL or the right eye image data RGBR according to the disparities DIS calculated by the disparity calculating unit 200. As shown in FIG. 6, the multi-view image generating unit 300 sets the left eye image data RGBL as the first view image data VD1 and the right eye image data RGBR as the kth view image data VDk, and generates the second to (k-1)th view image data VD2 to VDk-1 by shifting the left eye image data RGBL or the right eye image data RGBR using the disparities DIS. In this way, multi-view image data MVD including k view image data can be generated. For example, the tth view image data VDt can be generated by shifting the left eye image data RGBL in the first horizontal direction by the disparities DIS multiplied by (t-1)/(k-1).

The multi-view image generation method of the multi-view image generating unit 300 may be any known method. The multi-view image generating unit 300 may perform post-processing such as hole filling on the multi-view image data MVD. The multi-view image generating unit 300 may arrange the multi-view image data MVD using a 3D formatter according to the 3D display arrangement of the display panel 10 and output the multi-view image data MVD to the timing controller 130. (S102)
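The view synthesis of step S102 can be sketched in Python as follows. This is an illustrative sketch only: the function name, the single-channel simplification, the shift direction, and the choice to mark holes with -1 (to be filled later by hole-filling post-processing) are assumptions, not taken from the patent.

```python
import numpy as np

def synthesize_view(left, disparity, t, k):
    """Forward-shift left-eye pixels by a fraction of the disparity.

    left      : H x W array of pixel values (one channel for simplicity)
    disparity : H x W integer disparities, in pixels
    t, k      : generate view t of k (t=1 reproduces the left image,
                t=k applies the full disparity)

    Pixels that no source pixel maps to are left at -1 (holes).
    """
    h, w = left.shape
    view = np.full((h, w), -1, dtype=left.dtype)
    frac = (t - 1) / (k - 1)  # shift fraction: 0 for view 1, 1 for view k
    for y in range(h):
        for x in range(w):
            x_new = x - int(round(disparity[y, x] * frac))
            if 0 <= x_new < w:
                view[y, x_new] = left[y, x]
    return view
```

With a constant disparity of 1 pixel, view 1 is identical to the left image, while view k shifts every pixel by one column and leaves a hole at the border.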

FIG. 7 is a block diagram showing the disparity calculating unit of FIG. 4 in detail, and FIG. 8 is a flowchart showing the disparity calculating method of the disparity calculating unit in detail. Referring to FIG. 7, the disparity calculating unit 200 includes a gain value calculating unit 210, an initial disparity calculating unit 220, and a post-processing unit 230. Hereinafter, the disparity calculating method of the disparity calculating unit according to the embodiment of the present invention will be described in detail with reference to FIGS. 7 and 8.

First, the gain value calculating unit 210 calculates the gain value G of the Nth frame by analyzing the disparities, left eye image data, and right eye image data of the Nth frame. The gain value calculating unit 210 may include a memory for storing the gain value G; the gain value G calculated in the (N-1)th frame is stored in the memory. During the Nth frame, the gain value calculating unit 210 outputs the gain value calculated in the (N-1)th frame to the initial disparity calculating unit 220, and stores the gain value calculated in the Nth frame in the memory. The gain value calculating unit 210 will be described in detail later with reference to FIG. 9. (S201)
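The one-frame buffering just described, where the gain computed from frame N-1 is what the disparity search of frame N consumes, can be sketched as a small state holder. The class and method names are illustrative, not from the patent.

```python
class GainDelay:
    """One-frame delay buffer for the gain value: per frame, output the
    previously stored gain and store the gain computed this frame."""

    def __init__(self, initial_gain=0.0):
        self._stored = initial_gain  # gain carried over from the previous frame

    def step(self, gain_of_current_frame):
        # Output the (N-1)th frame's gain, then store the Nth frame's gain.
        out = self._stored
        self._stored = gain_of_current_frame
        return out
```

Calling `step` once per frame yields, at frame N, the gain that was computed at frame N-1, which is exactly the memory behaviour described above.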

Second, the initial disparity calculating unit 220 receives the gain value G from the gain value calculating unit 210 and receives the 3D image data RGB3D, including the left eye image data RGBL and right eye image data RGBR of the Nth frame, from the host system 150. The initial disparity calculating unit 220 calculates initial disparities IDIS using the gain value and the left eye image data RGBL and right eye image data RGBR of the Nth frame. The initial disparity IDIS calculation method of the initial disparity calculating unit 220 will be described in detail later with reference to FIGS. 11 and 12. (S202)

Third, the post-processing unit 230 calculates disparities DIS by post-processing the initial disparities IDIS. The post-processing unit 230 may post-process the initial disparities IDIS using any one of various filters, such as a median filter, a weighted median filter, and a weighted mode filter. The median filter converts the data at the center coordinates of a mask into the median value of the data in the mask. The weighted median filter sorts the data in a mask after applying the weights of a weight mask, selects the median value, and converts the data at the center coordinates of the mask into that median value. The weighted mode filter generates a histogram by applying the weights of a weight mask to the data in a mask, selects the mode, and converts the data at the center coordinates of the mask into that mode. The post-processing unit 230 outputs the post-processed disparities DIS to the multi-view image generating unit 300. (S203)
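The simplest of these filters, the (unweighted) median filter, can be sketched as follows. The mask size and the border handling (shrinking the mask at the edges) are illustrative design choices, not specified in the text.

```python
import numpy as np

def median_filter_disparity(disp, radius=1):
    """Replace each disparity with the median of the values inside a
    (2*radius+1) x (2*radius+1) mask centered on it. Border pixels use
    whatever part of the mask fits inside the image."""
    h, w = disp.shape
    out = np.empty_like(disp)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = np.median(disp[y0:y1, x0:x1])
    return out
```

As a usage example, a single spurious spike in an otherwise flat disparity map is removed, which is why a median filter is a common choice for suppressing isolated mismatches in the initial disparities.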

FIG. 9 is a flowchart showing the gain value calculating method of the gain value calculating unit in detail. Referring to FIG. 9, first, the gain value calculating unit 210 receives the left eye disparities and right eye disparities of the Nth frame. The positions of the left eye disparities and right eye disparities can be expressed as coordinates. The gain value calculating unit 210 calculates the coordinates corresponding to the occlusion region by analyzing the left eye disparities and right eye disparities of the Nth frame. The occlusion region refers to data having no information in the view image data when the view image data is generated by shifting the left eye image data RGBL or the right eye image data RGBR using the disparities; thus, the occlusion region includes the coordinates of data having no information in the view image data. A left eye disparity is the value by which the left eye image data is shifted to produce a three-dimensional effect, and indicates the difference, calculated with the left eye image data as the reference, between the position (coordinates) of the left eye image data and the position (coordinates) of the right eye image data. A right eye disparity is the value by which the right eye image data is shifted to produce a three-dimensional effect, and indicates the difference, calculated with the right eye image data as the reference, between the position (coordinates) of the left eye image data and the position (coordinates) of the right eye image data.

Specifically, as shown in Equation 1, the gain value calculating unit 210 calculates the absolute value of the difference between the left eye disparity at the (x, y) coordinates and the right eye disparity at the (x-Dl(x, y), y) coordinates, and determines the (x, y) coordinates to be part of the occlusion region when the absolute value of the difference is greater than a first threshold value. Here, Dl(x, y) is the left eye disparity at the (x, y) coordinates.

[Equation 1]

|Dl(x, y) - Dr(x - Dl(x, y), y)| > TH1

In Equation 1, Dl(x, y) is the left eye disparity at the (x, y) coordinates, Dr(x, y) is the right eye disparity at the (x, y) coordinates, and TH1 is the first threshold value. (S301)
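The Equation-1 consistency check can be sketched as follows. The array layout, the treatment of coordinates whose match falls outside the image (flagged as occluded here), and integer disparities are assumptions made for illustration.

```python
import numpy as np

def occlusion_mask(disp_left, disp_right, th1=1):
    """Flag (x, y) as occluded when the left disparity and the right
    disparity sampled at (x - Dl(x, y), y) disagree by more than TH1."""
    h, w = disp_left.shape
    occluded = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            xr = x - disp_left[y, x]  # matching column in the right-eye data
            if 0 <= xr < w:
                occluded[y, x] = abs(disp_left[y, x] - disp_right[y, xr]) > th1
            else:
                occluded[y, x] = True  # no valid correspondence in the image
    return occluded
```

Consistent left and right disparity maps produce an empty mask (apart from the border columns with no correspondence), while a large left/right mismatch marks the pixel as occluded.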

Second, the gain value calculating unit 210 receives the left eye image data of the Nth frame. The positions of the left eye image data can be expressed as coordinates. The gain value calculating unit 210 calculates the average values of the sub-pixel data of the left eye image data of the Nth frame that do not correspond to the occlusion region. The left eye image data may include first to third sub-pixel data, and the first to third sub-pixel data may be implemented as red data, green data, and blue data. In this case, the gain value calculating unit 210 calculates, from the left eye image data of the Nth frame, the average value RAVGL of the left eye red data, the average value GAVGL of the left eye green data, and the average value BAVGL of the left eye blue data that do not correspond to the occlusion region. (S302)

Third, the gain value calculating unit 210 receives the right eye image data of the Nth frame. The positions of the right eye image data can be expressed as coordinates. The gain value calculating unit 210 calculates the average values of the sub-pixel data of the right eye image data of the Nth frame that do not correspond to the occlusion region. The right eye image data may include first to third sub-pixel data, and the first to third sub-pixel data may be implemented as red data, green data, and blue data. In this case, the gain value calculating unit 210 calculates, from the right eye image data of the Nth frame, the average value RAVGR of the right eye red data, the average value GAVGR of the right eye green data, and the average value BAVGR of the right eye blue data that do not correspond to the occlusion region. (S303)
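Steps S302 and S303 both reduce to averaging each color channel over the non-occluded pixels, which can be sketched as below. The H x W x 3 layout and the function name are illustrative assumptions.

```python
import numpy as np

def channel_means_outside_occlusion(rgb, occluded):
    """Per-channel averages (RAVG, GAVG, BAVG in the text) over the pixels
    that are NOT in the occlusion region.

    rgb      : H x W x 3 array of red, green, blue sub-pixel data
    occluded : H x W boolean occlusion mask from the consistency check
    """
    valid = ~occluded  # keep only pixels with a reliable correspondence
    return tuple(float(rgb[..., c][valid].mean()) for c in range(3))
```

Running it once on the left-eye data and once on the right-eye data yields the six averages the following difference-value step consumes.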

Fourth, the gain value calculating unit 210 calculates a first sub-pixel data difference value, a second sub-pixel data difference value, and a third sub-pixel data difference value using the average values of the sub-pixel data of the left eye image data not corresponding to the occlusion region and the average values of the sub-pixel data of the right eye image data not corresponding to the occlusion region. Specifically, as shown in Equation 2, the gain value calculating unit 210 calculates the absolute value of the difference between the average value RAVGL of the left eye red data and the average value RAVGR of the right eye red data, and calculates the smaller of that absolute value and the maximum difference value Dmax as the red difference value Dred. The maximum difference value Dmax caps the absolute value of the difference between the average value RAVGL of the left eye red data and the average value RAVGR of the right eye red data, and may be omitted. When the maximum difference value Dmax is omitted, the gain value calculating unit 210 calculates the absolute value of the difference between the average value RAVGL of the left eye red data and the average value RAVGR of the right eye red data as the red difference value Dred.

[Equation 2]

Dred = min(|RAVGL - RAVGR|, Dmax)

As shown in Equation 3, the gain value calculating unit 210 calculates the absolute value of the difference between the average value GAVGL of the left eye green data and the average value GAVGR of the right eye green data, and calculates the smaller of that absolute value and the maximum difference value Dmax as the green difference value Dgreen. When the maximum difference value Dmax is omitted, the gain value calculating unit 210 calculates the absolute value of the difference between the average value GAVGL of the left eye green data and the average value GAVGR of the right eye green data as the green difference value Dgreen.

[Equation 3] Dgreen = min(|GAVG_L − GAVG_R|, Dmax)

The gain value calculating unit 210 calculates the absolute value of the difference between the average value BAVG_L of the left eye blue data and the average value BAVG_R of the right eye blue data as shown in Equation 4, and calculates the smaller of this absolute value and the maximum difference value Dmax as the blue difference value Dblue. When the maximum difference value Dmax is omitted, the gain value calculating unit 210 may calculate the absolute value of the difference between BAVG_L and BAVG_R directly as the blue difference value Dblue.

[Equation 4] Dblue = min(|BAVG_L − BAVG_R|, Dmax)

Fifth, the gain value calculating unit 210 calculates the gain value G using the red difference value Dred, the green difference value Dgreen, and the blue difference value Dblue. Specifically, the gain value calculating unit 210 can calculate the gain value G as shown in Equation (5).

[Equation 5] G = G_offset × max(Dred, Dgreen, Dblue) / Dmax

In Equation 5, G denotes the gain value, G_offset denotes the offset gain value, Dred denotes the red difference value, Dgreen denotes the green difference value, Dblue denotes the blue difference value, and Dmax denotes the maximum difference value. G_offset has a value between 0 and 1, and the larger G_offset is, the larger the gain value G is calculated. G_offset can be preset through a preliminary experiment. The red difference value Dred, the green difference value Dgreen, and the blue difference value Dblue are calculated to be larger as the brightness or color difference between the left eye image LI and the right eye image RI increases, so the gain value G is also calculated to be larger. (S305)
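The gain value computation of steps S304 and S305 can be sketched in Python as follows. This is a minimal illustration with hypothetical names (`gain_value`, `occlusion`); the min-with-Dmax form of Equations 2 to 4 and the normalized form of Equation 5 are reconstructions from the description above, not the literal claimed implementation.

```python
import numpy as np

def gain_value(left, right, occlusion, d_max=64.0, g_offset=0.5):
    """Gain value G from non-occluded left/right sub-pixel averages.

    left, right: (H, W, 3) RGB image data; occlusion: (H, W) boolean mask
    marking coordinates that correspond to the occlusion region.
    """
    valid = ~occlusion  # exclude the occlusion region from the averages
    diffs = []
    for c in range(3):  # red, green, blue sub-pixel data
        avg_l = left[..., c][valid].mean()   # e.g. RAVG_L for c == 0
        avg_r = right[..., c][valid].mean()  # e.g. RAVG_R for c == 0
        # Equations 2-4: difference value capped at the maximum difference value
        diffs.append(min(abs(avg_l - avg_r), d_max))
    d_red, d_green, d_blue = diffs
    # Equation 5 (reconstructed): G grows with the largest channel difference
    return g_offset * max(d_red, d_green, d_blue) / d_max
```

With identical left and right images the gain value is 0; as the brightness or color difference between the two images grows, G rises toward G_offset.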

FIG. 10 is a block diagram showing the initial disparity calculating unit in detail. FIG. 11 is a flowchart showing in detail the initial disparity calculating method of the initial disparity calculating unit. Referring to FIG. 10, the initial disparity calculating unit 220 includes an AD value calculating unit 221, a census value calculating unit 222, an initial matching value calculating unit 223, an initial matching sum value calculating unit 224, and an initial disparity output unit 225. Hereinafter, the method of calculating the initial disparity of the initial disparity calculating unit will be described in detail with reference to FIGS. 10 and 11.

Meanwhile, the initial disparity calculating unit 220 calculates the initial disparity IDIS by setting one of the left eye image data and the right eye image data as reference image data and the other as comparison image data. In the embodiment of the present invention, the left eye image data is the reference image data, and the right eye image data is the comparison image data.

First, the AD value calculating unit 221 receives the 3D image data RGB3D of the Nth frame from the host system 150. The 3D image data RGB3D of the Nth frame includes left eye image data RGBL and right eye image data RGBR. The AD value calculating unit 221 calculates the AD values by analyzing the left eye image data RGBL and the right eye image data RGBR of the Nth frame. An AD value is the absolute difference between left eye image data and right eye image data located within a first range from the left eye image data.

The AD value calculating unit 221 sets the center coordinate of the left eye image data, which is the reference image data. For example, the AD value calculating unit 221 can set the (x, y) coordinate as the center coordinate. In this case, as shown in Equation 6, the AD value calculating unit 221 calculates the absolute value of the difference between the left eye image data RGBL(x, y) at the (x, y) coordinates and the right eye image data RGBR(x−r, y) at the (x−r, y) coordinates (r is a natural number) as the AD value C_AD(x, y, r) corresponding to (x, y, r). r is a value ranging from 0 to r_max.

[Equation 6] C_AD(x, y, r) = |RGBL(x, y) − RGBR(x−r, y)|

For example, if r_max is 60, the AD value calculating unit 221 calculates the absolute differences between the left eye image data at the (x, y) coordinates and the right eye image data at the (x, y) to (x−60, y) coordinates as the AD values C_AD(x, y, 0) to C_AD(x, y, 60) corresponding to (x, y, 0) to (x, y, 60). (S401)
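Equation 6 and step S401 can be illustrated with the following sketch. Single-channel row data and the function name are illustrative assumptions; border handling (stopping when x−r leaves the image) is also an assumption, since the description does not specify it.

```python
def ad_values(left_row, right_row, x, r_max):
    """C_AD(x, y, r) = |RGBL(x, y) - RGBR(x - r, y)| for r = 0 .. r_max.

    left_row, right_row: sequences holding one image row (fixed y).
    """
    costs = []
    for r in range(r_max + 1):
        if x - r < 0:  # comparison pixel would fall outside the image
            break
        costs.append(abs(float(left_row[x]) - float(right_row[x - r])))
    return costs
```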

Second, the census value calculating unit 222 receives the 3D image data RGB3D of the Nth frame from the host system 150. The census value calculating unit 222 calculates the census values using the left eye image data and its surrounding data and the right eye image data and its surrounding data of the Nth frame.

Specifically, the census value calculating unit 222 sets the first census window CW1 with the left eye image data at the (x, y) coordinates as the center coordinate CC. In FIG. 12, the first census window CW1 is implemented with a 3×3 size. However, the present invention is not limited to this, and the first census window CW1 may be implemented with a p×q size (p and q are natural numbers). When the left eye image data at a coordinate within the first census window CW1 is greater than or equal to the left eye image data at the center coordinate CC, the census value calculating unit 222 assigns a first value to the value of that coordinate, and assigns a second value when it is smaller. The first value may be "1", and the second value may be "0". For example, as shown in FIG. 12, "1" is assigned to the value of a coordinate in the first census window CW1 whose left eye image data is equal to or larger than the left eye image data "85" at the center coordinate CC, and "0" is assigned when it is smaller.

The census value calculating unit 222 sets the second census window CW2 with the right eye image data at the (x−r, y) coordinates as the center coordinate CC. As described in step S401, r is a value ranging from 0 to r_max. In FIG. 12, the second census window CW2 is implemented with a 3×3 size, but the present invention is not limited thereto; the second census window CW2 may be implemented with a p×q size. When the right eye image data at a coordinate within the second census window CW2 is greater than or equal to the right eye image data at the center coordinate, the census value calculating unit 222 assigns the first value to the value of that coordinate, and assigns the second value when it is smaller. The first value may be "1", and the second value may be "0". For example, as shown in FIG. 12, "1" is assigned to the value of a coordinate in the second census window CW2 whose right eye image data is equal to or larger than the right eye image data "30" at the center coordinate, and "0" is assigned when it is smaller than "30".

The census value calculating unit 222 converts the census-transformed values in the first census window CW1 into a first bit string BS1, converts the census-transformed values in the second census window CW2 into a second bit string BS2, and performs an exclusive OR (XOR) operation on the two bit strings to generate a third bit string BS3. The census value calculating unit 222 calculates the census value C_cen(x, y, r) corresponding to (x, y, r) by summing the bit values of the third bit string BS3. In FIG. 12, the census value C_cen(x, y, r) corresponding to (x, y, r) is calculated as "2". In other words, the census value C_cen(x, y, r) corresponding to (x, y, r) is calculated by setting the left eye image data RGBL(x, y) at the (x, y) coordinates and the right eye image data RGBR(x−r, y) at the (x−r, y) coordinates as the center coordinates. (S402)
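The census transformation and XOR comparison of step S402 can be sketched as follows. The function names, the `half` parameter, and the nested-list image layout are illustrative assumptions; the window is 3×3 as in FIG. 12, and summing the XORed bit string is equivalent to the Hamming distance between the two bit strings.

```python
def census_value(left, right, x, y, r, half=1):
    """Census value C_cen(x, y, r): sum of the XOR (third bit string BS3)
    of the census bit strings around RGBL(x, y) and RGBR(x - r, y)."""
    def census_bits(img, cx, cy):
        center = img[cy][cx]
        bits = []
        for dy in range(-half, half + 1):
            for dx in range(-half, half + 1):
                # first value "1" if neighbour >= center, second value "0" if smaller
                bits.append(1 if img[cy + dy][cx + dx] >= center else 0)
        return bits
    bs1 = census_bits(left, x, y)        # first bit string BS1
    bs2 = census_bits(right, x - r, y)   # second bit string BS2
    # XOR the bit strings and sum the resulting bits of BS3
    return sum(b1 ^ b2 for b1, b2 in zip(bs1, bs2))
```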

Third, the initial matching value calculating unit 223 receives the gain value G from the gain value calculating unit 210, receives the AD values VAD from the AD value calculating unit 221, and receives the census values VCEN from the census value calculating unit 222. The initial matching value calculating unit 223 calculates the initial matching values IMV by applying the gain value G to the AD values VAD and the census values VCEN.

Specifically, the initial matching value calculating unit 223 calculates the initial matching value corresponding to (x, y, r) as shown in Equation 7. In Equation 7, IMV(x, y, r) denotes the initial matching value corresponding to (x, y, r), G denotes the gain value, C_cen(x, y, r) denotes the census value corresponding to (x, y, r), and C_AD(x, y, r) denotes the AD value corresponding to (x, y, r).

[Equation 7] IMV(x, y, r) = G × C_cen(x, y, r) + (1 − G) × C_AD(x, y, r)

As a result, the larger the difference in brightness or color between the left eye image and the right eye image, the larger the gain value is calculated. Also, as the gain value increases, the present invention reflects the census value more and the AD value less when calculating the initial matching value, as shown in Equation 7. Since the AD value is calculated as the absolute value of the difference between the left eye image data and the right eye image data, the greater the difference in brightness or color between the left eye image and the right eye image, the higher the probability that the AD value is miscalculated. In contrast, the census value is calculated by comparing the left eye image data and the right eye image data with their respective surrounding data in the census windows, so it is less sensitive to the brightness or color difference between the left eye image and the right eye image than the AD value. Therefore, the present invention prevents an erroneous calculation of the initial matching value caused by the brightness or color difference between the left eye image and the right eye image by reflecting the census value more and the AD value less as the gain value increases. In particular, since the disparity is calculated using the initial matching value, the disparity can also be accurately calculated by accurately calculating the initial matching value. (S403)
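Step S403 can be sketched as follows, using a reconstructed gain-weighted combination in which the gain value G shifts weight from the AD value toward the census value. The exact combination in the patent's Equation 7 image may differ; this form simply matches the stated behavior (larger G reflects the census value more, the AD value less).

```python
def initial_matching_value(g, c_cen, c_ad):
    """Reconstructed Equation 7: weight the census value by the gain G and
    the AD value by (1 - G); a large G trusts the census term more."""
    return g * c_cen + (1.0 - g) * c_ad
```

At G = 0 the cost is the raw AD value; at G = 1 it is the census value alone, the behavior the paragraph above describes for large brightness or color differences.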

Fourth, the initial matching sum value calculating unit 224 receives the initial matching values IMV from the initial matching value calculating unit 223. The initial matching sum value calculating unit 224 calculates the initial matching sum value IMAV by summing an initial matching value with the surrounding initial matching values. Specifically, the initial matching sum value calculating unit 224 sets the initial matching value corresponding to (x, y, r) as the center coordinate of a mask and calculates the initial matching sum value IMAV(x, y, r) corresponding to (x, y, r) by summing the initial matching values of the coordinates within the mask. The mask may be implemented to include i×j initial matching values (i and j are natural numbers greater than or equal to 2). (S404)
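The mask summation of step S404 can be sketched as an i×j box sum over the cost volume. The indexing scheme `imv[y][x][r]` and the choice to skip out-of-bounds coordinates at the image border are assumptions, since the description does not specify border handling.

```python
def matching_sum(imv, x, y, r, i=3, j=3):
    """IMAV(x, y, r): sum of the initial matching values inside an i x j
    mask centered at (x, y), for one disparity candidate r.

    imv: nested lists indexed as imv[y][x][r].
    """
    h, w = len(imv), len(imv[0])
    total = 0.0
    for dy in range(-(j // 2), j // 2 + 1):
        for dx in range(-(i // 2), i // 2 + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy < h and 0 <= xx < w:  # skip coordinates outside the image
                total += imv[yy][xx][r]
    return total
```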

Fifth, the initial disparity output unit 225 receives the initial matching sum values IMAV from the initial matching sum value calculating unit 224. The initial disparity output unit 225 calculates the initial disparities IDIS by analyzing the initial matching sum values IMAV. Specifically, the initial disparity output unit 225 calculates, as the initial disparity IDIS(x, y) at the (x, y) coordinates, the r of the initial matching sum value having the minimum value among the initial matching sum values IMAV(x, y, 0) to IMAV(x, y, r_max) corresponding to (x, y, 0) to (x, y, r_max). For example, if the initial matching sum value IMAV(x, y, 10) has the minimum value among the initial matching sum values IMAV(x, y, 0) to IMAV(x, y, r_max), the initial disparity output unit 225 calculates "10" as the initial disparity IDIS(x, y) at the (x, y) coordinates. (S405)
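The winner-take-all selection of step S405 can be sketched as follows (the function name is illustrative): the initial disparity at (x, y) is simply the r that minimizes the aggregated cost.

```python
def initial_disparity(imav_xy):
    """IDIS(x, y): the r whose initial matching sum value IMAV(x, y, r)
    is minimal. imav_xy: list of IMAV(x, y, r) for r = 0 .. r_max."""
    return min(range(len(imav_xy)), key=lambda r: imav_xy[r])
```

For instance, a cost list whose minimum sits at index 10 yields an initial disparity of 10, matching the worked example above.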

Meanwhile, the embodiment of the present invention has been described centering on calculating the disparities by setting the left eye image data as the reference image data and the right eye image data as the comparison image data. However, the embodiment of the present invention can also calculate the disparities by setting the right eye image data as the reference image data and the left eye image data as the comparison image data. When the left eye image data is set as the reference image data and the right eye image data as the comparison image data, the calculated disparities are left-eye disparities; when the right eye image data is set as the reference image data and the left eye image data as the comparison image data, the calculated disparities are right-eye disparities. The method of calculating the right-eye disparities is substantially the same as that described with reference to FIGS. 8 to 12.

As described above, according to the present invention, the larger the brightness or color difference between the left eye image and the right eye image is, the larger the gain value is calculated, and the disparities are calculated by reflecting the gain value. As a result, the present invention can accurately calculate disparities even when the brightness or color difference between the left eye image and the right eye image is large.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Therefore, the present invention should not be limited to the details described in the detailed description, but should be defined by the claims.

10: display panel 30: optical plate
110: gate driving circuit 120: data driving circuit
130: timing controller 140:
150: Host system 200: Disparity calculation unit
210: gain value calculation unit 220: initial disparity calculation unit
230: post-processing unit 300: stereoscopic image generating unit

Claims (11)

A first step of calculating a gain value by analyzing disparities, left eye image data, and right eye image data of an Nth frame (N is a natural number of 2 or more);
A second step of calculating disparities using the gain value calculated in the (N-1) th frame and the left eye image data and the right eye image data of the Nth frame; And
And a third step of generating multi-view image data by shifting left-eye image data or right-eye image data of the Nth frame according to the disparities.
The method according to claim 1,
In the first step,
(1-a) calculating coordinates corresponding to the occlusion region by analyzing left-eye disparities and right-eye disparities of the Nth frame; And
(1-b) calculating a gain value using left eye image data and right eye image data that do not correspond to the occlusion region among the left eye image data and the right eye image data of the Nth frame, Multi-view image generation method.
3. The method of claim 2,
The step (1-a)
calculating that the (x, y) coordinate corresponds to the occlusion region when the absolute value of the difference between the left-eye disparity at the (x, y) coordinate and the right-eye disparity at the (x−Dl(x, y), y) coordinate is larger than a threshold value, wherein Dl(x, y) denotes the left-eye disparity at the (x, y) coordinate.
3. The method of claim 2,
Wherein the step (1-b)
Calculating average values of the sub-pixel data of the left eye image data not corresponding to the occlusion region among the left eye image data of the Nth frame;
Calculating average values of sub pixel data of right eye image data not corresponding to the occlusion region among right eye image data of the Nth frame; And
And calculating the gain value using the average values of the sub-pixel data of the left eye image data and the average values of the sub-pixel data of the right eye image data.
5. The method of claim 4,
Wherein the step of calculating the gain value using the average values of the sub-pixel data of the left eye image data and the average values of the sub-pixel data of the right eye image data comprises:
wherein, when the absolute value of the difference between the average value of the first sub-pixel data of the left eye image data and the average value of the first sub-pixel data of the right eye image data is Dred, the absolute value of the difference between the average value of the second sub-pixel data of the left eye image data and the average value of the second sub-pixel data of the right eye image data is Dgreen, the absolute value of the difference between the average value of the third sub-pixel data of the left eye image data and the average value of the third sub-pixel data of the right eye image data is Dblue, the maximum value among Dred, Dgreen, and Dblue is Dmax, the gain value is G, and the offset gain value is G_offset,
The gain value may be,
Figure pat00008
And generating the multi-view image.
The method according to claim 1,
The second step comprises:
(2-a) calculating AD values by analyzing left eye image data and right eye image data of the Nth frame;
(2-b) calculating census values using left eye image data, neighboring data, right eye image data, and surrounding data of the Nth frame;
(2-c) calculating the initial matching value by applying the gain value to the AD values and the census values; And
(2-d) calculating an initial matching sum value by setting the initial matching value as a center coordinate and summing it with the initial matching values located within the second range from the center coordinate, and calculating the disparities by analyzing the initial matching sum values.
The method according to claim 6,
Wherein the step (2-a)
calculating the absolute value of the difference between the left eye image data at the (x, y) coordinates and the right eye image data at the (x−r, y) coordinates as the AD value corresponding to (x, y, r).
The method according to claim 6,
The step (2-b)
setting the left eye image data at the (x, y) coordinates as a center coordinate, and performing a census transformation that assigns a first value to the value of a coordinate within the first census window when the left eye image data at that coordinate is greater than or equal to the left eye image data at the center coordinate, and assigns a second value when it is smaller;
setting the right eye image data at the (x−r, y) coordinates as a center coordinate, and performing a census transformation that assigns the first value to the value of a coordinate within the second census window when the right eye image data at that coordinate is greater than or equal to the right eye image data at the center coordinate, and assigns the second value when it is smaller;
converting the census-transformed values in the first census window into a first bit string, converting the census-transformed values in the second census window into a second bit string, and generating a third bit string by performing an exclusive OR operation on the first and second bit strings; and
And calculating a census value corresponding to (x, y, r) by summing bit values of the third bit string.
The method according to claim 6,
The step (2-c)
wherein, when C(x, y, r) denotes the initial matching value corresponding to (x, y, r), G denotes the gain value, C_cen(x, y, r) denotes the census value corresponding to (x, y, r), and C_AD(x, y, r) denotes the AD value corresponding to (x, y, r),
the initial matching value corresponding to (x, y, r) is
C(x, y, r) = G × C_cen(x, y, r) + (1 − G) × C_AD(x, y, r)
And generating the multi-view image.
The method according to claim 6,
The step (2-d)
calculating an initial matching sum value corresponding to (x, y, r) by setting the initial matching value corresponding to (x, y, r) as a center coordinate and summing the initial matching values of the respective coordinates within a mask; and
calculating, as the initial disparity at the (x, y) coordinates, the r of the initial matching sum value having the minimum value among the initial matching sum values corresponding to (x, y, 0) to (x, y, r_max).
A display panel including data lines and gate lines;
A disparity calculating unit for calculating disparities from 3D image data including left eye image data and right eye image data, and a multi-view image generating unit for shifting the left eye image data or the right eye image data according to the disparities, An image processor including a view image generator;
A data driving circuit for converting the multi-view image data into a data voltage and supplying the data voltage to the data lines; And
And a gate driving circuit for sequentially supplying gate pulses to the gate lines,
The disparity calculating unit may calculate,
A gain value calculation unit for calculating a gain value by analyzing disparities, left eye image data, and right eye image data of N (N is a natural number of 2 or more) frames; And
And a disparity calculating unit for calculating disparities using the gain value calculated in the (N-1) -th frame and the left-eye image data and the right-eye image data in the N-th frame.
KR1020130074633A 2013-06-27 2013-06-27 Multiview image generation method and stereoscopic image display device KR102045563B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130074633A KR102045563B1 (en) 2013-06-27 2013-06-27 Multiview image generation method and stereoscopic image display device


Publications (2)

Publication Number Publication Date
KR20150001421A true KR20150001421A (en) 2015-01-06
KR102045563B1 KR102045563B1 (en) 2019-12-02

Family

ID=52475183

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130074633A KR102045563B1 (en) 2013-06-27 2013-06-27 Multiview image generation method and stereoscopic image display device

Country Status (1)

Country Link
KR (1) KR102045563B1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130013071A (en) * 2011-07-27 2013-02-06 엘지디스플레이 주식회사 Streoscopic image display device and method for driving thereof
KR20130027932A (en) * 2011-09-08 2013-03-18 엘지디스플레이 주식회사 Stereoscopic image display device and driving method thereof


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Paper: *

Also Published As

Publication number Publication date
KR102045563B1 (en) 2019-12-02

Similar Documents

Publication Publication Date Title
US8743111B2 (en) Stereoscopic image display and method for driving the same
TWI510054B (en) Stereoscopic image display device and method for driving the same
KR102197382B1 (en) Bendable stereoscopic 3d display device
EP3038360A1 (en) Autostereoscopic 3d display device
US9995942B2 (en) Autostereoscopic 3D display device
KR101992163B1 (en) Stereoscopic image display device and method for driving the same
KR101963385B1 (en) Disparity calculation method and stereoscopic image display device
KR102126532B1 (en) Method of multi-view image formation and stereoscopic image display device using the same
KR101929042B1 (en) Disparity calculation unit and stereoscopic image display device including the same and disparity calculation method
KR101990334B1 (en) Stereoscopic image display device and method for driving the same
KR102334031B1 (en) Autostereoscopic 3d display device and driving method thereof
KR20140092055A (en) Stereoscopic image display device and driving method thereof
KR101798236B1 (en) Stereoscopic image display and method of adjusting brightness thereof
KR102022527B1 (en) Stereoscopic image display device and disparity calculation method thereof
KR102045563B1 (en) Multiview image generation method and stereoscopic image display device
KR101953315B1 (en) Disparity calculation method and stereoscopic image display device
KR20160024283A (en) Lenticular lens type stereoscopic 3d display device
KR101983369B1 (en) Multiview image generation method and stereoscopic image display device using the same
KR101957975B1 (en) Disparity calculation method and stereoscopic image display device using the same
KR101863140B1 (en) Display Apparatus For Displaying Three Dimensional Picture And Driving Method For The Same
KR101843198B1 (en) Method of multi-view image formation and stereoscopic image display device using the same
KR101961943B1 (en) 3d image data formation method and stereoscopic image display device using the same
KR102126530B1 (en) 3d conversion method and stereoscopic image display device using the same
KR101996657B1 (en) Global depth map generation method and stereoscopic image display device using the same
KR101980352B1 (en) Stereoscopic image display device and method for driving the same

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant