WO2014017201A1 - Image processing device, image processing method, and image display device - Google Patents

Image processing device, image processing method, and image display device

Info

Publication number
WO2014017201A1
WO2014017201A1 (PCT/JP2013/066163)
Authority
WO
WIPO (PCT)
Prior art keywords
parallax
occlusion
unit
image
occlusion signal
Prior art date
Application number
PCT/JP2013/066163
Other languages
French (fr)
Japanese (ja)
Inventor
Toru Nishi
Oliver Erdler
Yalcin Incesu
Original Assignee
Sony Corporation
Priority date
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2014017201A1 publication Critical patent/WO2014017201A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation

Definitions

  • The technology disclosed in this specification relates to an image processing apparatus, an image processing method, and an image display apparatus that process three-dimensional images, and in particular to an image processing apparatus, an image processing method, and an image display apparatus that correct parallax information in an occlusion area between a left image and a right image.
  • the parallax between the left image and the right image corresponds to a coordinate value in a three-dimensional space.
  • Disparity information is required, for example, during frame-rate (double-speed) conversion and when generating interpolated viewpoint images for naked-eye (autostereoscopic) 3D multi-view display.
  • Between images with different viewpoints, such as the left and right images, occlusion regions arise: areas that are visible in one image but hidden in the other.
  • Occlusion areas lie near boundaries of the disparity information where depth changes discontinuously, such as the boundary between an object and the background.
  • Disparity estimates near such boundaries are important because they form the contour of the object in the interpolated image.
  • Disparity estimation errors near these boundaries cause foreground pixels to stick to the background or, conversely, background pixels to stick to the foreground; the object contour becomes disturbed and false contours appear in the background near the object outline, producing the halo phenomenon.
  • For example, an image processing apparatus has been proposed in which an image reference region consisting of an occlusion region and its surroundings is divided into clusters: for a cluster of interest containing pixels with a detected correspondence, the average disparity within that cluster is used as the disparity of its occlusion pixels, and for a cluster of interest with no detected correspondence, the average disparity of the cluster with the closest feature amount within the image reference region is used, thereby bringing the disparity boundary closer to the actual object boundary (see, for example, Patent Document 1).
  • However, since the disparity generated in an occlusion area cannot, in principle, be determined correctly, it tends to become non-uniform and noise-like in shape.
  • Another proposed image processing apparatus sets, within the disparity map between the left and right images, a pair of search windows facing each other across the parallax contour (the boundary of the disparity information), determines, for example from the left image within the pair of windows, which window contains the occlusion area, and corrects the parallax of the occlusion area in that window based on the parallax of the other window (see, for example, Patent Document 2).
  • An object of the technology disclosed in this specification is to provide an excellent image processing device, image processing method, and image display device that can suitably correct parallax information in an occlusion region between a left image and a right image.
  • A further object of the technology disclosed in this specification is to provide an excellent image processing apparatus, image processing method, and image display apparatus that accurately estimate parallax information in the occlusion region between the left image and the right image and reduce the halo phenomenon that occurs in interpolated images generated based on that parallax information.
  • In the image processing apparatus according to claim 1, the first and second occlusion signal processing units are configured to spatially stabilize the occlusion signals.
  • In the image processing apparatus according to claim 1, the first and second occlusion signal processing units are configured to apply horizontal and vertical majority-vote filtering to scattered occlusion signals and then perform spread processing.
  • In the image processing apparatus according to claim 1, the first and second occlusion signal processing units are configured to remove the indefinite state in which both the left and right occlusion signals indicate occlusion.
  • The first mixing unit of the image processing device replaces the left reference initial parallax with the right reference projected parallax in portions where occlusion occurs in the left image, according to the occlusion signal processed by the first occlusion signal processing unit, and the second mixing unit replaces the right reference initial parallax with the left reference projected parallax in portions where occlusion occurs in the right image, according to the occlusion signal processed by the second occlusion signal processing unit.
  • The first parallax cleaning unit of the image processing device compares the initial occlusion signal with the occlusion signal processed by the first occlusion signal processing unit, detects portions whose value has changed as isolated parallax noise in the left reference parallax, and fills them with non-isolated parallax values; the second parallax cleaning unit compares the initial occlusion signal with the occlusion signal processed by the second occlusion signal processing unit, detects portions whose value has changed as isolated parallax noise in the right reference parallax, and fills them with non-isolated parallax values.
  • The first parallax cleaning unit of the image processing device replaces foreground parallax values with background parallax values in the left reference parallax, based on the occlusion signal processed by the first occlusion signal processing unit, and the second parallax cleaning unit replaces foreground parallax values with background parallax values in the right reference parallax, based on the occlusion signal processed by the second occlusion signal processing unit.
  • The technique according to claim 9 of the present application is an image display apparatus comprising: an initial parallax generation unit that generates left-reference and right-reference initial parallaxes from the left image and the right image, respectively; a parallax correction unit that corrects the parallax values of occlusion areas included in the initial parallaxes; an interpolated image generation unit that generates, based on the corrected parallaxes, interpolated images that interpolate the original images; an image integration unit that integrates the original images and the interpolated images; and a display unit that displays the original images or the images integrated by the image integration unit, wherein the parallax correction unit includes a first projection unit that projects the left reference initial parallax onto the right reference initial parallax to obtain a left projected parallax and a left reference occlusion signal, a second projection unit that projects the right reference initial parallax onto the left reference initial parallax to obtain a right projected parallax and a right reference occlusion signal, a first occlusion signal processing unit that processes the left reference occlusion signal, a second occlusion signal processing unit that processes the right reference occlusion signal, a first mixing unit that mixes the left reference parallax into the right reference projected parallax, a second mixing unit that mixes the right reference parallax into the left reference projected parallax, a first parallax cleaning unit that removes noise from the left reference parallax after processing by the first mixing unit, and a second parallax cleaning unit that removes noise from the right reference parallax after processing by the second mixing unit.
  • By applying the disclosed technique to, for example, high-frame-rate or naked-eye three-dimensional image display, the parallax boundary of the occlusion area is brought suitably close to the object boundary, the halo phenomenon occurring in the parallax near the object contour is suppressed, and isolated noise portions within the parallax can be suitably removed.
  • FIG. 1 is a diagram schematically illustrating a functional configuration of an image display apparatus 100 to which the technology disclosed in this specification can be applied.
  • FIG. 2 is a block diagram showing a functional configuration for performing processing for increasing the frame rate or increasing the number of viewpoints in the video signal processing unit 120.
  • FIG. 3 is a diagram illustrating an internal configuration of the interpolation frame generation unit 202.
  • FIG. 4 is a diagram illustrating an internal configuration example of the parallax correction unit 304.
  • FIG. 5 is a diagram showing an example of a method for performing projection between initial parallaxes of left and right images.
  • FIG. 6 is a diagram showing an example of stabilizing the left and right occlusion signals Occ_L and Occ_R.
  • FIG. 7 is a diagram illustrating a state in which an indefinite state portion of the occlusion signal is removed.
  • FIG. 8 is a diagram illustrating an example of the left reference initial parallax OD_L and the left reference parallax D_L after being mixed with the right reference projection parallax PD_R by the mixing unit 405.
  • FIG. 9A is a diagram illustrating a place where the left reference initial parallax OD_L is replaced with the right reference projection parallax PD_R in the mixed left reference parallax D_L.
  • FIG. 9B is a diagram illustrating a state in which a portion in which the initial occlusion signal Occ_L and the processed occlusion signal Occ_L ′′ are compared and changed is detected.
  • FIG. 10 is a diagram exemplifying a result of removing isolated noise from the left reference parallax D_L after being mixed by the mixing unit 405.
  • FIG. 11 is a diagram illustrating a state in which the foreground parallax value on the occlusion side is replaced with the background parallax value.
  • FIG. 12 is a diagram schematically illustrating how the left reference initial parallax OD_L and the right reference initial parallax OD_R are obtained from the left image and the right image.
  • FIG. 1 schematically shows a functional configuration of an image display apparatus 100 to which the technology disclosed in this specification can be applied.
  • The illustrated image display device 100 receives a three-dimensional video signal composed of, for example, a left image and a right image, performs high-frame-rate or multi-viewpoint processing as necessary, multiplexes the images of the multiple viewpoints (for example, spatially), and displays them on the screen. The viewer can then observe the image three-dimensionally either by wearing glasses for viewing 3D video or with the naked eye.
  • the image display device 100 includes a video display unit 110, a video signal processing unit 120, and a timing control unit 140.
  • When the video signal processing unit 120 receives a video signal transmitted from an external device, it executes various signal processing so that the signal is suitable for display on the video display unit 110, and outputs the result.
  • the “external device” serving as the transmission source of the video signal mentioned here may include a digital broadcast receiver and a content playback device such as a Blu-ray disc player.
  • Within the video signal processing unit 120, image quality correction such as sharpness enhancement and contrast improvement is performed.
  • the video signal processing unit 120 generates an interpolation frame for interpolating viewpoint images for multi-viewpoints. Processing related to the generation of the interpolation frame will be described later.
  • the timing control unit 140 receives the video signal processed by the video signal processing unit 120.
  • The timing control unit 140 converts the input left and right image signals into signals to be supplied to the video display unit 110, and generates the pulse signals used to operate the panel drive circuit comprising the gate driver 113 and the data driver 114.
  • the video display unit 110 displays a video corresponding to a signal applied from the outside.
  • the video display unit 110 includes a display panel 112, a gate driver 113, a data driver 114, and a light source 115.
  • The gate driver 113 is a drive circuit that generates signals for sequential driving; it outputs a drive voltage to the gate bus lines connected to each pixel in the display panel 112 in accordance with the signal transmitted from the timing control unit 140.
  • The data driver 114 is a drive circuit that outputs drive voltages based on the video signal; it generates and outputs the signals applied to the data lines, based on the signal transmitted from the timing control unit 140 and the video signal output from the video signal processing unit 120.
  • the display panel 112 has a plurality of pixels arranged in a grid, for example.
  • liquid crystal molecules having a predetermined alignment state are sealed between transparent plates such as glass, and an image is displayed according to the application of a signal from the outside.
  • the application of signals to the display panel 112 is executed by the gate driver 113 and the data driver 114.
  • the light source 115 is a backlight provided at the back of the image display unit 110 when viewed from the viewer side.
  • unpolarized white light is emitted from the light source 115 to the display panel 112 located on the viewer side.
  • one pixel is formed by cells of a plurality of color components, such as OLED (Organic Light Emitting Diode) and LED (Light Emitting Diode), and a plurality of pixels are sequentially arranged in the horizontal and vertical directions.
  • FIG. 2 shows a functional block diagram for performing processing for increasing the frame rate or increasing the number of viewpoints in the video signal processing unit 120.
  • the image input unit 201 inputs a video signal composed of a time series of image frames.
  • a three-dimensional image signal including a left image and a right image is input.
  • the interpolation frame generation unit 202 generates an interpolation frame for multi-viewpoint from the input image frame.
  • the image integration unit 203 generates the multi-viewpoint image signal by inserting the interpolation frame generated by the interpolation frame generation unit 202 into the original image frame.
  • FIG. 3 shows an internal configuration of the interpolation frame generation unit 202.
  • the left image frame memory 301 and the right image frame memory 311 store the left image and the right image of the input three-dimensional image signal, respectively.
  • When the initial parallax calculation unit 302 reads the temporally corresponding left and right images from the left image frame memory 301 and the right image frame memory 311, it calculates an initial parallax referenced to the left image (Original Disparity L: OD_L) by the block matching method and writes it to the parallax memory 303. Similarly, the initial parallax calculation unit 312 calculates an initial parallax referenced to the right image (Original Disparity R: OD_R) by the block matching method and writes it to the parallax memory 313.
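The patent names only the block matching method and does not give an implementation. As a rough, non-authoritative illustration, the left reference initial disparity could be computed along the following lines; the function name, block size, search range, SAD cost, and sign convention are all assumptions introduced here, not part of the disclosure:

```python
import numpy as np

def block_matching_disparity(ref, other, block=8, max_disp=64):
    """Estimate an integer disparity map referenced to `ref` by exhaustive
    block matching against `other` (both HxW float grayscale arrays).
    The search direction and sign convention are assumptions."""
    h, w = ref.shape
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = ref[y:y + block, x:x + block]
            best_cost, best_d = np.inf, 0
            for d in range(max_disp + 1):
                xs = x - d                         # candidate position in the other view
                if xs < 0:
                    break
                cand = other[y:y + block, xs:xs + block]
                cost = np.abs(patch - cand).sum()  # SAD matching cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y:y + block, x:x + block] = best_d
    return disp
```

The right reference initial parallax OD_R would be obtained in the same way with the roles of the two images exchanged.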
  • FIG. 12 schematically shows how the left reference initial parallax OD_L and the right reference initial parallax OD_R are obtained from the left image and the right image.
  • occlusion occurs along the left contour of the object in the left reference initial parallax OD_L
  • occlusion occurs along the right contour of the object in the right reference initial parallax OD_R.
  • the parallax correction unit 304 performs a correction process for the initial parallax calculated based on the left reference and a correction process for the initial parallax calculated based on the right reference.
  • In the occlusion area, parallax estimation errors occur: pixels in the foreground area stick to the background, pixels in the background area stick to the foreground, the outline of the object is disturbed, and false contours appear in the background area near the object outline, causing the halo phenomenon.
  • The parallax correction unit 304 accurately estimates the parallax information in the occlusion areas included in the left and right initial parallaxes, and reduces the halo phenomenon that occurs in interpolated images generated based on that parallax information.
  • details of the parallax correction processing will be described later.
  • The frame generation unit 305 shifts each pixel of the left image frame read from the left image frame memory 301 horizontally, based on the corrected left reference parallax, to generate an interpolation frame for the left image. Likewise, the frame generation unit 315 shifts each pixel of the right image frame read from the right image frame memory 311 horizontally, based on the corrected right reference parallax, to generate an interpolation frame.
  • the interpolation frame of the left image and the right image is sequentially output from the image output unit 306 to the image integration unit 203.
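The patent describes interpolation frame generation only as a per-pixel horizontal shift by the corrected parallax. A minimal sketch, assuming a forward warp to an intermediate viewpoint controlled by a blend factor alpha and a very simple left-to-right hole-filling rule (both assumptions not stated in the disclosure), might look like this:

```python
import numpy as np

def generate_interpolated_frame(image, disparity, alpha=0.5):
    """Forward-warp `image` (HxWxC) by `alpha` times its per-pixel disparity
    (HxW) to synthesize an intermediate viewpoint. The sign convention matches
    the block-matching sketch above (matching pixel at x - d); holes left by
    the warp are filled by propagating the nearest value from the left."""
    h, w = disparity.shape
    out = np.zeros_like(image)
    filled = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            xt = int(round(x - alpha * disparity[y, x]))  # shifted horizontal position
            if 0 <= xt < w:
                out[y, xt] = image[y, x]
                filled[y, xt] = True
        last = image[y, 0]
        for x in range(w):
            if filled[y, x]:
                last = out[y, x]
            else:
                out[y, x] = last                          # simple hole filling
    return out
```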
  • FIG. 4 shows an internal configuration example of the parallax correction unit 304.
  • When the projection unit 401 reads the left reference initial parallax OD_L from the parallax memory 303 and the right reference initial parallax OD_R from the parallax memory 313, it projects the left reference initial parallax OD_L onto the right reference initial parallax OD_R and writes the resulting left reference projected parallax (Projected Disparity L: PD_L) to the projected parallax memory 402.
  • projection is possible if the difference between the initial parallaxes OD_L and OD_R based on the left and right images is equal to or less than a predetermined threshold, but no projection is performed if the threshold is exceeded.
  • the place not projected is an occlusion area, and the projection unit 401 outputs a left reference occlusion signal (Occlusion L: Occ_L).
  • the occlusion signal is a binary signal indicating 0 in the occlusion area and 1 in other areas.
  • Similarly, when the projection unit 411 reads the left reference initial parallax from the parallax memory 303 and the right reference initial parallax from the parallax memory 313, it projects the right reference initial parallax onto the left reference initial parallax and writes the resulting right reference projected parallax (Projected Disparity R: PD_R) to the projected parallax memory 412.
  • the projection unit 411 outputs a right reference occlusion signal (Occlusion R: Occ_R) indicating a place where the projection is not performed.
  • FIG. 5 illustrates an example of a method of performing projection between initial parallaxes with reference to the left and right images. As shown in the drawing, one initial parallax pixel position is projected onto the other initial parallax pixel position in accordance with the parallax value (Disparity Value: DV).
  • Projection is performed only where the forward and backward estimates agree, that is, where the sum of the parallax values for projection in the forward and reverse directions is within a threshold TH (|DV_R→L′ + DV_L→R| ≤ TH), which amounts to a forward/backward (FWD/BWD) consistency check. If this condition is satisfied, the parallax is projected; if it is not satisfied, no projection is performed and the location is determined to be occlusion.
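A rough, non-authoritative sketch of this projection with the forward/backward consistency check follows; the pull-style traversal, the threshold value, and the sign conventions are assumptions, and the push-style projection suggested by the patent's figures may differ in detail:

```python
import numpy as np

def project_disparity(od_src, od_dst, th=2.0):
    """Project the initial disparity `od_src` (referenced to one view) onto the
    pixel grid of the other view, whose own initial disparity is `od_dst`, and
    derive a binary occlusion signal for that grid (0 = occlusion, 1 = valid).

    A pixel passes the forward/backward consistency test when the disparities
    of the two views roughly cancel: |od_dst + od_src(projected position)| <= th."""
    h, w = od_dst.shape
    projected = np.zeros_like(od_src)
    occ = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            xs = int(round(x + od_dst[y, x]))     # follow this view's disparity to the source
            if 0 <= xs < w and abs(od_dst[y, x] + od_src[y, xs]) <= th:
                projected[y, x] = od_src[y, xs]   # carry the source disparity value over
                occ[y, x] = 1                     # consistent, so not occlusion
    return projected, occ

# Per the patent text, projecting OD_L onto OD_R yields PD_L and the left reference
# occlusion signal Occ_L (and symmetrically PD_R / Occ_R); how that pairing maps onto
# this simplified pull-style sketch is an assumption.
```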
  • The occlusion signal stabilization unit 403 spatially stabilizes the left reference occlusion signal calculated by the projection unit 401, which is not yet spatially stable, on the premise that larger occlusion regions are more reliable than fine, scattered ones. Similarly, the occlusion signal stabilization unit 413 processes and spatially stabilizes the right reference occlusion signal from the projection unit 411.
  • Specifically, each occlusion signal Occ_L, Occ_R is passed through a horizontal median filter (MED_H) and a vertical median filter (MED_V); the occlusion value of the pixel of interest is decided by majority vote in the horizontal and vertical directions, so that scattered occlusion values are replaced by the median of the surrounding pixels.
  • The portions trimmed away by the median filtering are then widened again by a spread step (Spread) to obtain the stabilized left reference occlusion signal Occ_L′ and the stabilized right reference occlusion signal Occ_R′.
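A minimal sketch of such a stabilization step, assuming SciPy median filters for the horizontal and vertical majority votes and a binary dilation for the spread (the window sizes and the dilation-based spread are assumptions), could be:

```python
import numpy as np
from scipy.ndimage import median_filter, binary_dilation

def stabilize_occlusion(occ, h_size=5, v_size=5, spread=2):
    """Spatially stabilize a binary occlusion signal (0 = occlusion, 1 = valid).
    For a binary signal, a median filter is exactly a majority vote, so the two
    passes remove scattered occlusion pixels; the surviving occlusion regions
    are then widened again by dilating the occlusion mask."""
    occ = occ.astype(np.uint8)
    occ = median_filter(occ, size=(1, h_size))               # horizontal majority vote (MED_H)
    occ = median_filter(occ, size=(v_size, 1))                # vertical majority vote (MED_V)
    occluded = binary_dilation(occ == 0, iterations=spread)   # spread the kept occlusion areas
    return np.where(occluded, 0, 1).astype(np.uint8)
```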
  • FIG. 6 shows an example in which the left and right occlusion signals Occ_L and Occ_R are stabilized by the occlusion signal stabilization units 403 and 413. In the figure, the occlusion area is shown in black.
  • (0, 1) corresponds to the occlusion area of the left image
  • (1, 0) corresponds to the occlusion area of the right image
  • (1, 1) corresponds to the non-occlusion area of the left and right images.
  • (0, 0) is an indefinite state (Unknown Status) that is an occlusion area in both the left and right images, which is impossible in reality.
  • the occlusion signal cleaning unit 404 performs processing for removing an indefinite state that is occluded in the left and right images from the left reference occlusion signal Occ_L ′. Further, the occlusion signal cleaning unit 414 performs a process for removing an indefinite state from the right reference occlusion signal Occ_R ′.
  • FIG. 7 shows a state where an indefinite state portion of the occlusion signal is removed.
  • the part where the left and right occlusion signals are 0, that is, the indefinite state is shown in black.
  • The occlusion signal cleaning units 404 and 414 scan the image from left to right, and likewise from right to left, holding the most recent value of the occlusion signal Occ_L′ or Occ_R′ that is not indefinite while comparing it with the value of the pixel directly above. When a pixel in the indefinite state is reached, the held value is filled into that pixel if it matches the value of the pixel above; if the values differ, the pixel keeps its original value.
  • In addition, the occlusion signal Occ_L′ or Occ_R′ is examined in the vertical direction, and an isolated indefinite-state value that spans only a single horizontal line is removed by filling it with a neighboring value.
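As a hedged illustration of the indefinite-state removal, the following sketch shows only the left-to-right pass described above (the right-to-left pass and the removal of isolated single-line segments would be analogous); the function name and data layout are assumptions:

```python
import numpy as np

def remove_unknown_states(occ_l, occ_r):
    """Remove the impossible (0, 0) 'unknown' state, where both views claim
    occlusion, from the first of the two stabilized occlusion signals. Each row
    is scanned from left to right, holding the last non-unknown value and
    comparing it with the pixel directly above; the held value fills an unknown
    pixel only when it agrees with the value above."""
    unknown = (occ_l == 0) & (occ_r == 0)
    out = occ_l.copy()
    h, w = out.shape
    for y in range(1, h):
        held = out[y, 0]
        for x in range(w):
            if unknown[y, x]:
                if held == out[y - 1, x]:   # held value agrees with the pixel above
                    out[y, x] = held
            else:
                held = out[y, x]
    return out
```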
  • the occlusion signals cleaned by the occlusion signal cleaning unit 404 and the occlusion signal cleaning unit 414 are Occ_L ′′ and Occ_R ′′, respectively.
  • The mixing unit 405 mixes the right reference projected parallax into the left reference parallax to obtain a left reference parallax (Disparity L: D_L) with a clean outline. Specifically, the mixing unit 405 receives the processed left reference occlusion signal Occ_L″ from the occlusion signal cleaning unit 404, the left reference initial parallax OD_L from the parallax memory 303, and the right reference projected parallax PD_R from the projected parallax memory 412; in the portions where occlusion occurs in the left image according to the occlusion signal Occ_L″, it replaces the left reference initial parallax OD_L with the right reference projected parallax PD_R to obtain the left reference parallax D_L.
  • FIG. 8 shows an example of the left reference initial parallax OD_L and the left reference parallax D_L after being mixed with the right reference projection parallax PD_R by the mixing unit 405.
  • Similarly, the mixing unit 415 mixes the left reference projected parallax into the right reference parallax to obtain a right reference parallax (Disparity R: D_R) with a clean outline. Specifically, the mixing unit 415 receives the processed right reference occlusion signal Occ_R″ from the occlusion signal cleaning unit 414, the right reference initial parallax OD_R from the parallax memory 313, and the left reference projected parallax PD_L from the projected parallax memory 402; in the portions where occlusion occurs in the right image according to the occlusion signal Occ_R″, it replaces the right reference initial parallax OD_R with the left reference projected parallax PD_L to obtain the right reference parallax D_R.
  • Replacing the left reference initial parallax OD_L with the right reference projected parallax PD_R where occlusion occurs in the left image, and the right reference initial parallax OD_R with the left reference projected parallax PD_L where occlusion occurs in the right image, exploits the characteristic that the parallax contour of the image on the side where no occlusion occurs is easier to obtain accurately than the parallax contour of the image on the side where occlusion occurs.
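The mixing step itself reduces to a conditional replacement; a minimal sketch (names are assumptions) is:

```python
import numpy as np

def mix_disparity(od_ref, pd_other, occ_ref):
    """Where the cleaned occlusion signal marks occlusion (0), replace the
    initial disparity of this view with the disparity projected from the other
    view; elsewhere keep the initial disparity."""
    return np.where(occ_ref == 0, pd_other, od_ref)

# e.g. D_L = mix_disparity(OD_L, PD_R, Occ_L_cleaned)
#      D_R = mix_disparity(OD_R, PD_L, Occ_R_cleaned)
```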
  • the parallax cleaning units 407 and 417 at the final stage perform processing for reducing halos on the left and right parallax images D_L and D_R written in the parallax memories 406 and 416.
  • the parallax cleaning unit 407 compares the initial occlusion signal Occ_L with the processed occlusion signal Occ_L ′′, and detects a portion where the value has changed as isolated parallax noise.
  • FIG. 9A illustrates, in the mixed left reference parallax D_L, the locations where the left reference initial parallax OD_L has been replaced with the right reference projected parallax PD_R, and FIG. 9B shows how the changed portions are detected by comparing the initial occlusion signal Occ_L with the processed occlusion signal Occ_L″.
  • When the parallax cleaning unit 407 detects isolated parallax noise, it fills it with a non-isolated parallax value. For example, the left reference parallax D_L is scanned from left to right while the most recent non-isolated parallax value is held; when an isolated parallax is reached, its value is replaced with the held non-isolated value.
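A sketch of this isolated-noise fill for the left reference parallax, under the assumption that "isolated" pixels are exactly those whose occlusion flag changed between the initial and the cleaned occlusion signal, might be:

```python
import numpy as np

def fill_isolated_disparity(d, occ_initial, occ_cleaned):
    """Treat pixels whose occlusion flag changed between the initial and the
    cleaned occlusion signal as isolated disparity noise, and overwrite them
    with the most recent non-isolated disparity seen while scanning each row
    from left to right (a right-to-left scan would be used for the right view)."""
    isolated = occ_initial != occ_cleaned
    out = d.copy()
    h, w = out.shape
    for y in range(h):
        held = out[y, 0]
        for x in range(w):
            if isolated[y, x]:
                out[y, x] = held       # replace with the held non-isolated value
            else:
                held = out[y, x]
    return out
```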
  • FIG. 10 illustrates a result of removing isolated noise from the left reference parallax D_L after the mixing process by the mixing unit 405.
  • At the same time as removing the isolated parallax noise, the parallax cleaning unit 407 processes the mixed parallax image D_L written in the parallax memory 406, based on the processed occlusion signal Occ_L″, so that the parallax boundary on the occlusion side, which has been pushed away from the actual object boundary by the mixing process, is brought back closer to the object boundary.
  • Specifically, when the cleaning unit 407 detects a region where occlusion occurs in the left reference parallax D_L based on the processed occlusion signal Occ_L″, it scans the left reference parallax D_L from left to right and replaces the right-side (foreground) parallax values with the left-side (background) parallax values (copying from left to right), and finally outputs the cleaned left reference parallax D_L′.
  • FIG. 11 shows a state in which the foreground parallax value on the occlusion side is replaced with the background parallax value.
  • How far the parallax boundary is pulled toward the object boundary is controlled by the difference (B - A) between the right-side (foreground) parallax value B before the substitution and the left-side (background) parallax value A that replaces it. In this way, the parallax boundary that was spread toward the occlusion side by the mixing unit 405 is pulled back, so that the halo range shrinks cleanly without overshooting.
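A simplified sketch of this boundary pull-back for the left reference parallax follows; the pull-back distance controlled by the (B - A) difference is not modeled here, and simply copying the background value across the occluded run is an assumption:

```python
import numpy as np

def pull_back_occlusion_boundary(d, occ_cleaned):
    """Within regions marked as occlusion in the cleaned signal, copy the
    left-side (background) disparity over the right-side (foreground) values
    while scanning each row from left to right, so the disparity boundary that
    was spread toward the occlusion side moves back toward the object boundary.
    (For the right-reference disparity the scan runs right to left instead.)"""
    out = d.copy()
    h, w = out.shape
    for y in range(h):
        for x in range(1, w):
            if occ_cleaned[y, x] == 0:        # occluded in this view
                out[y, x] = out[y, x - 1]     # copy from the left (background side)
    return out
```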
  • Similarly, the parallax cleaning unit 417 compares the initial occlusion signal Occ_R with the processed occlusion signal Occ_R″, detects the portions whose value has changed as isolated parallax noise, and fills them with non-isolated parallax values; for example, the right reference parallax D_R is scanned from right to left while the most recent non-isolated parallax value is held, and when an isolated parallax is reached it is replaced with the held value.
  • At the same time as removing the isolated parallax noise, the parallax cleaning unit 417 processes the mixed parallax image D_R written in the parallax memory 416, based on the processed occlusion signal Occ_R″, so that the parallax boundary on the occlusion side, which has been pushed away from the actual object boundary by the mixing process, is brought back closer to the object boundary.
  • Specifically, when the cleaning unit 417 detects a region where occlusion occurs in the right reference parallax D_R based on the processed occlusion signal Occ_R″, it scans the right reference parallax D_R from right to left and replaces the left-side (foreground) parallax values with the right-side (background) parallax values, and finally outputs the cleaned right reference parallax D_R′.
  • In this way, the boundary of the parallax image in the occlusion area is brought close to the object boundary, the halo phenomenon generated near the object contour is suppressed, and isolated noise portions in the parallax image can be suitably removed.
  • (1) An image processing apparatus comprising: a first projection unit that projects a left reference initial parallax onto a right reference initial parallax to obtain a left projected parallax and a left reference occlusion signal; a second projection unit that projects a right reference initial parallax onto a left reference initial parallax to obtain a right projected parallax and a right reference occlusion signal; a first occlusion signal processing unit that processes the left reference occlusion signal; a second occlusion signal processing unit that processes the right reference occlusion signal; a first mixing unit that mixes the left reference parallax into the right reference projected parallax; a second mixing unit that mixes the right reference parallax into the left reference projected parallax; a first parallax cleaning unit that removes noise from the left reference parallax after processing by the first mixing unit; and a second parallax cleaning unit that removes noise from the right reference parallax after processing by the second mixing unit.
  • (2) The image processing apparatus according to (1) above, wherein the first and second occlusion signal processing units spatially stabilize the occlusion signals.
  • (3) The image processing apparatus according to (1) above, wherein the first and second occlusion signal processing units apply horizontal and vertical majority-vote filtering to scattered occlusion signals and then perform spread processing.
  • (4) The image processing apparatus according to (1) above, wherein the first and second occlusion signal processing units remove the indefinite state in which both the left and right occlusion signals indicate occlusion.
  • (5) The image processing apparatus according to (1) above, wherein the first mixing unit replaces the left reference initial parallax with the right reference projected parallax in portions where occlusion occurs in the left image, according to the occlusion signal processed by the first occlusion signal processing unit, and the second mixing unit replaces the right reference initial parallax with the left reference projected parallax in portions where occlusion occurs in the right image, according to the occlusion signal processed by the second occlusion signal processing unit.
  • (6) The image processing apparatus according to (1) above, wherein the first parallax cleaning unit compares the initial occlusion signal with the occlusion signal processed by the first occlusion signal processing unit, detects portions whose value has changed as isolated parallax noise in the left reference parallax, and fills them with non-isolated parallax values, and the second parallax cleaning unit compares the initial occlusion signal with the occlusion signal processed by the second occlusion signal processing unit, detects portions whose value has changed as isolated parallax noise in the right reference parallax, and fills them with non-isolated parallax values.
  • (7) The image processing apparatus according to (6) above, wherein the first parallax cleaning unit replaces foreground parallax values with background parallax values in the left reference parallax, based on the occlusion signal processed by the first occlusion signal processing unit, and the second parallax cleaning unit replaces foreground parallax values with background parallax values in the right reference parallax, based on the occlusion signal processed by the second occlusion signal processing unit.
  • (8) An image processing method comprising: a first projection step of projecting a left reference initial parallax onto a right reference initial parallax to obtain a left projected parallax and a left reference occlusion signal; a second projection step of projecting a right reference initial parallax onto a left reference initial parallax to obtain a right projected parallax and a right reference occlusion signal; a first occlusion signal processing step of processing the left reference occlusion signal; a second occlusion signal processing step of processing the right reference occlusion signal; a first mixing step of mixing the left reference parallax into the right reference projected parallax; a second mixing step of mixing the right reference parallax into the left reference projected parallax; a first parallax cleaning step of removing noise from the left reference parallax after processing in the first mixing step; and a second parallax cleaning step of removing noise from the right reference parallax after processing in the second mixing step.
  • (9) An image display apparatus comprising: an initial parallax generation unit that generates left-reference and right-reference initial parallaxes from the left image and the right image; a parallax correction unit that corrects the parallax values of occlusion areas included in the initial parallaxes; an interpolated image generation unit that generates, based on the corrected parallaxes, interpolated images that interpolate the original images; an image integration unit that integrates the original images and the interpolated images; and a display unit that displays the original images or the images integrated by the image integration unit, wherein the parallax correction unit includes a first projection unit that projects the left reference initial parallax onto the right reference initial parallax to obtain a left projected parallax and a left reference occlusion signal, a second projection unit that projects the right reference initial parallax onto the left reference initial parallax to obtain a right projected parallax and a right reference occlusion signal, a first occlusion signal processing unit that processes the left reference occlusion signal, a second occlusion signal processing unit that processes the right reference occlusion signal, a first mixing unit that mixes the left reference parallax into the right reference projected parallax, a second mixing unit that mixes the right reference parallax into the left reference projected parallax, a first parallax cleaning unit that removes noise from the left reference parallax after processing by the first mixing unit, and a second parallax cleaning unit that removes noise from the right reference parallax after processing by the second mixing unit.
  • By applying the technique disclosed in this specification to, for example, high-frame-rate or naked-eye three-dimensional image display, the boundary of the parallax image in the occlusion region is brought close to the object boundary, the halo phenomenon generated near the object contour is suppressed, and isolated noise portions in the parallax image can be suitably removed.
  • The image processing in the embodiment described in this specification can be performed by either hardware or software. When the processing is realized by software, a computer program describing the processing procedures in a computer-readable format may be installed and executed on a given computer.
  • DESCRIPTION OF SYMBOLS: 100 ... Image display apparatus, 110 ... Video display unit, 112 ... Liquid crystal panel (display panel), 113 ... Gate driver, 114 ... Data driver, 115 ... Light source, 120 ... Video signal processing unit, 140 ... Timing control unit, 201 ... Image input unit, 202 ... Interpolation frame generation unit, 203 ... Image integration unit, 301 ... Left image frame memory, 311 ... Right image frame memory, 302, 312 ... Initial parallax calculation unit, 303, 313 ... Parallax memory, 304 ... Parallax correction unit, 305 ... Frame generation unit (left reference), 315 ... Frame generation unit (right reference), 306 ... Image output unit, 401, 411 ... Projection unit, 402, 412 ... Projected parallax memory, 403, 413 ... Occlusion signal stabilization unit, 404, 414 ... Occlusion signal cleaning unit, 405, 415 ... Mixing unit, 406, 416 ... Parallax memory, 407, 417 ... Parallax cleaning unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The purpose of the invention is to correct a disparity value in an occlusion region included in an original disparity. An image processing device projects a left-based original disparity to a right-based original disparity to obtain a left projection disparity and a left-based occlusion signal, projects the right-based original disparity to the left-based original disparity to obtain a right projection disparity and a right-based occlusion signal, and performs stabilization and cleanup processes on each of the left-based and right-based occlusion signals. The image processing device also performs mixing processes in which a left-based disparity is mixed with the right-based projection disparity and a right-based disparity is mixed with the left-based projection disparity and then removes noise to obtain final left-based and right-based disparities.

Description

Image processing apparatus, image processing method, and image display apparatus
 The technology disclosed in this specification relates to an image processing apparatus, an image processing method, and an image display apparatus that process three-dimensional images, and in particular to an image processing apparatus, an image processing method, and an image display apparatus that correct parallax information in an occlusion area between a left image and a right image.
 Recently, three-dimensional display systems that display a left image and a right image in a multiplexed manner so that an observer perceives them stereoscopically have become widespread. The parallax between the left image and the right image corresponds to a coordinate value in three-dimensional space. Disparity information is required, for example, during frame-rate (double-speed) conversion and when generating interpolated images for naked-eye 3D multi-view display. To obtain the parallax, corresponding pixels between the left and right images are generally searched for, for example by a block matching method.
 Between images with different viewpoints, such as the left and right images, occlusion regions arise: areas that are visible in one image but hidden in the other. In an occlusion region there are no corresponding pixels between the images, so it is difficult to estimate the parallax accurately. Occlusion regions lie near boundaries of the disparity information where depth changes discontinuously, such as the boundary between an object and the background. However, disparity estimates near such boundaries are important information that forms the contour of the object in the interpolated image. Disparity estimation errors near these boundaries cause foreground pixels to stick to the background or, conversely, background pixels to stick to the foreground, disturb the object contour, and create false contours in the background near the object outline, thereby causing the halo phenomenon.
 For example, an image processing apparatus has been proposed in which an image reference region consisting of an occlusion region and its surroundings is divided into a plurality of clusters and disparity information is calculated per cluster: for a cluster of interest containing pixels with a detected correspondence, the average disparity within that cluster is used as the disparity of its occlusion pixels, and for a cluster of interest containing no pixels with a detected correspondence, the average disparity of the cluster with the closest feature amount within the image reference region is used, thereby bringing the disparity boundary closer to the actual object boundary (see, for example, Patent Document 1). However, since the disparity generated in an occlusion region cannot, in principle, be determined correctly, it tends to become non-uniform and noise-like in shape.
 There has also been proposed an image processing apparatus that sets, within the disparity map between the left and right images, a pair of search windows facing each other across the disparity contour (the boundary of the disparity information), determines, for example from the left image within the pair of windows, which window contains the occlusion region, and corrects the disparity of the occlusion region in that window based on the disparity in the other window (see, for example, Patent Document 2). However, since the parallaxes of the left image and the right image are not at the same positions, correct parallax cannot be generated, and the halo phenomenon may consequently become larger. Moreover, even when the disparity of the search window that does not contain the occlusion region is used, non-uniform portions still arise and may be perceived as a noise-like halo.
Patent Document 1: JP 2011-60216 A / Patent Document 2: JP 2011-60116 A
 An object of the technology disclosed in this specification is to provide an excellent image processing apparatus, image processing method, and image display apparatus that can suitably correct parallax information in an occlusion region between a left image and a right image.
 A further object of the technology disclosed in this specification is to provide an excellent image processing apparatus, image processing method, and image display apparatus that accurately estimate parallax information in the occlusion region between the left image and the right image and reduce the halo phenomenon that occurs in interpolated images generated based on that parallax information.
 The present application has been made in view of the above problems, and the technology according to claim 1 is an image processing apparatus comprising:
 a first projection unit that projects a left reference initial parallax onto a right reference initial parallax to obtain a left projected parallax and a left reference occlusion signal;
 a second projection unit that projects a right reference initial parallax onto a left reference initial parallax to obtain a right projected parallax and a right reference occlusion signal;
 a first occlusion signal processing unit that processes the left reference occlusion signal;
 a second occlusion signal processing unit that processes the right reference occlusion signal;
 a first mixing unit that mixes the left reference parallax into the right reference projected parallax;
 a second mixing unit that mixes the right reference parallax into the left reference projected parallax;
 a first parallax cleaning unit that removes noise from the left reference parallax after processing by the first mixing unit; and
 a second parallax cleaning unit that removes noise from the right reference parallax after processing by the second mixing unit.
 According to the technique described in claim 2 of the present application, the first and second occlusion signal processing units of the image processing apparatus according to claim 1 are configured to spatially stabilize the occlusion signals.
 According to the technique described in claim 3 of the present application, the first and second occlusion signal processing units of the image processing apparatus according to claim 1 are configured to apply horizontal and vertical majority-vote filtering to scattered occlusion signals and then perform spread processing.
 According to the technique described in claim 4 of the present application, the first and second occlusion signal processing units of the image processing apparatus according to claim 1 are configured to remove the indefinite state in which both the left and right occlusion signals indicate occlusion.
 According to the technique described in claim 5 of the present application, in the image processing apparatus according to claim 1, the first mixing unit replaces the left reference initial parallax with the right reference projected parallax in portions where occlusion occurs in the left image, according to the occlusion signal processed by the first occlusion signal processing unit, and the second mixing unit replaces the right reference initial parallax with the left reference projected parallax in portions where occlusion occurs in the right image, according to the occlusion signal processed by the second occlusion signal processing unit.
 According to the technique described in claim 6 of the present application, in the image processing apparatus according to claim 1, the first parallax cleaning unit compares the initial occlusion signal with the occlusion signal processed by the first occlusion signal processing unit, detects portions whose value has changed as isolated parallax noise in the left reference parallax, and fills them with non-isolated parallax values, and the second parallax cleaning unit compares the initial occlusion signal with the occlusion signal processed by the second occlusion signal processing unit, detects portions whose value has changed as isolated parallax noise in the right reference parallax, and fills them with non-isolated parallax values.
 According to the technique described in claim 7 of the present application, in the image processing apparatus according to claim 6, the first parallax cleaning unit replaces foreground parallax values with background parallax values in the left reference parallax, based on the occlusion signal processed by the first occlusion signal processing unit, and the second parallax cleaning unit replaces foreground parallax values with background parallax values in the right reference parallax, based on the occlusion signal processed by the second occlusion signal processing unit.
 The technology described in claim 8 of the present application is an image processing method comprising:
 a first projection step of projecting a left reference initial parallax onto a right reference initial parallax to obtain a left projected parallax and a left reference occlusion signal;
 a second projection step of projecting a right reference initial parallax onto a left reference initial parallax to obtain a right projected parallax and a right reference occlusion signal;
 a first occlusion signal processing step of processing the left reference occlusion signal;
 a second occlusion signal processing step of processing the right reference occlusion signal;
 a first mixing step of mixing the left reference parallax into the right reference projected parallax;
 a second mixing step of mixing the right reference parallax into the left reference projected parallax;
 a first parallax cleaning step of removing noise from the left reference parallax after processing in the first mixing step; and
 a second parallax cleaning step of removing noise from the right reference parallax after processing in the second mixing step.
 The technology described in claim 9 of the present application is an image display apparatus comprising:
 an initial parallax generation unit that generates left-reference and right-reference initial parallaxes from the left image and the right image, respectively;
 a parallax correction unit that corrects the parallax values of occlusion areas included in the initial parallaxes;
 an interpolated image generation unit that generates, based on the corrected parallaxes, interpolated images that interpolate the original images;
 an image integration unit that integrates the original images and the interpolated images; and
 a display unit that displays the original images or the images integrated by the image integration unit,
 wherein the parallax correction unit includes a first projection unit that projects the left reference initial parallax onto the right reference initial parallax to obtain a left projected parallax and a left reference occlusion signal, a second projection unit that projects the right reference initial parallax onto the left reference initial parallax to obtain a right projected parallax and a right reference occlusion signal, a first occlusion signal processing unit that processes the left reference occlusion signal, a second occlusion signal processing unit that processes the right reference occlusion signal, a first mixing unit that mixes the left reference parallax into the right reference projected parallax, a second mixing unit that mixes the right reference parallax into the left reference projected parallax, a first parallax cleaning unit that removes noise from the left reference parallax after processing by the first mixing unit, and a second parallax cleaning unit that removes noise from the right reference parallax after processing by the second mixing unit.
 According to the technology disclosed in this specification, it is possible to provide an excellent image processing apparatus, image processing method, and image display apparatus that accurately estimate parallax information in the occlusion region between the left and right images and reduce the halo phenomenon that appears in interpolated images generated based on that parallax information.
 Furthermore, by applying the technology disclosed in this specification to, for example, high-frame-rate or naked-eye three-dimensional image display, the parallax boundary of the occlusion area is brought suitably close to the object boundary, the halo phenomenon that arises near the object contour is suppressed, and isolated noise portions within the parallax can be suitably removed.
 Other objects, features, and advantages of the technology disclosed in this specification will become apparent from the more detailed description based on the embodiments described below and the accompanying drawings.
FIG. 1 is a diagram schematically illustrating the functional configuration of an image display apparatus 100 to which the technology disclosed in this specification can be applied.
FIG. 2 is a block diagram showing the functional configuration for increasing the frame rate or the number of viewpoints within the video signal processing unit 120.
FIG. 3 is a diagram illustrating the internal configuration of the interpolation frame generation unit 202.
FIG. 4 is a diagram illustrating an example of the internal configuration of the parallax correction unit 304.
FIG. 5 is a diagram showing an example of a method for performing projection between the initial parallaxes of the left and right images.
FIG. 6 is a diagram showing an example of stabilizing the left and right occlusion signals Occ_L and Occ_R.
FIG. 7 is a diagram illustrating how indefinite-state portions of the occlusion signal are removed.
FIG. 8 is a diagram illustrating an example of the left reference initial parallax OD_L and the left reference parallax D_L after mixing with the right reference projected parallax PD_R in the mixing unit 405.
FIG. 9A is a diagram illustrating the locations where the left reference initial parallax OD_L has been replaced with the right reference projected parallax PD_R in the mixed left reference parallax D_L.
FIG. 9B is a diagram illustrating how changed portions are detected by comparing the initial occlusion signal Occ_L with the processed occlusion signal Occ_L″.
FIG. 10 is a diagram illustrating the result of removing isolated noise from the left reference parallax D_L after the mixing process in the mixing unit 405.
FIG. 11 is a diagram illustrating how the foreground parallax value on the occlusion side is replaced with the background parallax value.
FIG. 12 is a diagram schematically illustrating how the left reference initial parallax OD_L and the right reference initial parallax OD_R are obtained from the left image and the right image.
 Hereinafter, embodiments of the technology disclosed in this specification will be described in detail with reference to the drawings.
 FIG. 1 schematically shows the functional configuration of an image display apparatus 100 to which the technology disclosed in this specification can be applied. The illustrated image display apparatus 100 receives a three-dimensional video signal composed of, for example, a left image and a right image, performs frame-rate conversion or multi-viewpoint processing as necessary, and displays the images of the multiple viewpoints on the screen, multiplexed spatially, for example. The viewer can then observe the image three-dimensionally either by wearing glasses for 3D video viewing or with the naked eye.
 The image display apparatus 100 includes a video display unit 110, a video signal processing unit 120, and a timing control unit 140.
 Upon receiving a video signal transmitted from an external device, the video signal processing unit 120 executes various kinds of signal processing so that the signal becomes suitable for video display on the video display unit 110, and outputs the result. Examples of the "external device" serving as the transmission source of the video signal include a digital broadcast receiver and a content playback device such as a Blu-ray Disc player.
 Within the video signal processing unit 120, image quality correction processing such as sharpness enhancement and contrast improvement is performed. In the present embodiment, the video signal processing unit 120 also generates interpolation frames that interpolate viewpoint images for multi-viewpoint display. The processing related to the generation of the interpolation frames will be described later.
 The timing control unit 140 receives the video signal processed by the video signal processing unit 120. The timing control unit 140 converts the input left image signal and right image signal into signals to be input to the video display unit 110, and generates the pulse signals used for the operation of the panel drive circuit composed of the gate driver 113 and the data driver 114.
 The video display unit 110 displays video corresponding to signals applied from outside. The video display unit 110 includes a display panel 112, a gate driver 113, a data driver 114, and a light source 115.
 The gate driver 113 is a drive circuit that generates signals for sequential driving, and outputs drive voltages to the gate bus lines connected to the pixels in the display panel 112 according to the signals transmitted from the timing control unit 140. The data driver 114 is a drive circuit that outputs drive voltages based on the video signal, and generates and outputs the signals applied to the data lines based on the signals transmitted from the timing control unit 140 and the video signal output from the video signal processing unit 120.
 The display panel 112 has a plurality of pixels arranged, for example, in a grid. In the case of a liquid crystal display panel, liquid crystal molecules having a predetermined alignment state are sealed between transparent plates such as glass, and an image is displayed according to the signals applied from outside. As described above, the application of signals to the display panel 112 is performed by the gate driver 113 and the data driver 114.
 The light source 115 is a backlight provided at the rearmost position of the video display unit 110 as viewed from the viewer side. When an image is displayed on the video display unit 110, unpolarized white light is emitted from the light source 115 toward the display panel 112 located on the viewer side.
 In this specification, an embodiment using a liquid crystal display as the video display unit 110 is described, but the gist of the technology disclosed in this specification is not limited to this. The present technology can similarly be applied to other displays, such as OLED (Organic Light Emitting Diode) or LED (Light Emitting Diode) displays, in which one pixel is formed by cells of a plurality of color components and a plurality of pixels are arranged sequentially in the horizontal and vertical directions.
 FIG. 2 shows a functional block diagram for performing the processing for frame-rate conversion or multi-viewpoint generation within the video signal processing unit 120.
 The image input unit 201 receives a video signal consisting of a time series of image frames. In the present embodiment, it is assumed that a three-dimensional image signal consisting of a left image and a right image is input.
 The interpolation frame generation unit 202 generates, from the input image frames, interpolation frames for multi-viewpoint display.
 The image integration unit 203 inserts the interpolation frames generated by the interpolation frame generation unit 202 into the original image frames to generate a multi-viewpoint image signal.
 FIG. 3 shows the internal configuration of the interpolation frame generation unit 202.
 The left-image frame memory 301 and the right-image frame memory 311 store the left image and the right image of the input three-dimensional image signal, respectively.
 When the initial parallax calculation unit 302 reads the temporally corresponding left and right images from the left-image frame memory 301 and the right-image frame memory 311, it calculates an initial parallax referenced to the left image (Original Disparity L: OD_L) by a block matching method and writes it into the parallax memory 303. Similarly, the initial parallax calculation unit 312 calculates an initial parallax referenced to the right image (Original Disparity R: OD_R) by the block matching method and writes it into the parallax memory 313.
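 For illustration only, the block-matching step can be sketched as follows. This is a minimal sketch, not the implementation described in this specification; the block size, the search range, and the use of a sum-of-absolute-differences (SAD) cost are assumptions made here for clarity.

```python
import numpy as np

def block_matching_disparity(ref, tgt, block=8, max_disp=64):
    """Estimate a dense disparity map for `ref` by block matching against `tgt`.

    ref, tgt: 2-D grayscale arrays of the same shape (e.g. left and right images).
    Each block of `ref` is compared with horizontally shifted blocks of `tgt`,
    and the shift with the lowest SAD cost is taken as the block's disparity.
    """
    h, w = ref.shape
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = ref[y:y + block, x:x + block].astype(np.float32)
            best_d, best_cost = 0, np.inf
            for d in range(-max_disp, max_disp + 1):
                xs = x + d
                if xs < 0 or xs + block > w:
                    continue
                cand = tgt[y:y + block, xs:xs + block].astype(np.float32)
                cost = np.abs(patch - cand).sum()          # SAD matching cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y:y + block, x:x + block] = best_d        # one value per block
    return disp

# Usage sketch: OD_L = block_matching_disparity(left, right)
#               OD_R = block_matching_disparity(right, left)
```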
 FIG. 12 schematically shows how the left-reference initial parallax OD_L and the right-reference initial parallax OD_R are obtained from the left image and the right image. In general, as illustrated, occlusion appears in the left-reference initial parallax OD_L along the left-hand contour of an object, and in the right-reference initial parallax OD_R along the right-hand contour of the object.
 Next, the parallax correction unit 304 performs correction processing on the initial parallax calculated with the left reference and on the initial parallax calculated with the right reference. Between the left and right images there are regions where parallax estimation errors occur because of occlusion. That is, in an occlusion area, pixels of the foreground region stick to the background or, conversely, pixels of the background region stick to the foreground, the contour of the object is disturbed, and false contours appear in the background region near the object contour, all of which cause the halo phenomenon in the interpolated image. In the present embodiment, the parallax correction unit 304 accurately estimates the parallax information in the occlusions included in the left and right initial parallaxes, so as to reduce the halo phenomenon arising in the interpolated image generated from the parallax information. The details of the parallax correction processing will be described later.
 The frame generation unit 305 then shifts the left image frame read from the left-image frame memory 301 horizontally, pixel by pixel, based on the corrected left-reference parallax to generate an interpolation frame for the left image. Likewise, the frame generation unit 315 shifts the right image frame read from the right-image frame memory 311 horizontally, pixel by pixel, based on the corrected right-reference parallax to generate an interpolation frame.
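 A minimal sketch of this per-pixel horizontal shift is shown below. The interpolation weight `alpha` and the nearest-pixel forward warp are assumptions for illustration; this specification does not state how intermediate viewpoint positions or warping conflicts are handled.

```python
import numpy as np

def generate_interpolation_frame(image, disparity, alpha=0.5):
    """Warp one view toward an intermediate viewpoint by shifting pixels horizontally.

    image:     (H, W) single-channel frame of one view.
    disparity: (H, W) corrected per-pixel disparity for that view.
    alpha:     fraction of the disparity to apply (position of the virtual view).
    Returns the warped frame and a mask of pixels that received a value.
    """
    h, w = disparity.shape
    out = np.zeros_like(image)
    filled = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            xs = int(round(x + alpha * disparity[y, x]))
            if 0 <= xs < w:
                out[y, xs] = image[y, x]
                filled[y, xs] = True
    return out, filled       # holes (filled == False) would still need in-painting
```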
 Thereafter, the interpolation frames of the left image and the right image are sequentially output from the image output unit 306 to the image integration unit 203.
 Next, the correction processing performed by the parallax correction unit 304 on the left-reference initial parallax and the right-reference initial parallax will be described. FIG. 4 shows an example of the internal configuration of the parallax correction unit 304.
 When the projection unit 401 reads the left-reference initial parallax OD_L from the parallax memory 303 and the right-reference initial parallax OD_R from the parallax memory 313, it projects the left-reference initial parallax OD_L onto the right-reference initial parallax OD_R and writes the left-reference projected parallax (Projected Disparity L: PD_L) into the projected parallax memory 402.
 Here, projection is possible if the difference between the initial parallaxes OD_L and OD_R referenced to the left and right images is equal to or less than a predetermined threshold, but no projection is performed if the difference exceeds the threshold. The places that are not projected are occlusion areas, and the projection unit 401 outputs a left-reference occlusion signal (Occlusion L: Occ_L). In the present embodiment, the occlusion signal is a binary signal that is 0 in occlusion areas and 1 in the other areas.
 Similarly, when the projection unit 411 reads the left-reference initial parallax from the parallax memory 303 and the right-reference initial parallax from the parallax memory 313, it projects the right-reference initial parallax onto the left-reference initial parallax and writes the right-reference projected parallax (Projected Disparity R: PD_R) into the projected parallax memory 412. The projection unit 411 also outputs a right-reference occlusion signal (Occlusion R: Occ_R) indicating the places that are not projected.
 FIG. 5 illustrates an example of a method of performing projection between the initial parallaxes referenced to the left and right images. As illustrated, a pixel position of one initial parallax is projected onto a pixel position of the other initial parallax according to its disparity value (Disparity Value: DV).
 For example, the disparity value DV_R→L obtained when the right-reference initial parallax at a certain pixel position is projected onto the left reference is estimated (A: Estimate DV_R→L). Next, it is checked which disparity value DV_R→L′ the value DV_R→L is pointing to at the position shifted horizontally by DV_R→L from the position X, that is, DV_R→L′ = DV_R→L(X + DV_R→L) (B: Checking where DV_R→L is pointing). It is then checked whether the forward and backward parallax estimates agree, that is, whether the sum of the disparity values of the forward and backward projections, (DV_R→L′ + DV_L→R), is within a threshold TH (C: How similar are the FWD/BWD estimates? (DV_R→L′ + DV_L→R) < TH). If these conditions are satisfied, the parallax is projected; if they are not satisfied, no projection is performed and the pixel is judged to be an occlusion.
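 The essence of this forward/backward check can be sketched as below. The sign convention (x_dst = x_src + disparity), the grid on which the occlusion flag is recorded, and the treatment of pixels that fall outside the image are assumptions made for this illustration only.

```python
import numpy as np

def project_disparity(od_src, od_dst, th=1.0):
    """Project od_src (disparity referenced to the source view) onto the grid of
    the destination view, gated by a forward/backward consistency check.

    Assumed convention: x_dst = x_src + od_src(x_src) and x_src = x_dst + od_dst(x_dst),
    so a consistent correspondence satisfies |od_src + od_dst| < th.
    Returns:
      pd  -- the source disparity resampled on the destination grid
             (sign flipped to the destination's convention),
      occ -- occlusion signal on the source grid: 0 where the check failed
             (occlusion), 1 elsewhere, matching the 0/1 convention in the text.
    """
    h, w = od_src.shape
    pd = np.zeros_like(od_dst)
    occ = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x_s in range(w):
            d_s = od_src[y, x_s]
            x_d = int(round(x_s + d_s))                    # landing point in the destination
            if 0 <= x_d < w and abs(d_s + od_dst[y, x_d]) < th:
                pd[y, x_d] = -d_s                          # projection succeeds
                occ[y, x_s] = 1                            # source pixel is not occluded
    return pd, occ

# Usage sketch: PD_L, Occ_L = project_disparity(OD_L, OD_R)
#               PD_R, Occ_R = project_disparity(OD_R, OD_L)
```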
 The occlusion signal stabilization unit 403 processes the left-reference occlusion signal calculated by the projection unit 401, which is not yet spatially stable, and stabilizes it spatially on the premise that large occlusion regions are more stable than fine ones. Similarly, the occlusion signal stabilization unit 413 processes the right-reference occlusion signal from the projection unit 411 and stabilizes it spatially.
 Spatial stabilization of the occlusion signal is processing that removes occlusion regions consisting of fine areas or thin lines. For example, for a pixel of interest whose occlusion signal is 1, if a pixel whose occlusion signal is 0 is detected within a range of ±N surrounding pixels (where N is a positive integer, for example N = 1), the occlusion signal of the pixel of interest is also changed from 1 to 0. Concretely, this processing applies a horizontal median filter (MED_H) and a vertical median filter (MED_V) to each of the occlusion signals Occ_L and Occ_R, takes a majority vote on the occlusion signal of the pixel of interest in the horizontal and vertical directions, and replaces scattered occlusion signals with the median value of the surrounding pixels. The portions eroded by the median filtering are widened again by a spread unit (Spread), yielding the stabilized left-reference occlusion signal Occ_L′ and the stabilized right-reference occlusion signal Occ_R′. FIG. 6 shows an example of the stabilization processing of the left and right occlusion signals Occ_L and Occ_R by the occlusion signal stabilization units 403 and 413. In the figure, the occlusion areas are shown in black.
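 A minimal sketch of this stabilization, using SciPy's separable median filters followed by a spread (dilation of the occlusion regions), is shown below. The kernel sizes are assumptions; the text only states that horizontal and vertical majority filtering is followed by a spread step.

```python
import numpy as np
from scipy.ndimage import median_filter, minimum_filter

def stabilize_occlusion(occ, k=3):
    """Spatially stabilize a binary occlusion signal (0 = occlusion, 1 = otherwise).

    A horizontal and then a vertical median filter act as majority votes that
    remove isolated dots and thin lines; the spread step re-grows the occlusion
    regions (zeros) that the filtering eroded.
    """
    occ = occ.astype(np.uint8)
    out = median_filter(occ, size=(1, k))        # horizontal median (MED_H)
    out = median_filter(out, size=(k, 1))        # vertical median (MED_V)
    out = minimum_filter(out, size=(k, k))       # spread: dilate the 0 (occlusion) regions
    return out

# Usage sketch: Occ_L_prime = stabilize_occlusion(Occ_L)
#               Occ_R_prime = stabilize_occlusion(Occ_R)
```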
 Here, when the left and right occlusion signals, which are in binary form, are combined, four combinations of values can occur: (Occ_L′, Occ_R′) = (0, 0), (0, 1), (1, 0), (1, 1). Of these, (0, 1) corresponds to an occlusion area of the left image, (1, 0) to an occlusion area of the right image, and (1, 1) to an area that is not occluded in either image. In contrast, (0, 0) would mean that the same place is occluded in both the left and right images, an indefinite state (Unknown Status) that cannot occur in reality. The occlusion signal cleaning unit 404 therefore performs processing for removing, from the left-reference occlusion signal Occ_L′, the indefinite state in which both images are occluded. Likewise, the occlusion signal cleaning unit 414 performs processing for removing the indefinite state from the right-reference occlusion signal Occ_R′.
 When cleaning the occlusion signals, the left and right occlusion signals are treated as the values (0, 0) → 0, (0, 1) → 1, (1, 0) → 2, and (1, 1) → 3. FIG. 7 shows how the indefinite-state portions of the occlusion signal are removed. In the figure, the portions where the combined occlusion signal is 0, that is, in the indefinite state, are shown in black.
 The occlusion signal cleaning unit 404 and the occlusion signal cleaning unit 414 each hold the value of the occlusion signal Occ_L′ or Occ_R′ at pixels that are not in the indefinite state and scan the image from left to right while comparing it with the value of the pixel above. When a pixel in the indefinite state (painted black in FIG. 7) is reached, the held value is written into that pixel if it is the same as the value of the pixel above; if the values differ, the original value of the indefinite-state pixel is kept. Next, again holding the value of the occlusion signal Occ_L′ or Occ_R′ at pixels that are not in the indefinite state and comparing it with the value of the pixel above, the image is scanned from right to left, and when an indefinite-state pixel is reached, the held value is written into it if it is the same as the value of the pixel above; otherwise the original value is kept. Finally, the occlusion signal Occ_L′ or Occ_R′ is examined in the vertical direction, and when an isolated occlusion value extending over only one horizontal line is found, it is filled with a neighboring value, thereby removing isolated indefinite-state lines. The occlusion signals cleaned by the occlusion signal cleaning unit 404 and the occlusion signal cleaning unit 414 are denoted Occ_L″ and Occ_R″, respectively.
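 A sketch of the horizontal cleaning passes is given below; the vertical pass that removes isolated one-line values is analogous and omitted. The handling of the first row and the choice to operate on the combined 0-3 code rather than on each signal separately are assumptions made for this illustration.

```python
import numpy as np

def clean_occlusion(occ_l, occ_r):
    """Remove the physically impossible 'both occluded' state from the occlusion signals.

    occ_l, occ_r: binary occlusion signals (0 = occlusion, 1 = otherwise).
    The pair is encoded as (0,0)->0, (0,1)->1, (1,0)->2, (1,1)->3; code 0
    ('unknown status') cannot occur in reality and is filled from context.
    """
    code = (occ_l.astype(np.uint8) << 1) | occ_r.astype(np.uint8)
    h, w = code.shape
    for rightward in (True, False):                    # left-to-right, then right-to-left
        xs = range(w) if rightward else range(w - 1, -1, -1)
        for y in range(1, h):                          # compare against the pixel above
            held = None
            for x in xs:
                if code[y, x] != 0:
                    held = code[y, x]                  # remember the last known value
                elif held is not None and held == code[y - 1, x]:
                    code[y, x] = held                  # fill only when it matches the value above
    return ((code >> 1) & 1).astype(np.uint8), (code & 1).astype(np.uint8)

# Usage sketch: Occ_L_pp, Occ_R_pp = clean_occlusion(Occ_L_prime, Occ_R_prime)
```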
 Next, the mixing unit 405 mixes the left-reference parallax with the right-reference projected parallax to obtain a left-reference parallax (Disparity L: D_L) with a clean contour. Specifically, the mixing unit 405 receives the processed left-reference occlusion signal Occ_L″ from the occlusion signal cleaning unit 404, the left-reference initial parallax OD_L from the parallax memory 303, and the right-reference projected parallax PD_R from the projected parallax memory 412. Then, for the portions where occlusion occurs in the left image according to the occlusion signal Occ_L″, it replaces the left-reference initial parallax OD_L with the right-reference projected parallax PD_R and writes the result into the left-reference parallax memory 406 as the left-reference parallax D_L. FIG. 8 shows an example of the left-reference initial parallax OD_L and of the left-reference parallax D_L after being mixed with the right-reference projected parallax PD_R in the mixing unit 405.
 Similarly, the mixing unit 415 mixes the right-reference parallax with the left-reference projected parallax to obtain a right-reference parallax (Disparity R: D_R) with a clean contour. Specifically, the mixing unit 415 receives the processed right-reference occlusion signal Occ_R″ from the occlusion signal cleaning unit 414, the right-reference initial parallax OD_R from the parallax memory 313, and the left-reference projected parallax PD_L from the projected parallax memory 402. Then, for the portions where occlusion occurs in the right image according to the occlusion signal Occ_R″, it replaces the right-reference initial parallax OD_R with the left-reference projected parallax PD_L and writes the result into the right-reference parallax memory 416 as the right-reference parallax D_R.
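 The mixing step itself reduces to a per-pixel selection, as in the minimal sketch below (shown generically; the left- and right-reference cases are symmetric).

```python
import numpy as np

def mix_disparity(od_own, pd_other, occ_own):
    """Blend the other view's projected disparity into this view's occlusion areas.

    od_own:   initial disparity referenced to this view (e.g. OD_L).
    pd_other: the other view's disparity projected onto this view's grid (e.g. PD_R).
    occ_own:  cleaned occlusion signal for this view (0 = occlusion, 1 = otherwise).
    """
    return np.where(occ_own == 0, pd_other, od_own)    # mixed disparity, e.g. D_L

# Usage sketch: D_L = mix_disparity(OD_L, PD_R, Occ_L_pp)
#               D_R = mix_disparity(OD_R, PD_L, Occ_R_pp)
```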
 Replacing the left-reference initial parallax OD_L with the right-reference projected parallax PD_R in the portions where occlusion occurs in the left image, and replacing the right-reference initial parallax OD_R with the left-reference projected parallax PD_L in the portions where occlusion occurs in the right image, exploits the property that the parallax contour of the image on the side where no occlusion occurs is generally obtained more cleanly than the parallax contour of the image on the side where the occlusion occurs.
 However, by performing the above processing of replacing the parallax on the occluded side with the projected parallax of the other, non-occluded side, there is a concern that the parallax boundary on the occluded side moves away from the actual object boundary, widening the extent over which the halo phenomenon occurs. Moreover, even when the parallax of the non-occluded side is used, isolated noise-like points still occur.
 Therefore, the parallax cleaning units 407 and 417 at the final stage perform processing for reducing halos on the left and right parallax images D_L and D_R written in the parallax memories 406 and 416.
 The parallax cleaning unit 407 compares the initial occlusion signal Occ_L with the processed occlusion signal Occ_L″ and detects the portions whose values have changed as isolated parallax noise. FIG. 9A illustrates the places in the mixed left-reference parallax D_L where the left-reference initial parallax OD_L has been replaced with the right-reference projected parallax PD_R, and FIG. 9B illustrates how the changed portions are detected by comparing the initial occlusion signal Occ_L with the processed occlusion signal Occ_L″.
 When the parallax cleaning unit 407 detects an isolated parallax, it performs processing to fill it with a non-isolated parallax value. For example, it scans the left-reference parallax D_L from left to right while holding the latest non-isolated parallax value, and when it reaches an isolated parallax, it replaces it with the non-isolated value. FIG. 10 illustrates the result of removing isolated noise from the left-reference parallax D_L after the mixing processing in the mixing unit 405.
 At the same time as the above processing for removing isolated parallax noise, the parallax cleaning unit 407 performs, on the mixed parallax image D_L written in the parallax memory 406, processing for replacing foreground parallax values with background parallax values based on the processed occlusion signal Occ_L″, thereby bringing the occlusion-side parallax boundary, which the mixing processing moved away from the actual object boundary, back toward the object boundary. That is, when the parallax cleaning unit 407 detects, based on the processed occlusion signal Occ_L″, a region of the left-reference parallax D_L in which occlusion occurs, it scans the left-reference parallax D_L from left to right, replaces the parallax value on the right side (foreground) with the parallax value on the left side (background) (copy from left to right), and outputs the final left-reference parallax D_L′. FIG. 11 shows how the foreground parallax value on the occlusion side is replaced with the background parallax value. As the replacement condition, the distance (Length) by which the parallax boundary is brought toward the object boundary is controlled by the difference (B − A) between the parallax value B on the right side (that is, the foreground) before replacement and the parallax value A on the left side (that is, the background) used for the replacement. In this way, the parallax boundary that was widened toward the occlusion side by the mixing unit 405 can be pulled back, so that the extent of the halo shrinks cleanly without overshooting.
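 One plausible reading of these two cleaning operations, for the left-reference case, is sketched below; the right-reference case mirrors it with a right-to-left scan. The proportionality factor `gain` relating the (B − A) difference to the replacement length, and the assumption that the foreground carries the larger disparity value, are illustrative assumptions rather than details given in the text.

```python
import numpy as np

def clean_disparity_left(d_l, occ_init, occ_clean, gain=1.0):
    """Final cleaning of the mixed left-reference disparity D_L.

    1. Pixels whose occlusion flag changed between occ_init (Occ_L) and
       occ_clean (Occ_L'') are treated as isolated parallax noise and filled
       with the last non-isolated value seen while scanning left to right.
    2. Inside occlusion regions (occ_clean == 0), a foreground value B on the
       right is pulled back to the background value A on its left over a
       distance proportional to (B - A).
    """
    d = d_l.astype(np.float32).copy()
    h, w = d.shape
    noise = (occ_init != occ_clean)
    for y in range(h):
        last = d[y, 0]
        for x in range(w):                              # 1. fill isolated noise
            if noise[y, x]:
                d[y, x] = last
            else:
                last = d[y, x]
        x = 0
        while x < w - 1:                                # 2. copy from left to right
            if occ_clean[y, x] == 0 and occ_clean[y, x + 1] == 0:
                a, b = d[y, x], d[y, x + 1]             # background (left), foreground (right)
                if b > a:
                    length = int(gain * (b - a))        # distance controlled by (B - A)
                    end = min(x + 1 + length, w)
                    d[y, x + 1:end] = a
                    x = end
                    continue
            x += 1
    return d                                            # final left-reference disparity D_L'
```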
 Likewise, when the parallax cleaning unit 417 compares the initial occlusion signal Occ_R with the processed occlusion signal Occ_R″ and detects the portions whose values have changed as isolated parallax noise, it fills them with non-isolated parallax values. For example, it scans the right-reference parallax D_R from right to left while holding the latest non-isolated parallax value, and when it reaches an isolated parallax, it replaces it with the non-isolated value.
 At the same time as the above processing for removing isolated parallax noise, the parallax cleaning unit 417 performs, on the mixed parallax image D_R written in the parallax memory 416, processing for replacing foreground parallax values with background parallax values based on the processed occlusion signal Occ_R″, thereby bringing the occlusion-side parallax boundary, which the mixing processing moved away from the actual object boundary, back toward the object boundary. That is, when the parallax cleaning unit 417 detects, based on the processed occlusion signal Occ_R″, a region of the right-reference parallax D_R in which occlusion occurs, it scans the right-reference parallax D_R from right to left, replaces the parallax value on the left side (foreground) with the parallax value on the right side (background), and outputs the final right-reference parallax D_R′.
 As described above, according to the present embodiment, the boundary of the parallax image in the occlusion area can be brought suitably close to the object boundary, suppressing the halo phenomenon that arises near the object contour, and isolated noise portions within the parallax image can be favorably removed.
 Note that the technology disclosed in this specification can also be configured as follows.
(1) An image processing apparatus including: a first projection unit that projects a left-reference initial parallax onto a right-reference initial parallax to obtain a left projected parallax and a left-reference occlusion signal; a second projection unit that projects a right-reference initial parallax onto a left-reference initial parallax to obtain a right projected parallax and a right-reference occlusion signal; a first occlusion signal processing unit that processes the left-reference occlusion signal; a second occlusion signal processing unit that processes the right-reference occlusion signal; a first mixing unit that mixes the left-reference parallax with the right-reference projected parallax; a second mixing unit that mixes the right-reference parallax with the left-reference projected parallax; a first parallax cleaning unit that removes noise from the left-reference parallax processed by the first mixing unit; and a second parallax cleaning unit that removes noise from the right-reference parallax processed by the second mixing unit.
(2) The image processing apparatus according to (1) above, wherein the first and second occlusion signal processing units spatially stabilize the occlusion signals.
(3) The image processing apparatus according to (1) above, wherein the first and second occlusion signal processing units apply horizontal and vertical majority-vote filters to scattered occlusion signals and then perform spread processing.
(4) The image processing apparatus according to claim 1, wherein the first and second occlusion signal processing units remove the indefinite state in which both the left and right occlusion signals indicate occlusion.
(5) The image processing apparatus according to (1) above, wherein the first mixing unit replaces the left-reference initial parallax with the right-reference projected parallax in portions where occlusion occurs in the left image according to the occlusion signal processed by the first occlusion signal processing unit, and the second mixing unit replaces the right-reference initial parallax with the left-reference projected parallax in portions where occlusion occurs in the right image according to the occlusion signal processed by the second occlusion signal processing unit.
(6) The image processing apparatus according to (1) above, wherein the first parallax cleaning unit compares the initial occlusion signal with the occlusion signal processed by the first occlusion signal processing unit, detects the portions whose values have changed as isolated parallax noise in the left-reference parallax, and fills them with non-isolated parallax values, and the second parallax cleaning unit compares the initial occlusion signal with the occlusion signal processed by the second occlusion signal processing unit, detects the portions whose values have changed as isolated parallax noise in the right-reference parallax, and fills them with non-isolated parallax values.
(7) The image processing apparatus according to (6) above, wherein the first parallax cleaning unit performs, on the left-reference parallax, processing for replacing foreground parallax values with background parallax values based on the occlusion signal processed by the first occlusion signal processing unit, and the second parallax cleaning unit performs, on the right-reference parallax, processing for replacing foreground parallax values with background parallax values based on the occlusion signal processed by the second occlusion signal processing unit.
(8) An image processing method including: a first projection step of projecting a left-reference initial parallax onto a right-reference initial parallax to obtain a left projected parallax and a left-reference occlusion signal; a second projection step of projecting a right-reference initial parallax onto a left-reference initial parallax to obtain a right projected parallax and a right-reference occlusion signal; a first occlusion signal processing step of processing the left-reference occlusion signal; a second occlusion signal processing step of processing the right-reference occlusion signal; a first mixing step of mixing the left-reference parallax with the right-reference projected parallax; a second mixing step of mixing the right-reference parallax with the left-reference projected parallax; a first parallax cleaning step of removing noise from the left-reference parallax processed in the first mixing step; and a second parallax cleaning step of removing noise from the right-reference parallax processed in the second mixing step.
(9) An image display apparatus including: an initial parallax generation unit that generates a left-reference initial parallax and a right-reference initial parallax from a left image and a right image, respectively; a parallax correction unit that corrects the parallax values of occlusion areas included in the initial parallaxes; an interpolation image generation unit that generates, based on the corrected parallax, interpolation images that interpolate the original images; an image integration unit that integrates the original images and the interpolation images; and a display unit that displays the original images or the images integrated by the image integration unit, wherein the parallax correction unit includes a first projection unit that projects the left-reference initial parallax onto the right-reference initial parallax to obtain a left projected parallax and a left-reference occlusion signal, a second projection unit that projects the right-reference initial parallax onto the left-reference initial parallax to obtain a right projected parallax and a right-reference occlusion signal, a first occlusion signal processing unit that processes the left-reference occlusion signal, a second occlusion signal processing unit that processes the right-reference occlusion signal, a first mixing unit that mixes the left-reference parallax with the right-reference projected parallax, a second mixing unit that mixes the right-reference parallax with the left-reference projected parallax, a first parallax cleaning unit that removes noise from the left-reference parallax processed by the first mixing unit, and a second parallax cleaning unit that removes noise from the right-reference parallax processed by the second mixing unit.
 The technology disclosed in this specification has been described in detail above with reference to specific embodiments. However, it is obvious that those skilled in the art can modify or substitute the embodiments without departing from the gist of the technology disclosed in this specification.
 By applying the technology disclosed in this specification, for example, to frame-rate conversion or to naked-eye three-dimensional image display, the boundary of the parallax image in the occlusion area can be brought suitably close to the object boundary, suppressing the halo phenomenon that arises near the object contour, and isolated noise portions within the parallax image can be favorably removed.
 The image processing in the embodiments described in this specification can be performed by either hardware or software. When the processing is realized by software, a computer program describing the processing procedure of the software in a computer-readable format may be installed on and executed by a predetermined computer.
 In short, the present technology has been disclosed in the form of examples, and the contents of this specification should not be interpreted restrictively. To determine the gist of the present technology, the claims should be taken into consideration.
 DESCRIPTION OF SYMBOLS
 100…Display device
 110…Video display unit
 112…Liquid crystal panel
 113…Gate driver, 114…Data driver
 115…Light source
 120…Video signal processing unit
 140…Timing control unit
 201…Image input unit, 202…Interpolation frame generation unit, 203…Image integration unit
 301…Left-image frame memory, 311…Right-image frame memory
 302…Initial parallax calculation unit (left reference), 312…Initial parallax calculation unit (right reference)
 303…Parallax memory (left reference), 313…Parallax memory (right reference)
 304…Parallax correction unit
 305…Frame generation unit (left reference), 315…Frame generation unit (right reference)
 306…Image output unit
 401, 411…Projection unit
 402, 412…Projected parallax memory
 403, 413…Occlusion signal stabilization unit
 404, 414…Occlusion signal cleaning unit
 405, 415…Mixing unit
 406, 416…Parallax memory
 407, 417…Parallax cleaning unit

Claims (9)

  1.  An image processing apparatus comprising:
     a first projection unit that projects a left-reference initial parallax onto a right-reference initial parallax to obtain a left projected parallax and a left-reference occlusion signal;
     a second projection unit that projects a right-reference initial parallax onto a left-reference initial parallax to obtain a right projected parallax and a right-reference occlusion signal;
     a first occlusion signal processing unit that processes the left-reference occlusion signal;
     a second occlusion signal processing unit that processes the right-reference occlusion signal;
     a first mixing unit that mixes the left-reference parallax with the right-reference projected parallax;
     a second mixing unit that mixes the right-reference parallax with the left-reference projected parallax;
     a first parallax cleaning unit that removes noise from the left-reference parallax processed by the first mixing unit; and
     a second parallax cleaning unit that removes noise from the right-reference parallax processed by the second mixing unit.
  2.  The image processing apparatus according to claim 1, wherein the first and second occlusion signal processing units spatially stabilize the occlusion signals.
  3.  The image processing apparatus according to claim 1, wherein the first and second occlusion signal processing units apply horizontal and vertical majority-vote filters to scattered occlusion signals and then perform spread processing.
  4.  The image processing apparatus according to claim 1, wherein the first and second occlusion signal processing units remove the indefinite state in which both the left and right occlusion signals indicate occlusion.
  5.  The image processing apparatus according to claim 1, wherein the first mixing unit replaces the left-reference initial parallax with the right-reference projected parallax in portions where occlusion occurs in the left image according to the occlusion signal processed by the first occlusion signal processing unit, and
     the second mixing unit replaces the right-reference initial parallax with the left-reference projected parallax in portions where occlusion occurs in the right image according to the occlusion signal processed by the second occlusion signal processing unit.
  6.  The image processing apparatus according to claim 1, wherein the first parallax cleaning unit compares the initial occlusion signal with the occlusion signal processed by the first occlusion signal processing unit, detects the portions whose values have changed as isolated parallax noise in the left-reference parallax, and fills them with non-isolated parallax values, and
     the second parallax cleaning unit compares the initial occlusion signal with the occlusion signal processed by the second occlusion signal processing unit, detects the portions whose values have changed as isolated parallax noise in the right-reference parallax, and fills them with non-isolated parallax values.
  7.  The image processing apparatus according to claim 6, wherein the first parallax cleaning unit performs, on the left-reference parallax, processing for replacing foreground parallax values with background parallax values based on the occlusion signal processed by the first occlusion signal processing unit, and
     the second parallax cleaning unit performs, on the right-reference parallax, processing for replacing foreground parallax values with background parallax values based on the occlusion signal processed by the second occlusion signal processing unit.
  8.  An image processing method comprising:
     a first projection step of projecting a left-reference initial parallax onto a right-reference initial parallax to obtain a left projected parallax and a left-reference occlusion signal;
     a second projection step of projecting a right-reference initial parallax onto a left-reference initial parallax to obtain a right projected parallax and a right-reference occlusion signal;
     a first occlusion signal processing step of processing the left-reference occlusion signal;
     a second occlusion signal processing step of processing the right-reference occlusion signal;
     a first mixing step of mixing the left-reference parallax with the right-reference projected parallax;
     a second mixing step of mixing the right-reference parallax with the left-reference projected parallax;
     a first parallax cleaning step of removing noise from the left-reference parallax processed in the first mixing step; and
     a second parallax cleaning step of removing noise from the right-reference parallax processed in the second mixing step.
  9.  An image display apparatus comprising:
     an initial parallax generation unit that generates a left-reference initial parallax and a right-reference initial parallax from a left image and a right image, respectively;
     a parallax correction unit that corrects the parallax values of occlusion areas included in the initial parallaxes;
     an interpolation image generation unit that generates, based on the corrected parallax, interpolation images that interpolate the original images;
     an image integration unit that integrates the original images and the interpolation images; and
     a display unit that displays the original images or the images integrated by the image integration unit,
     wherein the parallax correction unit includes a first projection unit that projects the left-reference initial parallax onto the right-reference initial parallax to obtain a left projected parallax and a left-reference occlusion signal, a second projection unit that projects the right-reference initial parallax onto the left-reference initial parallax to obtain a right projected parallax and a right-reference occlusion signal, a first occlusion signal processing unit that processes the left-reference occlusion signal, a second occlusion signal processing unit that processes the right-reference occlusion signal, a first mixing unit that mixes the left-reference parallax with the right-reference projected parallax, a second mixing unit that mixes the right-reference parallax with the left-reference projected parallax, a first parallax cleaning unit that removes noise from the left-reference parallax processed by the first mixing unit, and a second parallax cleaning unit that removes noise from the right-reference parallax processed by the second mixing unit.