WO2012131752A1 - Depth information updating device, stereoscopic video generation device, and depth information updating method - Google Patents

Depth information updating device, stereoscopic video generation device, and depth information updating method

Info

Publication number
WO2012131752A1
Authority
WO
WIPO (PCT)
Prior art keywords
depth
depth information
stereoscopic
value
stereoscopic video
Prior art date
Application number
PCT/JP2011/001795
Other languages
French (fr)
Japanese (ja)
Inventor
山本 純也
仁尾 寛
晴子 寺井
大輔 加瀬
Original Assignee
パナソニック株式会社 (Panasonic Corporation)
Priority date
Filing date
Publication date
Application filed by パナソニック株式会社 (Panasonic Corporation)
Priority to PCT/JP2011/001795
Publication of WO2012131752A1

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 — Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 — Image signal generators
    • H04N 13/271 — Image signal generators wherein the generated image signals comprise depth maps or disparity maps

Definitions

  • The present invention relates to a depth information update device, a stereoscopic video generation device, and a depth information update method for updating depth information for generating a stereoscopic video.
  • Conventionally, a video display device using a liquid crystal panel or the like has been used as a device for displaying two-dimensional video.
  • Meanwhile, technological development is progressing on stereoscopic video display devices that let a viewer perceive a three-dimensional video by alternately viewing, through active shutter glasses, two two-dimensional videos having parallax.
  • For example, Patent Document 1 discloses a technique (hereinafter, prior art A) that adjusts the depth values of a depth image for generating a stereoscopic video and then generates the stereoscopic video using the adjusted depth values.
  • In prior art A, the depth value is set to a value between the maximum projection amount and the maximum depression amount so as to generate a stereoscopic video with little visual discomfort.
  • This setting, however, gives no particular consideration to improving fine stereoscopic detail; as a result, prior art A does not improve the fine stereoscopic effect of a plurality of adjacent pixels.
  • The present invention has been made to solve the above problem, and an object thereof is to provide a depth information update device and the like capable of generating depth information for producing a stereoscopic video with improved expression of fine stereoscopic detail.
  • A depth information update apparatus according to one aspect of the present invention performs processing using a depth value corresponding to each pixel constituting a stereoscopic video, and includes: a depth information acquisition unit that acquires depth information indicating a plurality of depth values for generating the stereoscopic video; and a depth information update unit that updates the depth information by performing an emphasis process, which is a process of emphasizing a component of a specific band, on a depth image composed of the plurality of depth values indicated by the depth information.
  • In other words, the depth information update apparatus includes a depth information acquisition unit that acquires depth information indicating a plurality of depth values, and a depth information update unit that updates the depth information by performing an emphasis process that emphasizes a component of a specific band in the depth image composed of the plurality of depth values indicated by the depth information.
  • Preferably, in at least part of the depth image, the depth information update unit performs the emphasis process so as to increase the absolute value of the difference between the depth values of two adjacent pixels when that absolute value is less than a threshold.
  • Preferably, the depth information update unit includes: a calculation unit that performs, for all pixels constituting the depth image, a process of calculating a difference depth value, which is the absolute value of the difference between the depth value of a processing target pixel and the depth value of a pixel adjacent to that pixel; and a processing execution unit that, when a calculated difference depth value is less than a threshold, performs the emphasis process on the processing target pixel corresponding to that difference depth value.
  • Preferably, the emphasis process performed by the depth information update unit is a filter process.
  • Preferably, the depth information update unit further updates the updated depth information by changing the plurality of depth values indicated by the updated depth information such that their absolute values decrease at substantially the same rate.
  • For example, the depth information update unit updates the updated depth information by multiplying the absolute values of the plurality of depth values indicated by the updated depth information by a coefficient smaller than one.
  • A stereoscopic video generation device according to one aspect of the present invention includes the depth information update device and a stereoscopic video generation unit that generates a new stereoscopic video using the plurality of depth values indicated by the updated depth information.
  • A depth information update method according to one aspect of the present invention is performed by a depth information update apparatus that processes depth values corresponding to the respective pixels constituting a stereoscopic video.
  • The method includes: a depth information acquisition step of acquiring depth information indicating a plurality of depth values for generating the stereoscopic video; and a depth information update step of updating the depth information by performing an emphasis process, which is a process of emphasizing a component of a specific band, on a depth image composed of the plurality of depth values indicated by the depth information.
  • The present invention may realize all or part of the components constituting such a depth information update apparatus as a system LSI (Large Scale Integration: large-scale integrated circuit).
  • The present invention may also be implemented as a program that causes a computer to execute the steps included in the depth information update method.
  • The present invention may further be realized as a computer-readable recording medium storing such a program.
  • The program may be distributed via a transmission medium such as the Internet.
  • FIG. 1 is a diagram showing an example of the configuration of a stereoscopic video viewing system according to the first embodiment.
  • FIG. 2 is a diagram for explaining a stereoscopic video.
  • FIG. 3 is a block diagram showing an example of the configuration of a stereoscopic video reproduction apparatus.
  • FIG. 4 is a diagram showing a depth table as depth information.
  • FIG. 5 is a diagram showing a relationship between a stereoscopic target object in a stereoscopic video and a depth value.
  • FIG. 6 is a view showing an example of a stereoscopic video showing a stereoscopic target object.
  • FIG. 7 is a flowchart of stereoscopic video generation processing.
  • FIG. 8 is a diagram for explaining changes in depth value.
  • FIG. 9 is a flowchart of stereoscopic video generation processing A.
  • FIG. 10 is a diagram showing a relationship between a stereoscopic target object in a stereoscopic video and a depth value.
  • FIG. 11 is a block diagram showing an example of the configuration of the depth information update unit.
  • FIG. 12 is a flowchart of stereoscopic video generation processing B.
  • FIG. 13 is a diagram for explaining a processing target pixel and a pixel close to the processing target pixel.
  • FIG. 1 is a diagram showing an example of the configuration of a stereoscopic video viewing system 1000 according to the first embodiment.
  • In FIG. 1, the X, Y, and Z directions are orthogonal to one another; the X, Y, and Z directions shown in the following figures are likewise orthogonal to one another.
  • a stereoscopic video viewing system 1000 includes a stereoscopic video reproduction device 100, a stereoscopic video display device 200, and active shutter glasses 300.
  • the stereoscopic video display device 200 is, for example, a plasma display, a liquid crystal display, an organic EL display, or the like. Note that the stereoscopic video display device 200 is not limited to the above display, and may be a display of another type. For example, the stereoscopic video display device 200 may be a volume display type display in which a plurality of liquid crystal panels are arranged in the depth direction.
  • the video and the stereoscopic video may be either a moving image or a still image.
  • the stereoscopic video display device 200 includes a display surface 210 for displaying a video.
  • the display surface 210 is assumed to be parallel to the XY plane. As an example, it is assumed that the display surface 210 can display an image composed of a plurality of pixels arranged in m (natural number) rows and n (natural number) columns.
  • m and n are assumed to be 1080 and 1920, respectively. That is, it is assumed that the display surface 210 can display an image having a size of 1920 ⁇ 1080 pixels (hereinafter, also referred to as full HD size).
  • an image of a size that can be displayed on the display surface 210 is also referred to as a displayable image.
  • the size of the displayable video is not limited to the full HD size.
  • the size of the displayable image may be, for example, the size of 1280 ⁇ 720 pixels.
  • the stereoscopic video display device 200 is, for example, a device that displays a stereoscopic video by a frame sequential method.
  • the size of the stereoscopic video displayed by the display surface 210 is equal to the size of the displayable video.
  • the display method of the stereoscopic video of the stereoscopic video display device 200 is not limited to the frame sequential method.
  • The stereoscopic video display method of the stereoscopic video display device 200 may be, for example, a lenticular method.
  • In that case, the size of the stereoscopic video displayed on the display surface 210 is smaller than the size of the displayable video.
  • The stereoscopic video reproduction device 100 is connected to the stereoscopic video display device 200 by the signal cable 10.
  • The signal cable 10 is, for example, a high-definition multimedia interface (HDMI) cable.
  • The signal cable 10 is not limited to an HDMI cable, and may be, for example, a D-terminal cable or a coaxial cable. Further, the communication between the stereoscopic video reproduction device 100 and the stereoscopic video display device 200 is not limited to wired communication and may be wireless communication.
  • the stereoscopic video reproduction device 100 transmits a signal indicating the stereoscopic video 30 as a three-dimensional video to the stereoscopic video display device 200 via the signal cable 10.
  • the stereoscopic video 30 is configured of a video 31 for the left eye and a video 32 for the right eye.
  • The left-eye video 31 is a video to be shown to the left eye (hereinafter also referred to as the first viewpoint) of the viewer (user).
  • The right-eye video 32 is a video to be shown to the right eye (hereinafter also referred to as the second viewpoint) of the viewer.
  • the left-eye video 31 and the right-eye video 32 are two-dimensional video images having parallax.
  • the stereoscopic video display device 200 alternately displays the left-eye video 31 and the right-eye video 32 on the display surface 210.
  • When the left-eye video 31 is displayed on the display surface 210, the active shutter glasses 300 shield the right eye of the viewer. Conversely, when the right-eye video 32 is displayed on the display surface 210, the active shutter glasses 300 shield the left eye of the viewer.
  • a viewer who wears the active shutter glasses 300 having such a configuration can view the left-eye video 31 with the left eye, and can view the right-eye video 32 with the right eye. Thereby, the viewer can feel the stereoscopic video 30 stereoscopically. That is, the stereoscopic video 30 is expressed using the first viewpoint video (left-eye video 31) of the first viewpoint and the second viewpoint video (right-eye video 32) of the second viewpoint.
  • the display method of the stereoscopic video is not limited to the frame sequential method using the active shutter glasses 300.
  • the display method of the stereoscopic video may be a method using polarized glasses.
  • a method of displaying a stereoscopic image may be a method using a parallax barrier, a lenticular sheet or the like.
  • the number of viewpoints required for the stereoscopic video display device 200 to display the stereoscopic video 30 is not limited to two, and may be three or more.
  • the stereoscopic image 30 is an image generated using a plurality of depth values.
  • the depth value corresponds to the amount of parallax between the left-eye video and the right-eye video.
  • FIG. 2 is a diagram for explaining a stereoscopic video.
  • a stereoscopic video is composed of a plurality of pixels.
  • each pixel forming a stereoscopic video is referred to as a stereoscopic display pixel.
  • a plurality of depth values are respectively associated with a plurality of stereoscopic display pixels constituting a stereoscopic video.
  • In this embodiment, the display surface 210 is a parallax zero surface.
  • The parallax zero surface (parallax zero plane) is a plane on which pixels at the same position in the left-eye video and the right-eye video are displayed with zero parallax.
  • The depth value indicates a value larger than a predetermined parallax zero reference value when the stereoscopic display pixel is arranged (displayed) behind the display surface 210 in the stereoscopic video. That is, when the recession amount of the stereoscopic display pixel from the display surface 210 is positive, the depth value is larger than the parallax zero reference value.
  • The parallax zero reference value is the value for arranging a stereoscopic display pixel at the position of the parallax zero surface (display surface 210).
  • The depth value indicates a value smaller than the parallax zero reference value when the stereoscopic display pixel is arranged (displayed) in front of the display surface 210 in the stereoscopic video. That is, when the projection amount of the stereoscopic display pixel from the display surface 210 is positive, the depth value is smaller than the parallax zero reference value.
  • In other words, each depth value indicates the projection amount or recession amount, from the parallax zero surface (display surface 210), of the stereoscopic display pixel corresponding to that depth value.
  • In the following, an example is described in which each depth value is represented in the range of 0 to 255 such that the value increases with distance from the viewer 40, and the parallax zero reference value is 123.
  • The depth value is not limited to the range of 0 to 255 and may be represented, for example, in the range of 0 to 511. Also, if the parallax zero reference value is 0, the depth value may be represented by positive and negative values.
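  • As a concrete illustration of the two encodings just mentioned, the following sketch (our own, not from the patent) converts between the 0-to-255 representation with parallax zero reference value 123 and a signed representation with reference value 0:

```python
# Sketch (not from the patent): converting between the 0-255 depth encoding
# with parallax zero reference value 123 and a signed encoding with reference 0.
ZERO_REF = 123  # depth value that places a pixel exactly on the parallax zero surface

def to_signed(depth_8bit: int) -> int:
    """Signed depth: positive = behind the display surface, negative = in front."""
    return depth_8bit - ZERO_REF

def to_unsigned(signed_depth: int) -> int:
    """Back to the 0-255 representation, clamped to the valid range."""
    return max(0, min(255, signed_depth + ZERO_REF))

assert to_signed(123) == 0   # on the parallax zero surface (display surface 210)
assert to_signed(200) > 0    # farther from the viewer than the screen (recessed)
assert to_signed(50) < 0     # closer to the viewer than the screen (projected)
```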
  • a pixel for stereoscopic display is also referred to as a stereoscopic display pixel.
  • the position where the depth value is the parallax zero reference value (123) corresponds to the position of the display surface 210 (parallax zero surface) in the Z direction.
  • the stereoscopic display pixels are displayed at, for example, a position P11 in the left-eye video and a position P12 in the right-eye video.
  • the viewer 40 looks at the stereoscopic display pixels arranged at the position P11 with the left eye 41 by the operation of the active shutter glasses 300.
  • the viewer 40 looks at the pixel for stereoscopic display disposed at the position P12 with the right eye 42 by the operation of the active shutter glasses 300.
  • In this case, the viewer 40 perceives the stereoscopic display pixel as if it were arranged at the position P10, in front of the display surface 210 by the distance d1, in the stereoscopic video.
  • In this case, the depth value of the stereoscopic display pixel indicates a value smaller than the parallax zero reference value (123).
  • The horizontal distance between the position P12 and the position P11 corresponds to the parallax amount.
  • The parallax amount also corresponds to the projection amount d1 that the depth value of the stereoscopic display pixel encodes.
  • When the depth value is the parallax zero reference value (123), the viewer 40 perceives the stereoscopic display pixel as if it were arranged on the display surface 210 (parallax zero surface).
  • The relation between the parallax amount and the depth value of a stereoscopic display pixel may be given by a predetermined conversion formula. For example, let the parallax amount be d, the depth value of the stereoscopic display pixel be L, the depth value of the parallax zero plane be L_b, and the distance between the left eye 41 and the right eye 42 of the viewer 40 be e. At this time, the depth value L is proportional to the difference in distance along the Z axis between the viewer 40 and the position P10.
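  • The source leaves this conversion formula unspecified. One standard geometric relation consistent with the quantities named above — an illustration, not the patent's formula — additionally assumes a viewing distance D from the viewer 40 to the display surface 210 and follows from similar triangles:

$$ d \;=\; e \cdot \frac{z}{D - z}, \qquad z \,\propto\, (L_b - L), $$

where z is the projection amount of the stereoscopic display pixel in front of the display surface (negative behind it). With the linear 0-to-255 encoding above, z is proportional to L_b − L, so d > 0 (crossed parallax) for a pixel in front of the screen (L < L_b) and d < 0 (uncrossed parallax) for a pixel behind it (L > L_b).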
  • the stereoscopic display pixels are displayed at, for example, a position P21 in the left-eye video and a position P22 in the right-eye video.
  • the viewer 40 looks at the stereoscopic display pixels arranged at the position P21 with the left eye 41 by the operation of the active shutter glasses 300. Further, in this case, the viewer 40 looks at the stereoscopic display pixel arranged at the position P22 with the right eye 42 by the operation of the active shutter glasses 300.
  • In this case, the viewer 40 perceives the stereoscopic display pixel as if it were arranged at the position P20, behind the display surface 210 by the distance d2, in the stereoscopic video.
  • the depth value of the stereoscopic display pixel indicates a value larger than the parallax zero reference value (123).
  • Also in this case, the relation between the parallax amount and the depth value of the stereoscopic display pixel may be given by a predetermined conversion formula, with the parallax amount d, the depth value L, the parallax zero plane depth value L_b, and the interocular distance e defined as above. The depth value L is again proportional to the difference in distance along the Z axis between the viewer 40 and the position P20.
  • FIG. 3 is a block diagram showing an example of the configuration of the stereoscopic video reproduction device 100.
  • the stereoscopic video reproduction device 100 includes a stereoscopic video generation device 110.
  • The stereoscopic video reproduction device 100 also includes a processing unit and the like (not shown).
  • The stereoscopic video generation device 110 is a device for generating a stereoscopic video. Although the details will be described later, the stereoscopic video generation device 110 acquires depth information of the stereoscopic video to be processed, updates the depth information, and then generates a new stereoscopic video using the updated depth information.
  • the depth information is information indicating a plurality of depth values respectively corresponding to a plurality of stereoscopic display pixels constituting a stereoscopic video to be processed.
  • the stereoscopic video generation device 110 receives the stereoscopic video signal SG1.
  • the stereoscopic video signal SG1 is a signal indicating stereoscopic video data and depth information corresponding to the stereoscopic video.
  • the stereoscopic video data is data of a left-eye video and a right-eye video.
  • the stereoscopic video signal SG1 may be, for example, a signal indicating data of a two-dimensional video and depth information for converting the two-dimensional video into a stereoscopic video.
  • the stereoscopic video signal SG1 is, for example, a signal indicating data read from the recording medium.
  • the recording medium is, for example, a BD (Blu-ray Disc (registered trademark)).
  • the above-described processing unit included in the stereoscopic video reproduction device 100 performs processing of reading stereoscopic video data and depth information from the recording medium.
  • the stereoscopic video signal SG1 is not limited to the signals described above.
  • the stereoscopic video signal SG1 may be, for example, a signal acquired from a broadcast wave.
  • the above-described processing unit included in the stereoscopic video reproduction device 100 has, for example, the functions of a tuner and a demodulation circuit.
  • Next, the depth information will be described.
  • The depth information is expressed, for example, as the following depth table T100.
  • FIG. 4 is a diagram showing a depth table T100 as depth information.
  • FIG. 4 shows the depth table T100 when the stereoscopic video is one frame of a moving image or a still image.
  • the depth table T100 indicates a plurality of depth values respectively corresponding to a plurality of stereoscopic display pixels constituting a stereoscopic video.
  • the plurality of depth values indicated by the depth table T100 are values for generating a stereoscopic video.
  • the plurality of depth values indicated by the depth table T100 are arranged in a matrix like the plurality of stereoscopic display pixels constituting the stereoscopic video.
  • each of the plurality of depth values indicated by the depth table T100 is regarded as a pixel value.
  • the depth table T100 as depth information indicates an image composed of a plurality of depth values (hereinafter, also referred to as a depth image).
  • the depth image corresponds to a depth map (disparity map).
  • the value of each pixel constituting the depth image is a depth value.
  • In FIG. 4, Cmn indicates the depth value of the pixel at row m, column n of the stereoscopic video.
  • For example, C12 indicates the depth value of the pixel at row 1, column 2 of the stereoscopic video.
  • Each depth value indicated by the depth table T100 is not limited to a depth value corresponding to one pixel.
  • Each depth value indicated by the depth table T100 may be, for example, a depth value corresponding to four pixels.
  • In this case, the number of depth values indicated by the depth table T100 is one fourth of the number of stereoscopic display pixels constituting the stereoscopic video.
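  • For illustration, the hypothetical NumPy sketch below builds a full-HD depth table as described above, plus the variant in which one depth value covers a 2×2 block of four pixels:

```python
import numpy as np

# Hypothetical sketch of the depth table T100: one 8-bit depth value per
# stereoscopic display pixel, arranged in m rows x n columns like the video.
m, n = 1080, 1920
ZERO_REF = 123
depth_table = np.full((m, n), ZERO_REF, dtype=np.uint8)  # every pixel on the screen plane

# Cmn in FIG. 4 is the depth value at row m, column n (1-indexed), so C12 is:
c12 = depth_table[0, 1]

# Variant in which each depth value corresponds to four pixels (a 2x2 block):
# the table then holds one fourth as many depth values as there are pixels.
depth_table_quarter = depth_table[::2, ::2].copy()
assert depth_table_quarter.size == depth_table.size // 4
```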
  • FIG. 5 is a diagram showing a relationship between a stereoscopic target object in a stereoscopic video and a depth value.
  • a stereoscopic target object is an object to be displayed stereoscopically in a stereoscopic video.
  • FIG. 5A is a perspective view showing the positional relationship between the stereoscopic target object 50 in the stereoscopic video and the display surface 210.
  • FIG. 5B is a graph for explaining the depth values of the stereoscopic target object 50.
  • the vertical axis indicates the depth value (Z direction).
  • the horizontal axis indicates the X direction (horizontal direction of stereoscopic video).
  • FIG. 6 is a view showing an example of a stereoscopic video G100 that shows the stereoscopic target object 50; the stereoscopic video G100 is parallel to the XY plane. Note that, in order to explain the depth values, FIG. 6 also shows a depth line L10 that does not actually appear in the stereoscopic video G100.
  • the depth line L10 is a line indicating n depth values respectively corresponding to n stereoscopic display pixels arranged in one line in the stereoscopic video G100.
  • the stereoscopic target object 50 is disposed, for example, on the front side of the display surface 210.
  • the depth line L10 is shown on the stereoscopic target object 50 and the display surface 210.
  • FIG. 5B is a graph showing the depth line L10.
  • the depth line L10 in the region R10 corresponds to the depth line L10 shown in the stereoscopic target object 50. From the depth line L10 in the region R10 of FIG. 5B, it can be seen that the surface of the three-dimensional object 50 has unevenness.
  • Here, the depth table T100 is a table corresponding to the stereoscopic video G100.
  • In this case, the depth image indicated by the depth table T100 is the stereoscopic video G100 expressed in gray scale.
  • the stereoscopic video generation device 110 includes a depth information updating device 101 and a stereoscopic video generation unit 113.
  • the depth information updating apparatus 101 performs processing using depth values corresponding to respective pixels constituting a stereoscopic video.
  • the depth information update device 101 includes a depth information acquisition unit 111 and a depth information update unit 112.
  • the depth information acquisition unit 111 and the stereoscopic video generation unit 113 receive the stereoscopic video signal SG1.
  • the depth information acquisition unit 111 acquires depth information indicated by the received stereoscopic video signal SG1.
  • the depth information is, for example, the depth table T100 of FIG.
  • the depth information acquisition unit 111 transmits the acquired depth information to the depth information update unit 112.
  • Note that the depth information acquisition unit 111 may acquire the depth information by performing, for example, stereo matching, which is a known technique.
  • Alternatively, values obtained by applying a specific conversion formula to the depth information may be acquired as the depth information.
  • the depth information update unit 112 updates the received depth information, the details of which will be described later. Then, the depth information update unit 112 transmits the updated depth information to the stereoscopic video generation unit 113.
  • updated depth information transmitted by the depth information update unit 112 to the stereoscopic video generation unit 113 is referred to as updated depth information.
  • the stereoscopic video generation unit 113 generates a new stereoscopic video using stereoscopic video data indicated by the received stereoscopic video signal SG1 and the received updated depth information.
  • the stereoscopic video generation unit 113 transmits, to the stereoscopic video display device 200, the stereoscopic video signal SG2 indicating the generated new stereoscopic video (video for left eye and video for right eye).
  • the stereoscopic video display device 200 alternately displays the video for the left eye and the video for the right eye indicated by the received stereoscopic video signal SG2 on the display surface 210 for each frame.
  • FIG. 7 is a flowchart of stereoscopic video generation processing.
  • In step S110, the depth information acquisition unit 111 acquires the depth information indicated by the stereoscopic video signal SG1.
  • the depth information is, for example, the depth table T100 of FIG.
  • the depth information acquisition unit 111 transmits the acquired depth information to the depth information update unit 112.
  • In step S120, depth information update processing is performed.
  • In the depth information update processing, the depth information update unit 112 updates the depth information by performing a process that emphasizes a component of a specific band (hereinafter, emphasis processing) on the depth image composed of the plurality of depth values indicated by the depth information (depth table T100).
  • the specific band is a specific frequency band.
  • In the following, the stereoscopic video to be processed is referred to as the processing target stereoscopic video.
  • Here, the processing target stereoscopic video is assumed to be a still image.
  • The processing target stereoscopic video is described using the stereoscopic video G100 as an example.
  • the depth image processed in the depth information update process corresponds to the stereoscopic video G100.
  • In step S125, depth enhancement processing is performed.
  • Specifically, the depth information update unit 112 applies, to the depth image composed of the plurality of depth values indicated by the depth table T100 as depth information, a filter process that emphasizes (increases) changes in depth value in a specific band. That is, the depth information update unit 112 performs the emphasis process, which is a process of emphasizing a component of a specific band, on the depth image composed of the plurality of depth values indicated by the depth information (depth table T100).
  • The emphasis process is, for example, a process that, in at least part of the depth image, increases the absolute value of the difference between the depth values of two adjacent pixels when that absolute value is less than a threshold.
  • the threshold is a threshold used in step S122 described later.
  • the threshold is, for example, 20.
  • Emphasizing high-frequency changes in depth value can be considered equivalent to sharpening in image processing. That is, emphasizing changes in depth value in a specific band can be performed by, for example, an FIR filter, just like a filter in image processing.
  • Specifically, the depth information update unit 112 performs filter processing using a two-dimensional FIR (Finite Impulse Response) filter on the depth image indicated by the depth table T100.
  • The filter process of step S125 is performed with one pixel among the plurality of pixels constituting the depth image as the pixel of interest. Note that each time the process of step S125 is performed, a different pixel is used as the pixel of interest.
  • As a result, the depth table T100 as depth information is updated each time the process of step S125 is performed.
  • the FIR filter used for the above-mentioned filter processing is, for example, a sharpening filter using an unsharp mask.
  • the filter used for the filtering process is not limited to the sharpening filter, and may be another filter as long as it is a filter that emphasizes a change in depth value in a specific band.
  • When the process of step S125 has been performed on all the pixels constituting the depth image, the depth information update unit 112 transmits the updated depth table T100 to the stereoscopic video generation unit 113 as the updated depth information. The depth information update process of step S120 then ends, and the process proceeds to step S130.
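  • The text above names an unsharp-mask sharpening filter as one example of the emphasis process but gives no parameters. The following sketch (our own; the Gaussian sigma and the amount are assumed values, not taken from the patent) applies such an unsharp mask to a depth image:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def enhance_depth(depth: np.ndarray, sigma: float = 2.0, amount: float = 0.8) -> np.ndarray:
    """Unsharp-mask emphasis of a specific (high-frequency) band of a depth image.

    depth: 2-D array of depth values in [0, 255]. sigma and amount are
    illustrative values; the patent does not specify filter parameters.
    """
    d = depth.astype(np.float64)
    blurred = gaussian_filter(d, sigma=sigma)  # low-frequency component of the depth image
    detail = d - blurred                       # the band whose changes are to be emphasized
    enhanced = d + amount * detail             # increase the amplitude of that band
    return np.clip(enhanced, 0, 255).astype(depth.dtype)
```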
  • By the depth information update processing of step S120, the amplitude of the depth line L10 in the region R10 shown in FIG. 8A is increased as shown in FIG. 8B. That is, the depth (stereoscopic effect) of the stereoscopic target object 50 is emphasized by the process of step S125.
  • In step S130, the stereoscopic video generation unit 113 generates a new stereoscopic video using the stereoscopic video data indicated by the received stereoscopic video signal SG1 and the received updated depth information (the updated depth table T100).
  • the stereoscopic video data indicated by the stereoscopic video signal SG1 is data of a left-eye video and a right-eye video.
  • the stereoscopic video generation unit 113 calculates the parallax amount of the left-eye video and the right-eye video using the plurality of depth values indicated by the updated depth information. Then, using the calculated amount of parallax, the stereoscopic video generation unit 113 updates one or both of the left-eye video and the right-eye video indicated by the stereoscopic video signal SG1. As a result, a new three-dimensional video composed of a new left-eye video and a right-eye video is generated.
  • The process of generating a stereoscopic video using depth information is, for example, a process conforming to the MVC (Multiview Video Coding) standard, so detailed description of the calculation of the parallax amount, the updating of the left-eye and right-eye videos, and the like is omitted.
  • When the stereoscopic video signal SG1 indicates data of a two-dimensional video and depth information, the stereoscopic video generation unit 113 calculates the parallax amount using the plurality of depth values indicated by the updated depth information, and generates a new left-eye video and a new right-eye video from the two-dimensional video using the calculated parallax amount. As a result, a new stereoscopic video composed of the new left-eye video and right-eye video is generated.
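  • The source defers view synthesis to the MVC standard, so the following is only a rough depth-image-based-rendering illustration, not the MVC procedure; disparity_from_depth is a hypothetical linear mapping and no hole filling is performed:

```python
import numpy as np

def disparity_from_depth(depth_row: np.ndarray, zero_ref: int = 123, gain: float = 0.1) -> np.ndarray:
    """Hypothetical linear depth-to-disparity mapping: zero disparity on the
    parallax zero surface, positive in front of it. 'gain' is illustrative."""
    return gain * (zero_ref - depth_row.astype(np.float64))

def synthesize_views(image: np.ndarray, depth: np.ndarray):
    """Shift each pixel horizontally by +/- half its disparity (no hole filling)."""
    h, w = depth.shape
    left = np.zeros_like(image)
    right = np.zeros_like(image)
    for y in range(h):
        disp = disparity_from_depth(depth[y])
        for x in range(w):
            s = int(round(disp[x] / 2.0))
            if 0 <= x + s < w:
                left[y, x + s] = image[y, x]   # left-eye view: near pixels shift right
            if 0 <= x - s < w:
                right[y, x - s] = image[y, x]  # right-eye view: near pixels shift left
    return left, right
```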
  • the stereoscopic video generation unit 113 transmits, to the stereoscopic video display device 200, the stereoscopic video signal SG2 indicating the generated new stereoscopic video (video for left eye and video for right eye).
  • The stereoscopic video generation processing described above applies when the processing target stereoscopic video is a still image; when the processing target stereoscopic video is a moving image, the above stereoscopic video generation processing is repeated for each frame constituting the moving image.
  • As described above, in the present embodiment, the depth information update unit 112 performs, on the depth image composed of the plurality of depth values indicated by the depth information (depth table T100), a process that emphasizes a component of a specific band (changes in depth value).
  • A stereoscopic video is then generated using the depth information updated in this way. Therefore, according to the present embodiment, it is possible to generate a stereoscopic video with improved expression of fine stereoscopic detail.
  • Note that a cardboard-like (kakiwari) stereoscopic effect is, for example, a stereoscopic effect in which each of a plurality of flat planes representing object images is arranged at a different depth position.
  • In other words, the cardboard-like stereoscopic effect is a stereoscopic effect in which the stereoscopic effect within an individual object image is hardly expressed.
  • The stereoscopic video viewing system in Modification 1 of the first embodiment is the stereoscopic video viewing system 1000 of FIG. 1, and the stereoscopic video reproduction device is the stereoscopic video reproduction device 100 of FIG. 3. Therefore, detailed description of the configuration of the stereoscopic video reproduction device 100 will not be repeated.
  • The stereoscopic video generation device in Modification 1 of the first embodiment is the stereoscopic video generation device 110 of FIG. 3.
  • Next, the processing for generating a stereoscopic video in Modification 1 of the present embodiment (hereinafter also referred to as stereoscopic video generation processing A) will be described.
  • FIG. 9 is a flowchart of stereoscopic video generation processing A.
  • the process of the same step number as the step number of FIG. 7 is performed in the same manner as the process described in the first embodiment, and therefore the detailed description will not be repeated.
  • differences from the first embodiment will be mainly described.
  • the depth information acquisition unit 111 acquires depth information indicated by the stereoscopic video signal SG1 (S110).
  • Next, in step S120A, depth information update processing A is performed.
  • the depth information update process A differs from the depth information update process of FIG. 7 in that the process of step S125A is performed instead of step S125 and the process of step S126 is further performed.
  • the other processing of depth information update processing A is the same processing as the depth information update processing, and therefore detailed description will not be repeated.
  • In step S125A, depth enhancement processing A is performed.
  • Specifically, the depth information update unit 112 applies, to the depth image composed of the plurality of depth values indicated by the depth information (depth table T100), a filter process that emphasizes (increases) changes in depth value in a specific band. That is, the depth information update unit 112 performs the emphasis process, which is a process of emphasizing a component of a specific band, on the depth image composed of the plurality of depth values indicated by the depth information.
  • The filter process of step S125A is performed with one pixel among the plurality of pixels constituting the depth image indicated by the depth information (depth table T100) as the pixel of interest. Note that each time the process of step S125A is performed, a different pixel is used as the pixel of interest.
  • When the process of step S125A has been performed on all the pixels constituting the depth image, depth enhancement processing A ends, and the process proceeds to step S126.
  • the depth table T100 as depth information is updated by the depth enhancement process A in step S125A.
  • the depth information updated by the depth enhancement processing A is referred to as first updated depth information.
  • the first updated depth information is the updated depth table T100.
  • the updated depth table T100 is also referred to as a first updated depth table.
  • By depth enhancement processing A, the amplitude of the depth line L10 in the region R10 shown in FIG. 8A is increased as shown in FIG. 8B. That is, the depth (stereoscopic effect) of the stereoscopic target object 50 is emphasized by the process of step S125A.
  • In step S126, depth compression processing is performed.
  • Specifically, the depth information update unit 112 updates the updated depth information by changing the plurality of depth values indicated by the updated depth information such that their absolute values decrease at substantially the same rate.
  • the depth information updating unit 112 updates the first updated depth information by multiplying the absolute value of the plurality of depth values indicated by the first updated depth information by a coefficient less than one.
  • a coefficient less than 1 is, for example, a value in the range of 0.4 to 0.7.
  • The process of updating the first updated depth information is not limited to the above.
  • For example, the process of updating the first updated depth information may be a process of subtracting a predetermined value from the absolute values of the plurality of depth values indicated by the first updated depth information.
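  • A minimal sketch of this depth compression, assuming that "absolute value" refers to the magnitude of a depth value's deviation from the parallax zero reference value (123 here); the coefficient 0.6 is one value from the 0.4-to-0.7 range mentioned above:

```python
import numpy as np

def compress_depth(depth: np.ndarray, zero_ref: int = 123, k: float = 0.6) -> np.ndarray:
    """Scale each depth value's deviation from the parallax zero surface by k < 1,
    so that all deviations shrink at substantially the same rate."""
    d = depth.astype(np.float64)
    compressed = zero_ref + k * (d - zero_ref)
    return np.clip(compressed, 0, 255).astype(depth.dtype)
```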
  • When the processing target stereoscopic video is the stereoscopic video G100 of FIG. 6, the amplitude of the depth line L10 shown in FIG. 8B is reduced overall by the process of step S126, as shown in FIG. 10A.
  • By the processing of step S126, the distance between the stereoscopic target object 50, whose stereoscopic effect has been emphasized, and the display surface 210 (parallax zero surface) is reduced.
  • In general, the greater the distance between the stereoscopic target object 50 and the display surface 210 (parallax zero surface), the harder it is for the viewer to discern the stereoscopic effect of the stereoscopic target object 50.
  • Therefore, the first updated depth information after this update, which reduces the distance between the stereoscopic target object 50 with emphasized stereoscopic effect and the display surface 210 (parallax zero surface), is information for generating a stereoscopic video in which the cardboard-like stereoscopic effect is eliminated.
  • the first updated depth information after updating is referred to as second updated depth information.
  • the second updated depth information is the first updated depth table after the update.
  • the depth information update unit 112 transmits the second updated depth information as the updated depth information to the stereoscopic video generation unit 113.
  • Then, the process of step S130 is performed.
  • The stereoscopic video generation processing A described above applies when the processing target stereoscopic video is a still image; when the processing target stereoscopic video is a moving image, stereoscopic video generation processing A is repeated for each frame constituting the moving image.
  • As described above, in Modification 1, the depth information update unit 112 updates the first updated depth information by changing the plurality of depth values indicated by the first updated depth information such that their absolute values decrease at substantially the same rate.
  • the first updated depth information is information generated by the process of step S125A similar to the process of step S125 of the first embodiment.
  • A stereoscopic video is generated using the first updated depth information updated in this way. Therefore, according to this modification, it is possible to improve the expression of fine stereoscopic detail and to generate a stereoscopic video in which the cardboard-like stereoscopic effect is eliminated.
  • In addition, it is possible to reduce the eye fatigue caused by an excessively large parallax amount when the viewer views the stereoscopic video generated according to this modification.
  • The stereoscopic video viewing system in Modification 2 of the first embodiment is the stereoscopic video viewing system 1000 of FIG. 1, and the stereoscopic video reproduction device is the stereoscopic video reproduction device 100 of FIG. 3. Therefore, detailed description of the configuration of the stereoscopic video reproduction device 100 will not be repeated.
  • The stereoscopic video generation device in Modification 2 of the first embodiment is the stereoscopic video generation device 110 of FIG. 3.
  • FIG. 11 is a block diagram showing an example of the configuration of the depth information update unit 112.
  • the depth information update unit 112 includes a calculation unit 121 and a processing execution unit 123.
  • Although the details will be described later, the calculation unit 121 performs, for all pixels constituting the depth image, a process of calculating a difference depth value, which is the absolute value of the difference between the depth value of the processing target pixel and the depth value of a pixel close to the processing target pixel.
  • When a calculated difference depth value is less than a threshold, the processing execution unit 123 updates the depth information by performing the emphasis process on the processing target pixel corresponding to that difference depth value.
  • Next, the processing for generating a stereoscopic video in Modification 2 of the present embodiment (hereinafter also referred to as stereoscopic video generation processing B) will be described.
  • FIG. 12 is a flowchart of stereoscopic video generation processing B.
  • the process of the same step number as the step number of FIG. 7 is performed in the same manner as the process described in the first embodiment, and therefore the detailed description will not be repeated.
  • differences from the first embodiment will be mainly described.
  • First, the depth information acquisition unit 111 acquires the depth information indicated by the stereoscopic video signal SG1 and transmits the acquired depth information to the depth information update unit 112 (S110).
  • Next, in step S120B, depth information update processing B is performed.
  • the depth information update process B differs from the depth information update process of FIG. 7 in that the process of step S125B is performed instead of step S125 and the process of steps S121 and S122 is further performed.
  • the other processing of depth information update processing B is the same processing as the depth information update processing, and therefore detailed description will not be repeated.
  • the depth information update processing B will be described in detail below.
  • As before, the stereoscopic video to be processed is referred to as the processing target stereoscopic video; here it is assumed to be a still image and is described using the stereoscopic video G100 as an example.
  • the depth image processed in the depth information update process B corresponds to the stereoscopic video G100.
  • In step S121, the calculation unit 121 calculates the absolute value of the difference (hereinafter, the difference depth value) between the depth values of two adjacent pixels among the plurality of pixels constituting the depth image indicated by the depth information (depth table T100) of the stereoscopic video signal SG1.
  • the depth image corresponds to the processing target stereoscopic video.
  • the calculation unit 121 sets one pixel of the plurality of pixels forming the depth image as a processing target pixel.
  • FIG. 13 is a diagram for explaining a processing target pixel and a pixel close to the processing target pixel.
  • In FIG. 13, the pixels P31, P32, P33, and P34 are the pixels adjacent to the processing target pixel P30.
  • The pixel P31 is adjacent to the processing target pixel P30 in the vertical direction (above it), and the pixel P34 is adjacent to it in the vertical direction (below it).
  • The pixel P32 is adjacent to the processing target pixel P30 in the horizontal direction (to its left), and the pixel P33 is adjacent to it in the horizontal direction (to its right).
  • Hereinafter, each of the pixels P31, P32, P33, and P34 is referred to as a processing target proximity pixel.
  • When the processing target pixel is the pixel in the first row, first column (upper left corner) of the processing target stereoscopic video, the processing target proximity pixels are only the pixels P33 and P34; when it is the pixel in the first row, n-th column (upper right corner), they are only the pixels P32 and P34.
  • When the processing target pixel is the pixel in the m-th row, first column (lower left corner), the processing target proximity pixels are only the pixels P31 and P33; when it is the pixel in the m-th row, n-th column (lower right corner), they are only the pixels P31 and P32.
  • The processing target proximity pixels are not limited to the pixels adjacent to the processing target pixel P30. For example, as with the pixels P35, P36, P37, and P38 in FIG. 13, pixels diagonally close to the processing target pixel P30 may be used as processing target proximity pixels.
  • A pixel separated from the processing target pixel by two or more pixels, such as the pixel P41, may also be set as a processing target proximity pixel; similarly, the pixels P42, P43, and P44 may be set as processing target proximity pixels in place of the adjacent pixels.
  • In step S121, the calculation unit 121 calculates, for each processing target proximity pixel, the absolute value of the difference between the depth value of the processing target pixel P30 and the depth value of that proximity pixel (hereinafter, the difference depth value).
  • Note that, instead of the depth value of the pixel P30 itself, a value obtained by adding the depth values of the pixel P30 and its peripheral pixels at a fixed ratio may be used; the same applies to the pixel P41 and its peripheral pixels.
  • For example, the depth values of the pixels P31, P30, and P32 may be added at a ratio of 1:2:1, the depth values of the pixels P41A, P41, and P41B may be added at a ratio of 1:2:1 instead of the depth value of the pixel P41, and the difference between these two values may be used as the difference depth value.
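  • The 1:2:1 weighting just described can be written as follows (a sketch; the division by 4 is our normalization to keep the weighted value on the original depth scale, which the text leaves implicit):

```python
def weighted_depth(left: float, center: float, right: float) -> float:
    """1:2:1 weighting of a pixel's depth value with its two neighbors.
    Dividing by 4 (our assumption) keeps the result on the original depth scale."""
    return (left + 2.0 * center + right) / 4.0

def smoothed_difference_depth(d_p31: float, d_p30: float, d_p32: float,
                              d_p41a: float, d_p41: float, d_p41b: float) -> float:
    """Difference depth value between P30 and P41, each smoothed 1:2:1 with its neighbors."""
    return abs(weighted_depth(d_p31, d_p30, d_p32) - weighted_depth(d_p41a, d_p41, d_p41b))
```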
  • In step S122, the calculation unit 121 determines whether at least one of the plurality of difference depth values calculated in step S121 is equal to or greater than a predetermined threshold.
  • The threshold is, for example, a value of 5 to 20% of the difference between the maximum value and the minimum value that the depth value can take; for example, it is 20 when the depth value is represented in the range of 0 to 255.
  • If NO in step S122, the process proceeds to step S125B. That is, when all of the calculated difference depth values are less than the threshold, the process proceeds to step S125B. On the other hand, if YES in step S122, the process of step S121 is performed again on the next pixel.
  • In step S125B, depth enhancement processing B is performed.
  • Specifically, as in the first embodiment, the processing execution unit 123 applies, with the processing target pixel as the pixel of interest, a filter process that emphasizes (increases) changes in depth value in a specific band to the depth image composed of the plurality of depth values indicated by the depth table T100 as depth information. That is, the processing execution unit 123 performs the emphasis process, which is a process of emphasizing a component of a specific band, on the depth image composed of the plurality of depth values indicated by the depth information (depth table T100).
  • Step S125B is executed only in the case of NO at step S122. That is, when the calculated difference depth values are less than the threshold, the processing execution unit 123 updates the depth information by performing the emphasis process on the processing target pixel corresponding to those difference depth values.
  • The processes of steps S121, S122, and S125B are repeated for all the pixels constituting the depth image. Note that each time the process of step S121 is performed, a different pixel among the plurality of pixels constituting the depth image is set as the processing target pixel.
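  • Putting steps S121, S122, and S125B together, the sketch below (our own; the threshold of 20 and the unsharp-mask parameters are the illustrative values mentioned earlier) enhances only those pixels whose difference depth values to all four adjacent pixels are below the threshold:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def selective_enhance(depth: np.ndarray, threshold: int = 20,
                      sigma: float = 2.0, amount: float = 0.8) -> np.ndarray:
    """Steps S121/S122/S125B in one pass: a pixel is enhanced only if every
    difference depth value to its up/down/left/right neighbors is < threshold."""
    d = depth.astype(np.float64)
    padded = np.pad(d, 1, mode="edge")  # border pixels simply have fewer real neighbors

    # S121: difference depth values to the four adjacent pixels.
    diffs = np.stack([
        np.abs(d - padded[:-2, 1:-1]),   # neighbor above (P31)
        np.abs(d - padded[2:, 1:-1]),    # neighbor below (P34)
        np.abs(d - padded[1:-1, :-2]),   # neighbor to the left (P32)
        np.abs(d - padded[1:-1, 2:]),    # neighbor to the right (P33)
    ])

    # S122: all difference depth values must be below the threshold.
    mask = (diffs < threshold).all(axis=0)

    # S125B: unsharp-mask emphasis (as in the earlier sketch), applied selectively.
    enhanced = d + amount * (d - gaussian_filter(d, sigma=sigma))
    result = np.where(mask, enhanced, d)
    return np.clip(result, 0, 255).astype(depth.dtype)
```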
  • In this way, the calculation unit 121 performs, for all pixels constituting the depth image, the process of calculating the difference depth value, which is the absolute value of the difference between the depth value of the processing target pixel and the depth value of a pixel close to the processing target pixel.
  • The depth information is then updated as in the first embodiment.
  • Depth enhancement processing B is executed only in the case of NO at step S122.
  • As a result, changes in depth within the same object are emphasized, while changes in depth at the boundary between two objects located at different depths are not emphasized.
  • When the change in depth value at the boundary between an object and another object is much larger than the change in depth value within the range considered to be the same object, the change in depth within the object is hardly expressed at all. In this case, the viewer 40 perceives a cardboard-like (kakiwari) stereoscopic effect.
  • In Modification 2, the emphasis process is performed on the processing target pixels whose difference depth values are less than the threshold. This makes it possible to emphasize changes in depth within the same object without emphasizing changes in depth between two objects at different depths; in other words, it is possible to eliminate the cardboard-like stereoscopic effect in which changes in depth within the same object are hardly expressed.
  • Then, the process of step S130 is performed.
  • Note that, before step S130, the depth compression processing of step S126 may be performed as in Modification 1 of the first embodiment.
  • the process of updating depth information may be performed on frequency-converted values.
  • the stereoscopic video generation device 110 may receive a 2D video and generate depth information for generating a stereoscopic video from the 2D video.
  • In this case, the depth information acquisition unit 111 of the stereoscopic video generation device 110 calculates depth values such that the image of the subject in focus in the two-dimensional video appears on the near side of the parallax zero surface in the stereoscopic video. Then, the depth information acquisition unit 111 generates information indicating the calculated depth values as the depth information.
  • the depth information acquisition unit 111 transmits the depth information acquired by the generation to the depth information update unit 112.
  • Note that two adjacent pixels are not limited to two pixels in contact with each other.
  • The two adjacent pixels may be, for example, two pixels arranged so as to sandwich one or more pixels between them.
  • All or part of the components constituting the depth information update device 101 described above may be configured as hardware. Further, all or part of the components constituting the depth information update device 101 may be modules of a program executed by a central processing unit (CPU) or the like.
  • All or part of the components constituting the stereoscopic video generation device 110 or the depth information update device 101 described above may be configured as one system LSI (Large Scale Integration: large-scale integrated circuit).
  • The system LSI is a super-multifunction LSI manufactured by integrating a plurality of components on one chip; specifically, it is a computer system configured to include a microprocessor, a read-only memory (ROM), a random-access memory (RAM), and the like.
  • the stereoscopic video generation device 110 or the depth information update device 101 may be configured from one system LSI (integrated circuit). That is, the stereoscopic video generation device 110 or the depth information update device 101 may be an integrated circuit.
  • The present invention may also be realized as a depth information update method having, as its steps, the operations of the characteristic components included in the stereoscopic video generation device 110 or the depth information update device 101.
  • the present invention may also be implemented as a program that causes a computer to execute the steps included in such a depth information update method.
  • the present invention may be realized as a computer readable recording medium storing such a program.
  • the program may be distributed via a transmission medium such as the Internet.
  • The configuration of the depth information update device 101 described above is an example for specifically explaining the present invention; the depth information update device 101 need not include all of the components described above. That is, the depth information update device 101 according to the present invention only needs to have the minimum configuration that can realize the effects of the present invention. For example, if the processing execution unit 123 also performs the processing performed by the calculation unit 121, the depth information update unit 112 of the depth information update device 101 need not include the calculation unit 121.
  • the depth information updating method according to the present invention corresponds to the depth information updating process of FIG. 7, the depth information updating process A of FIG. 9, or the depth information updating process B of FIG.
  • However, the depth information update method according to the present invention does not necessarily include all the corresponding steps in FIG. 7, FIG. 9, or FIG. 12. That is, the depth information update method according to the present invention may include only the minimum steps capable of realizing the effects of the present invention.
  • the order in which the steps in the depth information update method are performed is an example for specifically explaining the present invention, and may be an order other than the above. Also, some of the steps in the depth information update method and other steps may be performed in parallel independently of each other.
  • The present invention can be used as a depth information updating device capable of generating depth information for generating a stereoscopic video with improved expression of fine stereoscopic effect.
  • Reference signs: 50 stereoscopic target object; 100 stereoscopic video reproduction device; 101 depth information updating device; 110 stereoscopic video generation device; 111 depth information acquisition unit; 112 depth information update unit; 113 stereoscopic video generation unit; 121 calculation unit; 123 processing execution unit; 200 stereoscopic video display device; 210 display surface; 300 active shutter glasses; 1000 stereoscopic video viewing system

Abstract

A depth information updating device (101) includes: a depth information acquisition unit (111) that acquires depth information indicating a plurality of depth values; and a depth information update unit (112) that updates the depth information by performing enhancement processing, which is processing that enhances components of a specific band, on a depth image composed of the plurality of depth values indicated by the depth information.

Description

Depth information updating device, stereoscopic video generation device, and depth information updating method
The present invention relates to a depth information updating device, a stereoscopic video generation device, and a depth information updating method for updating depth information for generating a stereoscopic video.
Conventionally, video display devices using liquid crystal panels and the like have been used as devices for displaying two-dimensional video. Meanwhile, development is advancing on stereoscopic video display devices that let a viewer watch three-dimensional video (stereoscopic video) by alternately presenting, through active shutter glasses, two two-dimensional videos having parallax.
In recent years, various techniques for generating stereoscopic video have also been developed. For example, Patent Document 1 discloses a technique (hereinafter referred to as prior art A) that adjusts the depth values of a depth image for generating a stereoscopic video and then generates the stereoscopic video using the adjusted depth values.
JP 2003-209858 A (paragraphs 0060 and 0076)
However, prior art A has a problem in that it cannot improve the fine stereoscopic effect in a stereoscopic video.
Specifically, prior art A sets each depth value to a value between the maximum pop-out amount and the maximum recession amount so as to generate a stereoscopic video that causes little discomfort. This setting gives no particular consideration to improving the fine stereoscopic effect, so prior art A does not improve the fine stereoscopic effect among a plurality of adjacent pixels.
The present invention has been made to solve the above problem, and an object thereof is to provide a depth information updating device and the like capable of generating depth information for generating a stereoscopic video with improved expression of fine stereoscopic effect.
To achieve the above object, a depth information updating device according to one aspect of the present invention is a depth information updating device that performs processing using a depth value corresponding to each pixel constituting a stereoscopic video, and includes: a depth information acquisition unit that acquires depth information indicating a plurality of depth values for generating the stereoscopic video; and a depth information update unit that updates the depth information by performing enhancement processing, which is processing that enhances components of a specific band, on a depth image composed of the plurality of depth values indicated by the depth information.
In other words, the depth information updating device includes: a depth information acquisition unit that acquires depth information indicating a plurality of depth values; and a depth information update unit that updates the depth information by performing enhancement processing, which is processing that enhances components of a specific band, on a depth image composed of the plurality of depth values indicated by the depth information.
This makes it possible to generate depth information in which changes in depth value are emphasized, that is, depth information for generating a stereoscopic video with improved expression of fine stereoscopic effect.
Preferably, the depth information update unit performs the enhancement processing so as to increase, in at least a part of the depth image, the absolute value of the difference between the depth values of two neighboring pixels when that absolute value is less than a threshold.
Preferably, the depth information update unit includes: a calculation unit that performs, for every pixel constituting the depth image, processing of calculating a difference depth value, which is the absolute value of the difference between the depth value of a processing target pixel and the depth value of a pixel neighboring the processing target pixel; and a processing execution unit that, when a calculated difference depth value is less than a threshold, performs the enhancement processing on the processing target pixel corresponding to that difference depth value.
Preferably, the depth information update unit performs the enhancement processing as filter processing.
Preferably, the depth information update unit further updates the updated depth information by changing the plurality of depth values indicated by the updated depth information such that their absolute values decrease at substantially the same rate.
Preferably, the depth information update unit updates the updated depth information by multiplying the absolute values of the plurality of depth values indicated by the updated depth information by a coefficient less than one.
A stereoscopic video generation device according to one aspect of the present invention includes the above depth information updating device and a stereoscopic video generation unit that generates a new stereoscopic video using the plurality of depth values indicated by the updated depth information.
A depth information updating method according to one aspect of the present invention is performed by a depth information updating device that performs processing using a depth value corresponding to each pixel constituting a stereoscopic video. The method includes: a depth information acquisition step of acquiring depth information indicating a plurality of depth values for generating the stereoscopic video; and a depth information updating step of updating the depth information by performing enhancement processing, which is processing that enhances components of a specific band, on a depth image composed of the plurality of depth values indicated by the depth information.
Note that the present invention may realize all or part of the components constituting such a depth information updating device as a system LSI (Large Scale Integration).
The present invention may also be implemented as a program that causes a computer to execute the steps included in the depth information updating method, or as a computer-readable recording medium storing such a program; the program may also be distributed via a transmission medium such as the Internet.
According to the present invention, it is possible to generate depth information for generating a stereoscopic video with improved expression of fine stereoscopic effect.
FIG. 1 is a diagram showing an example of the configuration of a stereoscopic video viewing system according to the first embodiment.
FIG. 2 is a diagram for explaining a stereoscopic video.
FIG. 3 is a block diagram showing an example of the configuration of a stereoscopic video reproduction device.
FIG. 4 is a diagram showing a depth table as depth information.
FIG. 5 is a diagram showing the relationship between a stereoscopic target object in a stereoscopic video and depth values.
FIG. 6 is a diagram showing an example of a stereoscopic video showing a stereoscopic target object.
FIG. 7 is a flowchart of stereoscopic video generation processing.
FIG. 8 is a diagram for explaining changes in depth value.
FIG. 9 is a flowchart of stereoscopic video generation processing A.
FIG. 10 is a diagram showing the relationship between a stereoscopic target object in a stereoscopic video and depth values.
FIG. 11 is a block diagram showing an example of the configuration of a depth information update unit.
FIG. 12 is a flowchart of stereoscopic video generation processing B.
FIG. 13 is a diagram for explaining a processing target pixel and pixels neighboring the processing target pixel.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the following description, the same components are denoted by the same reference numerals; their names and functions are also the same, and detailed descriptions of them may therefore be omitted.
<First Embodiment>
FIG. 1 is a diagram showing an example of the configuration of a stereoscopic video viewing system 1000 according to the first embodiment. In FIG. 1, the X, Y, and Z directions are orthogonal to one another, as are the X, Y, and Z directions shown in the subsequent figures.
As shown in FIG. 1, the stereoscopic video viewing system 1000 includes a stereoscopic video reproduction device 100, a stereoscopic video display device 200, and active shutter glasses 300.
The stereoscopic video display device 200 is, for example, a plasma display, a liquid crystal display, or an organic EL display. It is not limited to these and may be a display of another type, for example a volume display type in which a plurality of liquid crystal panels are arranged in the depth direction.
In this specification, a video or a stereoscopic video may be either a moving image or a still image.
The stereoscopic video display device 200 includes a display surface 210 for displaying video. The display surface 210 is assumed to be parallel to the XY plane and, as an example, to be capable of displaying video composed of a plurality of pixels arranged in m rows and n columns (m and n being natural numbers).
Here, m and n are assumed to be 1080 and 1920, respectively; that is, the display surface 210 is assumed to be capable of displaying video of 1920 horizontal by 1080 vertical pixels (hereinafter also referred to as full HD size). In the following, video of a size that the display surface 210 can display is also referred to as displayable video.
The size of the displayable video is not limited to full HD size and may be, for example, 1280 horizontal by 720 vertical pixels.
In the present embodiment, the stereoscopic video display device 200 is, as an example, a device that displays stereoscopic video by a frame sequential method. In this case, the size of the stereoscopic video displayed on the display surface 210 is equal to the size of the displayable video.
The stereoscopic display method of the stereoscopic video display device 200 is not limited to the frame sequential method and may be, for example, a lenticular method, in which case the size of the stereoscopic video displayed on the display surface 210 is smaller than the size of the displayable video.
The stereoscopic video reproduction device 100 is connected to the stereoscopic video display device 200 by a signal cable 10, which is an HDMI (High-Definition Multimedia Interface) cable.
The signal cable 10 is not limited to an HDMI cable and may be, for example, a D-terminal cable or a coaxial cable. Communication between the stereoscopic video reproduction device 100 and the stereoscopic video display device 200 is not limited to wired communication and may be wireless.
The stereoscopic video reproduction device 100 transmits a signal indicating a stereoscopic video 30, which is a three-dimensional video, to the stereoscopic video display device 200 via the signal cable 10. The stereoscopic video 30 is composed of a left-eye video 31 and a right-eye video 32.
The left-eye video 31 is a video to be shown to the left eye (hereinafter also referred to as the first viewpoint) of the viewer (user), and the right-eye video 32 is a video to be shown to the viewer's right eye (hereinafter also referred to as the second viewpoint). The left-eye video 31 and the right-eye video 32 are two-dimensional videos having parallax with each other.
The stereoscopic video display device 200 alternately displays the left-eye video 31 and the right-eye video 32 on the display surface 210.
The active shutter glasses 300 shield the viewer's right eye while the left-eye video 31 is displayed on the display surface 210, and shield the viewer's left eye while the right-eye video 32 is displayed.
A viewer wearing the active shutter glasses 300 configured in this way sees the left-eye video 31 with the left eye and the right-eye video 32 with the right eye, and can therefore perceive the stereoscopic video 30 stereoscopically. That is, the stereoscopic video 30 is expressed using a first viewpoint video (the left-eye video 31) of the first viewpoint and a second viewpoint video (the right-eye video 32) of the second viewpoint.
As described above, the stereoscopic display method is not limited to the frame sequential method using the active shutter glasses 300; for example, a method using polarized glasses, or a method using a parallax barrier, a lenticular sheet, or the like, may be used.
The number of viewpoints required for the stereoscopic video display device 200 to display the stereoscopic video 30 is not limited to two and may be three or more.
Next, stereoscopic video will be described in detail. The stereoscopic video 30 is a video generated using a plurality of depth values, each of which corresponds to the parallax amount between the left-eye video and the right-eye video.
FIG. 2 is a diagram for explaining a stereoscopic video. A stereoscopic video is composed of a plurality of pixels; hereinafter, each pixel constituting a stereoscopic video is referred to as a stereoscopic display pixel. A plurality of depth values are respectively associated with the plurality of stereoscopic display pixels constituting a stereoscopic video.
In FIG. 2, the display surface 210 is a zero-parallax plane, that is, a plane on which pixels at the same position in the displayed left-eye video and right-eye video have zero parallax.
A depth value is larger than a predetermined zero-parallax reference value when the corresponding stereoscopic display pixel is placed (displayed) beyond the display surface 210 in the stereoscopic video, that is, when the pixel's recession amount from the display surface 210 is positive.
Here, the zero-parallax reference value is the value that places a stereoscopic display pixel at the position of the zero-parallax plane (the display surface 210).
A depth value is smaller than the zero-parallax reference value when the corresponding stereoscopic display pixel is placed (displayed) in front of the display surface 210 in the stereoscopic video, that is, when the pixel's pop-out amount from the display surface 210 is positive.
In other words, each depth value indicates the pop-out amount or recession amount, from the zero-parallax plane (the display surface 210), of the stereoscopic display pixel corresponding to that depth value.
As an example, the following description assumes that the depth value is expressed in the range of 0 to 255, taking the position of the viewer 40 as 0 and increasing with distance from the viewer 40, and that the zero-parallax reference value is 123. The depth value is not limited to this range and may be expressed, for example, in the range of 0 to 511; when the zero-parallax reference value is 0, the depth value may be expressed by positive and negative values.
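As a minimal sketch of this numeric convention only, assuming the 0-255 range and the zero-parallax reference value of 123 described above (the helper name is hypothetical):

```python
ZERO_PARALLAX_REFERENCE = 123  # example reference value from the text

def signed_depth(depth_value: int) -> int:
    """Convert a 0-255 depth value into a signed offset from the
    zero-parallax plane: negative means the pixel pops out in front of
    the display surface, positive means it recedes behind it, and 0
    places it on the surface itself."""
    return depth_value - ZERO_PARALLAX_REFERENCE
```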
In the following, a pixel displayed in order to produce the stereoscopic display is also referred to as a pixel for stereoscopic display.
First, the case where a stereoscopic display pixel is placed in front of the display surface 210 in the stereoscopic video, that is, where its pop-out amount is positive, will be described.
In FIG. 2, the position at which the depth value equals the zero-parallax reference value (123) corresponds to the position of the display surface 210 (the zero-parallax plane) in the Z direction.
As shown in FIG. 2, suppose that pixels for stereoscopic display are displayed on the display surface 210 at, for example, position P11 in the left-eye video and position P12 in the right-eye video. In this case, through the operation of the active shutter glasses 300, the viewer 40 sees the pixel displayed at position P11 with the left eye 41 and the pixel displayed at position P12 with the right eye 42.
As a result, to the viewer 40, the stereoscopic display pixel appears to be placed at position P10, a distance d1 in front of the display surface 210. In this case, the depth value of the stereoscopic display pixel is less than the zero-parallax reference value (123).
The horizontal distance between position P12 and position P11 corresponds to the parallax amount, which in turn corresponds to the pop-out amount d1 of the stereoscopic display pixel. When the depth value equals the zero-parallax reference value, the stereoscopic display pixel appears to the viewer 40 to be placed on the display surface 210 (the zero-parallax plane).
The relationship between the parallax amount and the depth value of the stereoscopic display pixel may be given by a predetermined conversion formula. For example, let the parallax amount be d, the depth value of the stereoscopic display pixel be L, the depth value of the zero-parallax plane be L_b, and the distance between the left eye 41 and the right eye 42 of the viewer 40 (the interpupillary distance) be e; a conversion formula in these quantities may then be used (Equation 1, provided only as an image in the original). In this case, the depth value L is proportional to the difference in distance along the Z axis between the viewer 40 and P10.
Next, the case where a stereoscopic display pixel is placed beyond the display surface 210 in the stereoscopic video, that is, where its recession amount is positive, will be described.
As shown in FIG. 2, suppose that pixels for stereoscopic display are displayed on the display surface 210 at, for example, position P21 in the left-eye video and position P22 in the right-eye video. In this case, through the operation of the active shutter glasses 300, the viewer 40 sees the pixel displayed at position P21 with the left eye 41 and the pixel displayed at position P22 with the right eye 42.
As a result, to the viewer 40, the stereoscopic display pixel appears to be placed at position P20, a distance d2 beyond the display surface 210. In this case, the depth value of the stereoscopic display pixel is larger than the zero-parallax reference value (123).
The relationship between the parallax amount and the depth value of the stereoscopic display pixel may likewise be given by a predetermined conversion formula in d, L, L_b, and e as defined above (Equation 2, provided only as an image in the original). In this case as well, the depth value L is proportional to the difference in distance along the Z axis between the viewer 40 and P20.
Next, the configuration of the stereoscopic video reproduction device 100 will be described.
FIG. 3 is a block diagram showing an example of the configuration of the stereoscopic video reproduction device 100.
As shown in FIG. 3, the stereoscopic video reproduction device 100 includes a stereoscopic video generation device 110; it also includes a processing unit and the like, not shown.
The stereoscopic video generation device 110 is a device for generating stereoscopic video. Although the details will be described later, the stereoscopic video generation device 110 acquires depth information of a stereoscopic video to be processed, updates the depth information, and generates a new stereoscopic video using the updated depth information. Here, the depth information is information indicating a plurality of depth values respectively corresponding to the plurality of stereoscopic display pixels constituting the stereoscopic video to be processed.
The stereoscopic video generation device 110 receives a stereoscopic video signal SG1, which is a signal indicating stereoscopic video data, that is, data of a left-eye video and a right-eye video, together with depth information corresponding to the stereoscopic video.
The data and information indicated by the stereoscopic video signal SG1 are not limited to the above; the signal may indicate, for example, data of a two-dimensional video and depth information for converting the two-dimensional video into a stereoscopic video.
The stereoscopic video signal SG1 is, for example, a signal indicating data read from a recording medium such as a BD (Blu-ray Disc (registered trademark)). In this case, the above-mentioned processing unit included in the stereoscopic video reproduction device 100 performs the processing of reading the stereoscopic video data and the depth information from the recording medium.
The stereoscopic video signal SG1 is not limited to the above and may be, for example, a signal acquired from a broadcast wave, in which case the above-mentioned processing unit has, for example, the functions of a tuner and a demodulation circuit.
Here, the depth information will be described. The depth information is expressed, for example, as the following depth table T100.
FIG. 4 is a diagram showing the depth table T100 as depth information, for the case where the stereoscopic video is one frame of a moving image or a still image.
The depth table T100 indicates a plurality of depth values respectively corresponding to the plurality of stereoscopic display pixels constituting the stereoscopic video. These depth values are values for generating the stereoscopic video, and they are arranged in a matrix, like the stereoscopic display pixels themselves.
Here, if each depth value indicated by the depth table T100 is regarded as a pixel value, the depth table T100 as depth information represents an image composed of the plurality of depth values (hereinafter also referred to as a depth image). The depth image corresponds to a depth map (disparity map), and the value of each pixel constituting the depth image is a depth value.
In FIG. 4, for example, Cmn indicates the depth value of the pixel corresponding to row m, column n of the stereoscopic video, and C12, for example, indicates the depth value of the pixel corresponding to row 1, column 2.
Each depth value indicated by the depth table T100 is not limited to a depth value corresponding to one pixel; it may be, for example, a depth value corresponding to four pixels, in which case the number of depth values indicated by the depth table T100 is one quarter of the number of stereoscopic display pixels constituting the stereoscopic video.
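A minimal sketch of this representation, assuming numpy and the full HD size used in the embodiment; the 2x2-block expansion for the four-pixels-per-value case is illustrative only:

```python
import numpy as np

M, N = 1080, 1920  # full HD size assumed in the embodiment

# Depth table T100 as a depth image: one depth value per pixel,
# 0 = nearest to the viewer, 255 = farthest, 123 = zero-parallax plane.
depth_image = np.full((M, N), 123, dtype=np.uint8)

# If each table entry instead covers a 2x2 block of pixels, the table
# holds (M/2) x (N/2) values and can be expanded to pixel resolution:
quarter_table = np.full((M // 2, N // 2), 123, dtype=np.uint8)
expanded = quarter_table.repeat(2, axis=0).repeat(2, axis=1)
assert expanded.shape == (M, N)
```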
Next, stereoscopic video and depth values will be described.
FIG. 5 is a diagram showing the relationship between a stereoscopic target object in a stereoscopic video and depth values. Here, a stereoscopic target object is an object to be displayed stereoscopically in the stereoscopic video.
FIG. 5(a) is a perspective view showing the positional relationship between a stereoscopic target object 50 in the stereoscopic video and the display surface 210. FIG. 5(b) is a graph for explaining the depth values of the stereoscopic target object 50; its vertical axis indicates the depth value (Z direction), and its horizontal axis indicates the X direction (the horizontal direction of the stereoscopic video).
FIG. 6 is a diagram showing an example of a stereoscopic video G100 showing the stereoscopic target object 50. FIG. 6 shows the stereoscopic video G100 parallel to the XY plane; to explain the depth values, it also shows a depth line L10 that does not actually appear in the stereoscopic video G100.
The depth line L10 is a line indicating the n depth values respectively corresponding to the n stereoscopic display pixels arranged along a certain line of the stereoscopic video G100.
As shown in FIG. 5(a), the stereoscopic target object 50 is placed, for example, in front of the display surface 210 in the stereoscopic video. In FIG. 5(a), the depth line L10 is drawn on the stereoscopic target object 50 and the display surface 210.
FIG. 5(b) is a graph showing the depth line L10.
In FIG. 5(b), the portion of the depth line L10 inside region R10 corresponds to the depth line L10 drawn on the stereoscopic target object 50, and shows that the surface of the stereoscopic target object 50 has unevenness.
Note that when the depth table T100 is a table corresponding to the stereoscopic video G100, the depth image indicated by the depth table T100 is the stereoscopic video G100 expressed in gray scale.
Next, the configuration of the stereoscopic video generation device 110 will be described.
As shown in FIG. 3, the stereoscopic video generation device 110 includes a depth information updating device 101 and a stereoscopic video generation unit 113.
The depth information updating device 101 performs processing using the depth value corresponding to each pixel constituting a stereoscopic video.
The depth information updating device 101 includes a depth information acquisition unit 111 and a depth information update unit 112. The depth information acquisition unit 111 and the stereoscopic video generation unit 113 receive the stereoscopic video signal SG1.
The depth information acquisition unit 111 acquires the depth information indicated by the received stereoscopic video signal SG1, for example the depth table T100 of FIG. 4, and transmits the acquired depth information to the depth information update unit 112.
Note that when the stereoscopic video signal SG1 is a signal indicating a multi-view video, the depth information acquisition unit 111 acquires the depth information by, for example, performing stereo matching, which is a known technique. When the stereoscopic video signal SG1 contains depth information, values obtained by applying a specific conversion formula to that depth information may be acquired as the depth information.
The depth information update unit 112 performs processing, described in detail later, of updating the received depth information, and transmits the updated depth information to the stereoscopic video generation unit 113.
In the following, the updated depth information that the depth information update unit 112 transmits to the stereoscopic video generation unit 113 is referred to as the updated depth information.
The stereoscopic video generation unit 113, as described in detail later, generates a new stereoscopic video using the stereoscopic video data indicated by the received stereoscopic video signal SG1 and the received updated depth information.
The stereoscopic video generation unit 113 then transmits a stereoscopic video signal SG2 indicating the generated new stereoscopic video (a left-eye video and a right-eye video) to the stereoscopic video display device 200.
The stereoscopic video display device 200 alternately displays, frame by frame, the left-eye video and the right-eye video indicated by the received stereoscopic video signal SG2 on the display surface 210.
Next, the processing for generating a stereoscopic video in the present embodiment (hereinafter also referred to as stereoscopic video generation processing) will be described.
FIG. 7 is a flowchart of the stereoscopic video generation processing.
In step S110, the depth information acquisition unit 111 acquires the depth information indicated by the stereoscopic video signal SG1, for example the depth table T100 of FIG. 4, and transmits the acquired depth information to the depth information update unit 112.
In step S120, depth information update processing is performed. In the depth information update processing, the depth information update unit 112 updates the depth information by performing processing that enhances components of a specific band (hereinafter also referred to as enhancement processing) on the depth image composed of the plurality of depth values indicated by the depth information (the depth table T100). Here, the specific band is a specific frequency band.
The depth information update processing will be described in detail below. In the following, the stereoscopic video to be processed is referred to as the processing target stereoscopic video, which is described here as being a still image, taking the stereoscopic video G100 as an example. In this case, the depth image processed in the depth information update processing corresponds to the stereoscopic video G100.
In step S125, depth enhancement processing is performed. In the depth enhancement processing, the depth information update unit 112 performs, on the depth image composed of the plurality of depth values indicated by the depth table T100 as the depth information, filter processing that enhances (enlarges) changes in depth value in a certain specific band. That is, the depth information update unit 112 performs enhancement processing, which is processing that enhances components of a specific band, on the depth image. The enhancement processing is, for example, processing of increasing, in at least a part of the depth image, the absolute value of the difference between the depth values of two neighboring pixels when that absolute value is less than a threshold. This threshold is the one used in step S122 described later and is, for example, 20.
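A minimal sketch of this selective enhancement, assuming a numpy depth image in the 0-255 convention; the gain factor is hypothetical, since the text specifies the threshold (for example, 20) but not the amount of amplification:

```python
import numpy as np

THRESHOLD = 20  # example threshold from the text
GAIN = 1.5      # hypothetical amplification factor, not from the patent

def enhance_small_depth_differences(depth: np.ndarray) -> np.ndarray:
    """Where the depth difference between a pixel and its left neighbor
    is small (below THRESHOLD), push the pixel further from that
    neighbor, enlarging fine depth variation while leaving large depth
    steps (such as object boundaries) untouched."""
    d = depth.astype(np.float32)
    out = d.copy()
    left = d[:, :-1]                  # left neighbor of each pixel
    diff = d[:, 1:] - left            # signed depth difference
    small = np.abs(diff) < THRESHOLD  # fine-detail mask
    out[:, 1:][small] = (left + GAIN * diff)[small]
    return np.clip(out, 0, 255).astype(np.uint8)
```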
When the depth image indicated by the depth table T100 is the target of this filter processing, enhancing changes in the depth values in a high-frequency band, for example, can be regarded as equivalent to sharpening processing in image processing. That is, enhancing changes in depth value in a specific band can be performed by, for example, an FIR filter, just like a filter in image processing.
More specifically, in the depth enhancement processing, the depth information update unit 112 performs filter processing using a two-dimensional FIR (Finite Impulse Response) filter on the depth image indicated by the depth table T100.
In the following, a "change in depth value" is also referred to as a "change in depth" or a "depth change".
The filter processing of step S125 is performed with one of the pixels constituting the depth image as the pixel of interest; each time the processing of step S125 is performed, a different pixel is taken as the pixel of interest.
The depth table T100 as depth information is thereby updated, once each time the processing of step S125 is performed. The FIR filter used for the filter processing is, for example, a sharpening filter using an unsharp mask.
The filter used for the filter processing is not limited to this sharpening filter and may be any other filter that enhances changes in depth value in a specific band.
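A minimal sketch of the unsharp-mask variant named above, assuming numpy and scipy; the Gaussian sigma and amount are illustrative parameters, not values taken from the patent:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask_depth(depth: np.ndarray, sigma: float = 2.0,
                       amount: float = 1.0) -> np.ndarray:
    """Sharpen a depth image: a Gaussian blur removes fine depth
    variation, the difference from the original isolates that band of
    variation, and adding the difference back enhances depth changes
    in the band."""
    d = depth.astype(np.float32)
    blurred = gaussian_filter(d, sigma=sigma)  # low-pass copy
    detail = d - blurred                       # band-limited depth detail
    return np.clip(d + amount * detail, 0, 255).astype(np.uint8)
```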
When the processing of step S125 has been performed on all the pixels constituting the depth image, the depth information update unit 112 transmits the updated depth table T100 to the stereoscopic video generation unit 113 as the updated depth information. The depth information update processing of step S120 then ends, and the processing proceeds to step S130.
Through the depth information update processing of step S120, the amplitude of the depth line L10 within region R10 shown in FIG. 8(a) is enlarged as shown in FIG. 8(b). That is, the processing of step S125 emphasizes the depth (stereoscopic effect) of the stereoscopic target object 50.
In step S130, the stereoscopic video generation unit 113, as described above, generates a new stereoscopic video using the stereoscopic video data indicated by the received stereoscopic video signal SG1, that is, the data of the left-eye video and the right-eye video, and the received updated depth information (the updated depth table T100).
Specifically, the stereoscopic video generation unit 113 calculates the parallax amounts of the left-eye video and the right-eye video using the plurality of depth values indicated by the updated depth information, and updates one or both of the left-eye video and the right-eye video indicated by the stereoscopic video signal SG1 using the calculated parallax amounts. A new stereoscopic video composed of a new left-eye video and a new right-eye video is thereby generated.
Note that the processing of generating a stereoscopic video using depth information is, for example, processing in accordance with the MVC (Multiview Video Coding) standard, so detailed descriptions of the above processing of calculating the parallax amounts and of updating the left-eye and right-eye videos are omitted.
When the received stereoscopic video signal SG1 indicates data of a two-dimensional video and depth information for converting that two-dimensional video into a stereoscopic video, the stereoscopic video generation unit 113 calculates the parallax amounts using the plurality of depth values indicated by the depth information, and generates a new left-eye video and a new right-eye video from the two-dimensional video using the calculated parallax amounts. A new stereoscopic video composed of the new left-eye video and right-eye video is thereby generated.
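A minimal sketch of the two-dimensional-video-plus-depth case, assuming a simple horizontal pixel shift proportional to the signed depth. This is a common depth-image-based rendering scheme standing in for the MVC-standard processing the text defers to; the scale factor and the hole handling are illustrative assumptions:

```python
import numpy as np

def render_view(image: np.ndarray, depth: np.ndarray,
                eye: int, scale: float = 0.05) -> np.ndarray:
    """Warp a 2D image into one eye's view: each pixel is shifted
    horizontally by a parallax proportional to its signed depth
    (depth - 123). Use eye=-1 for the left-eye video and eye=+1 for
    the right-eye video. Pixels left uncovered by the shift keep the
    source image's values here; real renderers inpaint such holes."""
    h, w = depth.shape
    out = image.copy()
    parallax = (depth.astype(np.float32) - 123) * scale
    xs = np.arange(w)
    for y in range(h):
        tx = np.clip((xs + eye * parallax[y]).astype(int), 0, w - 1)
        out[y, tx] = image[y, xs]
    return out

# left_eye = render_view(frame, depth_image, eye=-1)
# right_eye = render_view(frame, depth_image, eye=+1)
```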
The stereoscopic video generation unit 113 then transmits the stereoscopic video signal SG2 indicating the generated new stereoscopic video (the left-eye video and the right-eye video) to the stereoscopic video display device 200.
The stereoscopic video generation processing described above applies when the processing target stereoscopic video is a still image; when the processing target stereoscopic video is a moving image, the above stereoscopic video generation processing is repeated for each frame constituting the moving image.
As described above, according to the present embodiment, the depth information update unit 112 performs processing that enhances components of a specific band (changes in depth value) on the depth image composed of the plurality of depth values indicated by the depth information (the depth table T100).
This makes it possible to generate depth information that sufficiently emphasizes changes in stereoscopic effect in portions where the change in depth value is small, that is, depth information for generating a stereoscopic video with improved expression of fine stereoscopic effect and thus with improved quality (stereoscopic effect).
Since a stereoscopic video is generated using such depth information, the present embodiment makes it possible to generate a stereoscopic video in which the expression of fine stereoscopic effect is improved.
<Modification 1 of First Embodiment>
In Modification 1 of the first embodiment, processing for generating a stereoscopic video that eliminates the cardboard-cutout-like stereoscopic effect will be described. Here, the cardboard-cutout-like stereoscopic effect is, for example, a stereoscopic impression in which each of a plurality of flat planes representing an object image appears to be placed at a different depth, that is, a stereoscopic impression in which the three-dimensionality of the object image itself is hardly expressed.
The stereoscopic video viewing system in Modification 1 of the first embodiment is the stereoscopic video viewing system 1000 of FIG. 1; that is, the stereoscopic video reproduction device in Modification 1 is the stereoscopic video reproduction device 100 of FIG. 3, so a detailed description of its configuration will not be repeated. The stereoscopic video generation device in Modification 1 is the stereoscopic video generation device 110 of FIG. 3.
Next, the processing for generating a stereoscopic video in Modification 1 of the present embodiment (hereinafter also referred to as stereoscopic video generation processing A) will be described.
FIG. 9 is a flowchart of stereoscopic video generation processing A. In FIG. 9, steps with the same step numbers as in FIG. 7 involve the same processing as described in the first embodiment, so their detailed description will not be repeated; the following description focuses on the differences from the first embodiment.
First, the depth information acquisition unit 111 acquires the depth information indicated by the stereoscopic video signal SG1 (S110).
Then, the processing of step S120A is performed.
In step S120A, depth information update processing A is performed. Compared with the depth information update processing of FIG. 7, depth information update processing A differs in that the processing of step S125A is performed instead of step S125 and that the processing of step S126 is additionally performed; the rest is the same as the depth information update processing, so its detailed description will not be repeated.
In step S125A, depth enhancement processing A is performed. In depth enhancement processing A, as in the first embodiment, the depth information update unit 112 performs, on the depth image composed of the plurality of depth values indicated by the depth information (the depth table T100), filter processing that enhances (enlarges) changes in depth value in a specific band. That is, the depth information update unit 112 performs enhancement processing, which is processing that enhances components of a specific band, on the depth image.
The filter processing of step S125A is performed with one of the pixels constituting the depth image indicated by the depth information (the depth table T100) as the pixel of interest; each time the processing of step S125A is performed, a different pixel is taken as the pixel of interest.
When the processing of step S125A has been performed on all the pixels constituting the depth image, depth enhancement processing A ends, and the processing proceeds to step S126.
The depth table T100 as depth information is updated by the depth enhancement processing A of step S125A. In the following, the depth information updated by depth enhancement processing A is referred to as the first updated depth information; it is the updated depth table T100, which is hereinafter also referred to as the first updated depth table.
Through depth enhancement processing A, the amplitude of the depth line L10 within region R10 shown in FIG. 8(a) is enlarged as shown in FIG. 8(b). That is, the processing of step S125A emphasizes the depth (stereoscopic effect) of the stereoscopic target object 50.
 ステップS126では、奥行き圧縮処理が行われる。奥行き圧縮処理では、奥行き情報更新部112が、更新後の奥行き情報が示す複数の奥行き値の絶対値が、ほぼ同じ割合で小さくなるように、該更新後の奥行き情報が示す複数の奥行き値を変更することにより、該更新後の奥行き情報を更新する。 In step S126, depth compression processing is performed. In the depth compression process, the depth information updating unit 112 sets the plurality of depth values indicated by the updated depth information such that the absolute values of the plurality of depth values indicated by the updated depth information decrease at substantially the same rate. By changing, the updated depth information is updated.
 具体的には、奥行き情報更新部112が、第1更新済奥行き情報が示す複数の奥行き値の絶対値に1未満の係数を乗算することにより、該第1更新済奥行き情報を更新する。1未満の係数は、例えば、0.4~0.7の範囲の値である。 Specifically, the depth information updating unit 112 updates the first updated depth information by multiplying the absolute value of the plurality of depth values indicated by the first updated depth information by a coefficient less than one. A coefficient less than 1 is, for example, a value in the range of 0.4 to 0.7.
 この処理により、視差ゼロ面を中心として、第1更新済奥行き情報が示す複数の奥行き値を小さくすることができる。 By this processing, it is possible to reduce the plurality of depth values indicated by the first updated depth information, centering on the zero parallax surface.
 なお、第1更新済奥行き情報を更新する処理は、上記処理に限定されない。例えば、第1更新済奥行き情報を更新する処理は、第1更新済奥行き情報が示す複数の奥行き値の絶対値から、所定の値を減算する処理であってもよい。 In addition, the process which updates 1st updated depth information is not limited to the said process. For example, the process of updating the first updated depth information may be a process of subtracting a predetermined value from the absolute values of the plurality of depth values indicated by the first updated depth information.
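 As a minimal sketch, the depth compression of step S126 could look as follows, assuming signed depth values with 0 on the zero-parallax plane. The coefficient 0.5 is one value from the 0.4 to 0.7 range given above, and the subtraction offset of the variant is an illustrative assumption.

```python
import numpy as np

def compress_depth(depth, coeff=0.5):
    """Shrink all depth magnitudes at substantially the same rate (S126)."""
    # The sign (in front of / behind the display surface) is preserved;
    # only the magnitude relative to the zero-parallax plane shrinks.
    return coeff * np.asarray(depth, dtype=np.float64)

def compress_depth_by_offset(depth, offset=10.0):
    """Variant: subtract a predetermined value from each magnitude."""
    d = np.asarray(depth, dtype=np.float64)
    return np.sign(d) * np.maximum(np.abs(d) - offset, 0.0)
```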
 Suppose here that the processing target stereoscopic video is the stereoscopic video G100 of FIG. 6. In this case, the depth compression processing reduces the overall amplitude of the depth line L10 shown in FIG. 8(b), as shown in FIG. 10(a). As a result, as shown in FIG. 10(b), the process of step S126 reduces the distance between the stereoscopic target object 50, whose stereoscopic effect has been emphasized, and the display surface 210 (zero-parallax plane). The greater the distance between the stereoscopic target object 50 and the display surface 210 (zero-parallax plane), the harder it is for the viewer to perceive the stereoscopic effect of the stereoscopic target object 50.
 Therefore, the first updated depth information as further updated to reduce the distance between the stereoscopic target object 50, whose stereoscopic effect has been emphasized, and the display surface 210 (zero-parallax plane) is information for generating a stereoscopic video from which the cardboard-cutout stereoscopic effect has been eliminated.
 Hereinafter, the first updated depth information after this further update is referred to as the second updated depth information; it is the updated first updated depth table.
 The depth information update unit 112 transmits the second updated depth information, as the updated depth information, to the stereoscopic video generation unit 113.
 Then, as in the first embodiment, the process of step S130 is performed.
 Stereoscopic video generation processing A described above applies when the processing target stereoscopic video is a still image; when the processing target stereoscopic video is a moving image, stereoscopic video generation processing A is repeated for each frame constituting the moving image.
 As described above, according to the first modification of the present embodiment, the depth information update unit 112 updates the first updated depth information by changing the plurality of depth values it indicates so that their absolute values decrease at substantially the same rate. The first updated depth information is generated by the process of step S125A, which is the same as the process of step S125 of the first embodiment.
 This makes it possible to generate depth information (the further-updated first updated depth information) that improves the expression of fine stereoscopic effect while eliminating the cardboard-cutout stereoscopic effect. For example, it is possible to generate depth information that emphasizes the stereoscopic effect of the stereoscopic target object 50 of the stereoscopic video G100 while eliminating the cardboard-cutout stereoscopic effect.
 A stereoscopic video is then generated using this further-updated first updated depth information. Therefore, according to this modification of the present embodiment, it is possible to generate a stereoscopic video in which the expression of fine stereoscopic effect is improved and the cardboard-cutout stereoscopic effect is eliminated.
 Furthermore, since the absolute values of the plurality of depth values indicated by the first updated depth information decrease at substantially the same rate, eye fatigue caused by an excessive amount of parallax can be reduced when a viewer views a stereoscopic video generated according to this modification.
 <Modification 2 of the First Embodiment>
 In the second modification of the first embodiment, processing for generating a stereoscopic video in which the cardboard-cutout stereoscopic effect is eliminated will be described.
 The stereoscopic video viewing system in the second modification of the first embodiment is the stereoscopic video viewing system 1000 of FIG. 1. That is, the stereoscopic video reproduction device in the second modification is the stereoscopic video reproduction device 100 of FIG. 3, so a detailed description of its configuration will not be repeated. The stereoscopic video generation device in the second modification is the stereoscopic video generation device 110 of FIG. 3.
 FIG. 11 is a block diagram showing an example of the configuration of the depth information update unit 112.
 As shown in FIG. 11, the depth information update unit 112 includes a calculation unit 121 and a processing execution unit 123.
 As described in detail later, the calculation unit 121 performs, for every pixel constituting the depth image, a process of calculating a difference depth value, which is the absolute value of the difference between the depth value of the pixel being processed (the processing target pixel) and the depth value of a pixel close to the processing target pixel.
 As described in detail later, when a calculated difference depth value is less than a threshold, the processing execution unit 123 updates the depth information by performing the enhancement processing on the processing target pixel corresponding to that difference depth value.
 Next, the processing for generating a stereoscopic video in the second modification of the present embodiment (hereinafter also referred to as stereoscopic video generation processing B) will be described.
 FIG. 12 is a flowchart of stereoscopic video generation processing B. In FIG. 12, steps having the same step numbers as in FIG. 7 are performed in the same manner as described in the first embodiment, so their detailed description will not be repeated. The following description focuses on the differences from the first embodiment.
 First, the depth information acquisition unit 111 acquires the depth information indicated by the stereoscopic video signal SG1 and transmits the acquired depth information to the depth information update unit 112 (S110).
 Then, the process of step S120B is performed.
 In step S120B, depth information update processing B is performed. Depth information update processing B differs from the depth information update processing of FIG. 7 in that the process of step S125B is performed instead of step S125 and that the processes of steps S121 and S122 are additionally performed. The remainder of depth information update processing B is the same as the depth information update processing, so its detailed description will not be repeated.
 Depth information update processing B is described in detail below. Hereinafter, the stereoscopic video to be processed is referred to as the processing target stereoscopic video; it is assumed here to be a still image and, as an example, to be the stereoscopic video G100. In this case, the depth image processed in depth information update processing B corresponds to the stereoscopic video G100.
 In step S121, the calculation unit 121 calculates, for each pair of adjacent pixels among the plurality of pixels constituting the depth image indicated by the depth information (depth table T100) indicated by the stereoscopic video signal SG1, the absolute value of the difference between their depth values (hereinafter referred to as the difference depth value). This depth image corresponds to the processing target stereoscopic video.
 Specifically, the calculation unit 121 first sets one of the pixels constituting the depth image as the processing target pixel.
 FIG. 13 is a diagram for explaining a processing target pixel and the pixels close to it.
 In FIG. 13, when the processing target pixel P30 is not on the upper, lower, left, or right edge of the processing target stereoscopic video, the pixels P31, P32, P33, and P34 are the pixels close to the processing target pixel P30.
 The pixels P31 and P34 are close to the processing target pixel P30 in the vertical direction, and the pixels P32 and P33 are close to it in the horizontal direction.
 Hereinafter, each of the pixels P31, P32, P33, and P34 is referred to as a processing target neighboring pixel.
 When the processing target pixel is the pixel in the first row and first column (upper left corner) of the processing target stereoscopic video, the processing target neighboring pixels are only the pixels P33 and P34; when it is the pixel in the first row and n-th column (upper right corner), they are only the pixels P32 and P34.
 When the processing target pixel is the pixel in the m-th row and first column (lower left corner), the processing target neighboring pixels are only the pixels P31 and P33; when it is the pixel in the m-th row and n-th column (lower right corner), they are only the pixels P31 and P32.
 The processing target neighboring pixels may also be pixels diagonally close to the processing target pixel P30, such as the pixels P35, P36, P37, and P38 in FIG. 13.
 Further, a pixel two or more pixels away, such as the pixel P41, may be used as a processing target neighboring pixel instead of the pixel P31; similarly, P42 may be used instead of the pixel P32, P43 instead of the pixel P33, and P44 instead of the pixel P34.
 Referring again to FIG. 12, in step S121 the calculation unit 121 calculates the absolute value of the difference between the depth value of the processing target pixel P30 and the depth value of each processing target neighboring pixel (hereinafter referred to as the difference depth value).
 At this time, when obtaining the difference depth value between the pixels P30 and P41 in FIG. 13, a sum of the depth values of the pixel P30 and its surrounding pixels, weighted at fixed ratios, may be used instead of the depth value of the pixel P30 alone. Similarly, a sum of the depth values of the pixel P41 and its surrounding pixels, weighted at fixed ratios, may be used instead of the depth value of the pixel P41 alone.
 For example, the depth values of the pixels P31, P30, and P32 added at a ratio of 1:2:1 may be used instead of the depth value of the pixel P30, and the depth values of the pixels P41A, P41, and P41B added at a ratio of 1:2:1 may be used instead of the depth value of the pixel P41; the difference between these two values may then be used as the difference depth value, as sketched below.
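 A minimal sketch of this 1:2:1 weighted variant follows. Since FIG. 13 is not reproduced here, the sketch assumes the flanking pixels lie in the horizontal direction and normalizes the weighted sum by 4 so that the result stays in depth-value units; both choices are assumptions.

```python
import numpy as np

def weighted_diff_depth(depth, y, x, y2, x2):
    """1:2:1 smoothed difference depth value between interior pixels.

    (y, x) plays the role of P30 and (y2, x2) that of P41; each depth
    value is replaced by a 1:2:1 weighted mean of the pixel and its two
    assumed horizontal flanking pixels before the difference is taken."""
    a = (depth[y, x - 1] + 2.0 * depth[y, x] + depth[y, x + 1]) / 4.0
    b = (depth[y2, x2 - 1] + 2.0 * depth[y2, x2] + depth[y2, x2 + 1]) / 4.0
    return abs(a - b)
```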
 In step S122, the calculation unit 121 determines whether at least one of the difference depth values calculated in step S121 is equal to or greater than a predetermined threshold.
 Here, the threshold is, for example, a value of 5 to 20% of the difference between the maximum and minimum values that can be set as the depth value; when the depth value is expressed in the range of 0 to 255, the threshold is, for example, 20.
 If the determination in step S122 is NO, that is, if all the calculated difference depth values are less than the threshold, the processing proceeds to step S125B. If the determination in step S122 is YES, the process of step S121 is performed again for the next pixel.
 In step S125B, depth enhancement processing B is performed. In depth enhancement processing B, the processing execution unit 123, with the processing target pixel as the pixel of interest and as in the first embodiment, applies to the depth image composed of the plurality of depth values indicated by the depth table T100 serving as the depth information a filter process that emphasizes (enlarges) changes in depth value within a specific band. That is, the processing execution unit 123 performs enhancement processing, which is processing that emphasizes the components of a specific band, on the depth image composed of the plurality of depth values indicated by the depth information (depth table T100).
 Step S125B is executed only when the determination in step S122 is NO. That is, when the calculated difference depth values are less than the threshold, the processing execution unit 123 updates the depth information by performing the enhancement processing on the processing target pixel corresponding to those difference depth values.
 The above processes of steps S121, S122, and S125B are repeated for all the pixels constituting the depth image; each time the process of step S121 is performed, a different pixel among the pixels constituting the depth image is set as the processing target pixel.
 In this way, the calculation unit 121 performs, for every pixel constituting the depth image, the process of calculating the difference depth value, which is the absolute value of the difference between the depth value of the processing target pixel and the depth value of a pixel close to the processing target pixel.
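 Taken together, steps S121, S122, and S125B amount to a selective enhancement of the depth image. The following sketch assumes 8-bit depth values, the example threshold of 20, up/down/left/right neighboring pixels, and the same assumed band-emphasis filter as in the earlier sketch; it illustrates the described flow, not the disclosed implementation itself.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def selective_emphasis(depth, thresh=20.0, sigma=2.0, gain=1.5):
    """Emphasize depth only where every difference depth value < thresh."""
    d = depth.astype(np.float64)
    # Assumed band-emphasis filter, precomputed once (plays the role of S125B).
    emphasized = np.clip(d + gain * (d - gaussian_filter(d, sigma)), 0.0, 255.0)
    out = d.copy()
    h, w = d.shape
    for y in range(h):
        for x in range(w):
            # S121: difference depth values against the up/down/left/right
            # neighbours that exist (edge pixels have fewer neighbours).
            diffs = [abs(d[y, x] - d[ny, nx])
                     for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                     if 0 <= ny < h and 0 <= nx < w]
            # S122 -> S125B: emphasize only when no difference reaches the
            # threshold, so depth steps between different objects are kept.
            if diffs and max(diffs) < thresh:
                out[y, x] = emphasized[y, x]
    return out
```

 Because pixels on an object boundary fail the threshold test, the original depth step between two objects is left untouched while the depth variation inside each object is enlarged.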
 As a result, the depth information is updated as in the first embodiment.
 Furthermore, since depth enhancement processing B is executed only when the determination in step S122 is NO, changes in depth are emphasized within ranges regarded as the same object (for example, the stereoscopic target object 50) in the depth image indicated by the updated depth information (depth table T100), while changes in depth at the boundary between two objects at different depths are not emphasized.
 Here, when the change in depth between two objects at different depths is far larger than the change in depth within a single object, the change in depth within the single object is hardly expressed. That is, when the change in depth value at the boundary between an object and another object is far larger than the change in depth value within the range regarded as that single object, the change in depth within the object is hardly expressed. In this case, the viewer 40 perceives a cardboard-cutout stereoscopic effect.
 With depth information update processing B, however, the enhancement processing is performed on a processing target pixel only when its calculated difference depth values are less than the threshold. This makes it possible to emphasize changes in depth within a single object without emphasizing changes in depth between two objects at different depths; that is, the cardboard-cutout stereoscopic effect, in which changes in depth within a single object are hardly expressed, can be eliminated.
 Then, as in the first embodiment, the process of step S130 is performed.
 In depth information update processing B, the depth compression processing of step S126 may also be performed, as in the first modification of the first embodiment, immediately after the processes of S121, S122, and S125B have been repeated for all the pixels constituting the depth image.
 Although the depth information update device and the stereoscopic video generation device according to the present invention have been described above based on the embodiments, the present invention is not limited to these embodiments. Forms obtained by applying various modifications conceivable by those skilled in the art to the present embodiments, and forms constructed by combining components of different embodiments, are also included within the scope of the present invention without departing from the spirit of the present invention.
 For example, the process of updating the depth information may be performed on frequency-converted values, as sketched below.
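 As one hypothetical example of such a frequency-domain update, the depth image could be transformed, a band of its coefficients scaled, and the result transformed back; the band limits and gain below are illustrative assumptions.

```python
import numpy as np

def emphasize_in_frequency(depth, lo=0.05, hi=0.25, gain=1.5):
    """Scale a band of spatial-frequency components of the depth image."""
    d = depth.astype(np.float64)
    spec = np.fft.fft2(d)
    fy = np.fft.fftfreq(d.shape[0])[:, None]  # cycles per pixel, vertical
    fx = np.fft.fftfreq(d.shape[1])[None, :]  # cycles per pixel, horizontal
    r = np.hypot(fy, fx)                      # radial spatial frequency
    spec[(r >= lo) & (r <= hi)] *= gain       # emphasize the chosen band
    return np.clip(np.fft.ifft2(spec).real, 0.0, 255.0)
```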
 Further, for example, the stereoscopic video generation device 110 may receive a two-dimensional video and generate depth information for generating a stereoscopic video from the two-dimensional video. In this case, the depth information acquisition unit 111 of the stereoscopic video generation device 110 calculates depth values so that, for example, the image of the subject that is in focus in the two-dimensional video appears on the near side of the zero-parallax plane in the stereoscopic video. The depth information acquisition unit 111 then generates information indicating the calculated depth values as the depth information and transmits the depth information thus acquired to the depth information update unit 112.
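 The disclosure only states the constraint that the in-focus subject end up on the near side of the zero-parallax plane; how the in-focus region is found is not specified. Purely as a hypothetical illustration, a local sharpness cue could be mapped to depth as follows, where the Laplacian focus measure, the smoothing, and the output range are all assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def depth_from_focus(gray, near=96.0, base=-32.0):
    """Hypothetical focus cue: sharper regions are placed nearer.

    `gray` is a 2-D luminance image; the output is a signed depth map in
    which strongly focused regions receive positive values (assumed to be
    in front of the zero-parallax plane) and blurred regions fall behind it."""
    sharp = gaussian_filter(np.abs(laplace(gray.astype(np.float64))), 5.0)
    sharp /= sharp.max() + 1e-12            # normalize the focus measure to 0..1
    return base + (near - base) * sharp     # map to signed depth values
```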
 Two pixels that are close to each other are not limited to two pixels in contact with each other; they may be, for example, two pixels arranged with one or more pixels between them.
 All or part of the components constituting the depth information update device 101 described above may be implemented in hardware, or all or part of them may be modules of a program executed by a CPU (Central Processing Unit) or the like.
 All or part of the components constituting the stereoscopic video generation device 110 or the depth information update device 101 described above may be implemented as a single system LSI (Large Scale Integration). A system LSI is a super-multifunctional LSI manufactured by integrating a plurality of components on a single chip; specifically, it is a computer system including a microprocessor, a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
 For example, the stereoscopic video generation device 110 or the depth information update device 101 may be configured as a single system LSI (integrated circuit); that is, the stereoscopic video generation device 110 or the depth information update device 101 may be an integrated circuit.
 The present invention may also be realized as a depth information update method whose steps are the operations of the characteristic components of the stereoscopic video generation device 110 or the depth information update device 101, as a program that causes a computer to execute the steps included in such a depth information update method, or as a computer-readable recording medium storing such a program. The program may also be distributed via a transmission medium such as the Internet.
 All the numerical values used in the above embodiments are examples for specifically explaining the present invention; the present invention is not limited to these numerical values.
 The configuration of the depth information update device 101 is likewise an example for specifically explaining the present invention. The depth information update device 101 need not include all the components shown in FIG. 3; it only needs the minimum configuration that can realize the effects of the present invention. For example, if the processing execution unit 123 also performs the processing performed by the calculation unit 121, the depth information update unit 112 of the depth information update device 101 need not include the calculation unit 121.
 The depth information update method according to the present invention corresponds to the depth information update processing of FIG. 7, depth information update processing A of FIG. 9, or depth information update processing B of FIG. 12. The depth information update method according to the present invention need not include all the corresponding steps in FIG. 7, FIG. 9, or FIG. 12; it only needs to include the minimum steps that can realize the effects of the present invention.
 The order in which the steps of the depth information update method are executed is an example for specifically explaining the present invention and may be an order other than the above. Some of the steps of the depth information update method may also be executed in parallel with, and independently of, other steps.
 The embodiments disclosed herein should be considered in all respects as illustrative and not restrictive. The scope of the present invention is indicated by the claims rather than by the above description, and is intended to include all modifications within the meaning and range of equivalency of the claims.
 INDUSTRIAL APPLICABILITY: The present invention can be used as a depth information update device capable of generating depth information for generating a stereoscopic video with improved expression of fine stereoscopic effect.
50 stereoscopic target object
100 stereoscopic video reproduction device
101 depth information update device
110 stereoscopic video generation device
111 depth information acquisition unit
112 depth information update unit
113 stereoscopic video generation unit
121 calculation unit
123 processing execution unit
200 stereoscopic video display device
210 display surface
300 active shutter glasses
1000 stereoscopic video viewing system

Claims (8)

  1.  A depth information update device that performs processing using a depth value corresponding to each pixel constituting a stereoscopic video, the depth information update device comprising:
     a depth information acquisition unit that acquires depth information indicating a plurality of depth values for generating a stereoscopic video; and
     a depth information update unit that updates the depth information by performing enhancement processing, which is processing that emphasizes components of a specific band, on a depth image composed of the plurality of depth values indicated by the depth information.
  2.  The depth information update device according to claim 1, wherein, in at least a part of the depth image, the depth information update unit performs the enhancement processing so as to increase the absolute value of the difference between the depth values of two adjacent pixels when that absolute value is less than a threshold.
  3.  The depth information update device according to claim 1 or 2, wherein the depth information update unit includes:
     a calculation unit that performs, for every pixel constituting the depth image, a process of calculating a difference depth value, which is the absolute value of the difference between the depth value of a processing target pixel, which is a pixel to be processed, and the depth value of a pixel close to the processing target pixel; and
     a processing execution unit that, when a calculated difference depth value is less than a threshold, performs the enhancement processing on the processing target pixel corresponding to the difference depth value that is less than the threshold.
  4.  The depth information update device according to any one of claims 1 to 3, wherein the depth information update unit performs the enhancement processing as a filter process.
  5.  The depth information update device according to any one of claims 1 to 4, wherein the depth information update unit further updates the updated depth information by changing the plurality of depth values indicated by the updated depth information so that the absolute values of the plurality of depth values indicated by the updated depth information decrease at substantially the same rate.
  6.  The depth information update device according to claim 5, wherein the depth information update unit updates the updated depth information by multiplying the absolute values of the plurality of depth values indicated by the updated depth information by a coefficient less than 1.
  7.  A stereoscopic video generation device comprising:
     the depth information update device according to any one of claims 1 to 6; and
     a stereoscopic video generation unit that generates a new stereoscopic video using the plurality of depth values indicated by the updated depth information.
  8.  A depth information update method performed by a depth information update device that performs processing using a depth value corresponding to each pixel constituting a stereoscopic video, the depth information update method comprising:
     a depth information acquisition step of acquiring depth information indicating a plurality of depth values for generating a stereoscopic video; and
     a depth information update step of updating the depth information by performing enhancement processing, which is processing that emphasizes components of a specific band, on a depth image composed of the plurality of depth values indicated by the depth information.
PCT/JP2011/001795 2011-03-25 2011-03-25 Depth information updating device, stereoscopic video generation device, and depth information updating method WO2012131752A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/001795 WO2012131752A1 (en) 2011-03-25 2011-03-25 Depth information updating device, stereoscopic video generation device, and depth information updating method


Publications (1)

Publication Number Publication Date
WO2012131752A1 (en)

Family

ID=46929607

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/001795 WO2012131752A1 (en) 2011-03-25 2011-03-25 Depth information updating device, stereoscopic video generation device, and depth information updating method

Country Status (1)

Country Link
WO (1) WO2012131752A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005065162A (en) * 2003-08-20 2005-03-10 Matsushita Electric Ind Co Ltd Display device, transmitting apparatus, transmitting/receiving system, transmitting/receiving method, display method, transmitting method, and remote controller
JP2007110360A (en) * 2005-10-13 2007-04-26 Ntt Comware Corp Stereoscopic image processing apparatus and program
JP2008141666A (en) * 2006-12-05 2008-06-19 Fujifilm Corp Stereoscopic image creating device, stereoscopic image output device, and stereoscopic image creating method
JP2010206774A (en) * 2009-02-05 2010-09-16 Fujifilm Corp Three-dimensional image output device and method


Legal Events

Code  Description
121   Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 11862537; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: DE)
122   Ep: pct application non-entry in european phase (Ref document number: 11862537; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: JP)