WO2012111756A1 - Image processing apparatus and image processing method - Google Patents
Image processing apparatus and image processing method
- Publication number: WO2012111756A1 (application PCT/JP2012/053676)
- Authority: WIPO (PCT)
- Prior art keywords: image, viewpoint, unit, parallax, resolution
Classifications
- H—ELECTRICITY; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
  - H04N13/106—Processing image signals
  - H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
  - H04N13/128—Adjusting depth or disparity
  - H04N13/139—Format conversion, e.g. of frame-rate or size
  - H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
  - H04N13/31—Image reproducers for viewing without the aid of special glasses, using parallax barriers
  - H04N13/354—Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking, for displaying sequentially
  - H04N13/356—Image reproducers having separate monoscopic and stereoscopic modes
  - H04N19/44—Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
  - H04N19/597—Predictive coding specially adapted for multi-view video sequence encoding
  - H04N2213/005—Aspects relating to the "3D+depth" image format
- G—PHYSICS; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
  - G06T3/4053—Scaling of whole images or parts thereof based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
  - G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
Definitions
- The present technology relates to an image processing apparatus and an image processing method, and in particular to an image processing apparatus and an image processing method that, when a depth image reduced in resolution is transmitted, enable the receiving side to generate an image of a predetermined viewpoint with high accuracy using that depth image.
- Currently, the most common way of viewing 3D images is the glasses method: the viewer wears glasses that open the left-eye shutter when one of two viewpoint images is displayed and open the right-eye shutter when the other is displayed, and views the two alternately displayed viewpoint images.
- In a glasses-free method, images of three or more viewpoints are displayed so that the viewable angle differs for each viewpoint, and by viewing the images of any two of those viewpoints with the left and right eyes, the viewer can see a 3D image without wearing glasses.
- A decoding device that provides glasses-free viewing generates and displays images of three or more viewpoints (hereinafter, three or more viewpoints are referred to as multi-viewpoints) from images of two viewpoints.
- Specifically, the encoding apparatus obtains the parallax (depth) values of the two viewpoint images and transmits to the decoding apparatus a parallax image (depth image) that represents those parallax values as luminance values or the like.
- The decoding apparatus generates multi-viewpoint parallax images by performing warping processing on the received parallax images of the two viewpoints, generates multi-viewpoint images by performing warping processing on the two viewpoint images using the multi-viewpoint parallax images, and combines and displays them.
- Here, the warping process is a process that takes the parallax value (or pixel value) of each pixel of the parallax image (or image) of a predetermined viewpoint and sets it as the parallax value (or pixel value) of the corresponding pixel in the parallax image (or image) of a virtual viewpoint.
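As a concrete sketch of this warping process applied to a parallax image, the following minimal Python example forward-maps each pixel by a horizontal shift derived from its own parallax value. The shift_scale factor is a hypothetical stand-in for the camera geometry (inter-camera distance, focal length) that the patent carries in the shooting information; it is not part of the patent text.

```python
import numpy as np

def warp_parallax_image(parallax, shift_scale=1.0):
    """Forward-warp a parallax image to a virtual viewpoint.

    Pixel (y, x) with parallax d is mapped to (y, x + shift_scale * d)
    in the virtual view.  Unfilled output pixels (value 0) correspond
    to occlusion regions revealed by the viewpoint change.
    """
    h, w = parallax.shape
    warped = np.zeros_like(parallax)
    for y in range(h):
        for x in range(w):
            d = parallax[y, x]
            tx = x + int(round(shift_scale * d))
            if 0 <= tx < w:
                # on collisions, keep the nearer object (larger parallax)
                warped[y, tx] = max(warped[y, tx], d)
    return warped
```

Pixels of the virtual view that no source pixel lands on stay at 0, which is exactly the occlusion region discussed later.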
- AVC (Advanced Video Coding)
- MVC (Multiview Video Coding)
- When an encoding apparatus corresponding to a glasses-free decoding apparatus reduces the data amount of a parallax image by reducing its resolution, the decoding apparatus must increase the resolution of the received parallax image so as to recover a parallax image of the resolution before reduction.
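A minimal sketch of this resolution round trip, using column decimation for the encoder-side reduction and nearest-neighbour column repetition for the decoder-side resolution increase; both are illustrative stand-ins for whatever filters an actual codec would use.

```python
import numpy as np

def reduce_resolution(parallax):
    # encoder side: keep every other column to halve the data amount
    return parallax[:, ::2]

def increase_resolution(reduced):
    # decoder side: repeat each column to restore the original width
    return np.repeat(reduced, 2, axis=1)
```

The round trip is generally lossy: a parallax edge that falls on a dropped column is displaced after restoration, which is one reason the receiving side benefits from correcting the recovered parallax image.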
- The present technology has been made in view of this situation, and makes it possible, when a parallax image with reduced resolution is transmitted, for the receiving side to generate an image of a predetermined viewpoint with high accuracy using that parallax image.
- An image processing apparatus according to one aspect of the present technology includes: a receiving unit that receives a depth image whose resolution has been reduced; a resolution increasing unit that increases the resolution of the depth image received by the receiving unit; a depth image warping processing unit that generates a depth image of a virtual viewpoint by performing warping processing on the depth image whose resolution has been increased by the resolution increasing unit, based on the position of the virtual viewpoint; and a correction unit that corrects, in the depth image of the virtual viewpoint generated by the depth image warping processing unit, the pixel values of an occlusion region, that is, a region that exists in the viewpoint image of the virtual viewpoint but does not exist in the viewpoint image corresponding to the depth image used to generate the depth image of the virtual viewpoint.
- the image processing method according to one aspect of the present technology corresponds to the image processing apparatus according to one aspect of the present technology.
- In one aspect of the present technology, a depth image with reduced resolution is received, the resolution of the received depth image is increased, and a depth image of a virtual viewpoint is generated by performing warping processing on the resolution-increased depth image based on the position of the virtual viewpoint.
- Then, for the generated depth image of the virtual viewpoint, the pixel values of the occlusion region, which is a region that exists in the viewpoint image of the virtual viewpoint but does not exist in the viewpoint image corresponding to the depth image used to generate the depth image of the virtual viewpoint, are corrected.
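A minimal sketch of such an occlusion correction, assuming the warped depth image marks unfilled occlusion pixels with 0 and filling each hole from its nearest valid horizontal neighbours, preferring the background (smaller parallax) since revealed regions belong to the background. This is an illustrative stand-in for the correction unit, not the patent's exact method.

```python
import numpy as np

def fill_occlusions(warped, hole_value=0):
    """Fill occlusion holes in a warped parallax image."""
    out = warped.copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            if out[y, x] != hole_value:
                continue
            # nearest valid neighbour to the left and to the right
            left = next((out[y, i] for i in range(x - 1, -1, -1)
                         if out[y, i] != hole_value), None)
            right = next((out[y, i] for i in range(x + 1, w)
                          if out[y, i] != hole_value), None)
            candidates = [v for v in (left, right) if v is not None]
            if candidates:
                # prefer the background: the smaller parallax value
                out[y, x] = min(candidates)
    return out
```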
- the image processing apparatus can be realized by causing a computer to execute a program.
- a program to be executed by a computer can be provided by being transmitted via a transmission medium or by being recorded on a recording medium.
- According to one aspect of the present technology, when a depth image with reduced resolution is transmitted, an image of a predetermined viewpoint can be generated with high accuracy using that depth image.
- FIG. 6 is a flowchart for explaining decoding processing by the decoding device of FIG. 4.
- FIG. 12 is a flowchart illustrating details of the multi-viewpoint image generation processing in FIG. 11.
- A block diagram showing a configuration example of the second embodiment of the encoding device.
- A diagram showing an example of boundary information.
- A flowchart explaining the encoding process by the encoding device.
- A flowchart explaining the boundary information generation process.
- A block diagram showing a configuration example of the second embodiment of the decoding device to which the present technology is applied.
- A block diagram showing a detailed configuration example of the 3D image generation unit.
- A diagram explaining the smoothing process based on boundary information.
- A flowchart explaining the decoding process by the decoding device.
- A flowchart explaining the multi-viewpoint image generation process.
- A diagram explaining parallax and depth.
- A diagram showing a configuration example of an embodiment of a computer.
- FIG. 23 is a diagram illustrating parallax and depth.
- As shown in FIG. 23, the depth Z of the subject M, which is its distance from the camera c1 (camera c2) in the depth direction, is defined by the following equation (a):

  Z = (L / d) * f ... (a)
- L is a horizontal distance between the position C1 and the position C2 (hereinafter, referred to as an inter-camera distance).
- d is the value obtained by subtracting the horizontal distance u2 of the position of the subject M on the image captured by the camera c2 from the center of that captured image, from the horizontal distance u1 of the position of the subject M on the image captured by the camera c1 from the center of that captured image; that is, d is the parallax.
- f is the focal length of the camera c1; in equation (a), the focal lengths of the camera c1 and the camera c2 are assumed to be equal.
- Since the parallax d and the depth Z can be uniquely converted into each other, in this specification an image representing the parallax d and an image representing the depth Z of the two-viewpoint color images captured by the camera c1 and the camera c2 are collectively referred to as a depth image (parallax image).
- The depth image may be an image representing either the parallax d or the depth Z, and its pixel value need not be the parallax d or the depth Z itself; for example, a value obtained by normalizing the parallax d, or a value obtained by normalizing the reciprocal 1/Z of the depth Z, can be employed.
- The value I obtained by normalizing the parallax d to 8 bits (0 to 255) can be obtained by the following equation (b). Note that the number of normalization bits for the parallax d is not limited to 8; other bit counts such as 10 or 12 may be used.

  I = 255 * (d - D_min) / (D_max - D_min) ... (b)
- D_max is the maximum value of the parallax d, and D_min is the minimum value of the parallax d.
- The maximum value D_max and the minimum value D_min may be set in units of one screen, or in units of a plurality of screens.
- Similarly, the value y obtained by normalizing the reciprocal 1/Z of the depth Z to 8 bits (0 to 255) can be obtained by the following equation (c). The number of normalization bits for 1/Z is not limited to 8; other bit counts such as 10 or 12 may be used.

  y = 255 * (1/Z - 1/Z_far) / (1/Z_near - 1/Z_far) ... (c)
- Z_far is the maximum value of the depth Z, and Z_near is the minimum value of the depth Z.
- The maximum value Z_far and the minimum value Z_near may be set in units of one screen, or in units of a plurality of screens.
- In this specification, since the parallax d and the depth Z can be uniquely converted into each other, an image whose pixel value is the value I obtained by normalizing the parallax d and an image whose pixel value is the value y obtained by normalizing the reciprocal 1/Z of the depth Z are also collectively referred to as a depth image (parallax image).
- Here, the color format of the depth image (parallax image) is assumed to be YUV420 or YUV400, but other color formats may also be used.
- When attention is paid to the value I or the value y itself rather than to the image, the value I or the value y is taken as depth information (parallax information), and a map obtained by mapping the value I or the value y is taken as a depth map (disparity map).
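The relations above can be collected into a small numerical sketch; equations (a) to (c) follow from the definitions of L, f, d, D_max, D_min, Z_near, and Z_far given in this section, and the function names are illustrative.

```python
def depth_from_parallax(d, L, f):
    # equation (a): Z = (L / d) * f
    return (L / d) * f

def normalize_parallax(d, d_min, d_max):
    # equation (b): 8-bit value I from the parallax d
    return round(255 * (d - d_min) / (d_max - d_min))

def normalize_inverse_depth(z, z_near, z_far):
    # equation (c): 8-bit value y from the reciprocal 1/Z
    return round(255 * (1 / z - 1 / z_far) / (1 / z_near - 1 / z_far))
```

As the definitions require, d = D_max maps to I = 255 and d = D_min to I = 0, while Z = Z_near maps to y = 255 and Z = Z_far to y = 0.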
- FIG. 1 is a block diagram illustrating a configuration example of a first embodiment of an encoding device corresponding to an image processing device to which the present technology is applied.
- The encoding device 50 in FIG. 1 includes imaging units 51A to 51C, an image conversion unit 52, a parallax image generation unit 53, an image information generation unit 54, a compatible information generation unit 55, an imaging information generation unit 56, a parallax image multiplexing information generation unit 57, an encoder 58, and a multiplexing unit 59.
- the encoding device 50 encodes and transmits a parallax image with a reduced resolution.
- The imaging unit 51A captures an HD (High Definition) image of a predetermined viewpoint as the viewpoint image A1 and supplies it to the image conversion unit 52, the parallax image generation unit 53, and the imaging information generation unit 56.
- The imaging unit 51B, at a position separated from the imaging unit 51A in the horizontal direction by a distance Δd1_AB, captures an HD image of a viewpoint different from that of the viewpoint image A1 as the viewpoint image B1, and supplies it to the image conversion unit 52, the parallax image generation unit 53, and the imaging information generation unit 56.
- The imaging unit 51C, at a position separated from the imaging unit 51A by a distance Δd1_AC in the horizontal direction opposite to the imaging unit 51B, captures an HD image of a viewpoint different from those of the viewpoint images A1 and B1 as the viewpoint image C1, and supplies it to the image conversion unit 52, the parallax image generation unit 53, and the imaging information generation unit 56.
- The viewpoints corresponding to the viewpoint images B1 and C1 are the outer viewpoints among the viewpoints of images that can be perceived as a 3D image.
- Therefore, the decoding device corresponding to the encoding device 50 can generate a multi-viewpoint image by using the viewpoint images A1 to C1 to interpolate images of viewpoints lying inside those of the viewpoint images B1 and C1.
- Note that the distance Δd1_AB and the distance Δd1_AC may be fixed or may change with time.
- The image conversion unit 52 determines the viewpoint image A1, supplied from the imaging unit 51A whose horizontal position is the innermost among the imaging units 51A to 51C, to be the compatible image.
- A compatible image is the image, among the multi-viewpoint images, that is encoded by an existing encoding method in order to ensure compatibility with existing encoding devices.
- the image conversion unit 52 supplies information specifying the viewpoint image A1 as a compatible image to the compatible information generation unit 55, and supplies the viewpoint image A1 that is a compatible image to the encoder 58 as it is.
- the image conversion unit 52 uses the viewpoint image B1 and the viewpoint image C1 other than the viewpoint image A1 as auxiliary images.
- An auxiliary image is an image used, together with the compatible image, to generate images of more viewpoints than the compatible image alone provides.
- The image conversion unit 52 reduces the resolution of the viewpoint image B1 and the viewpoint image C1, which are auxiliary images, and multiplexes them based on a predetermined multiplexing method. Specifically, when the multiplexing method is the side-by-side method, the image conversion unit 52 halves the horizontal resolution of the viewpoint image B1 and the viewpoint image C1.
- The image conversion unit 52 then multiplexes the half-resolution viewpoint image B1 (hereinafter, 1/2 resolution viewpoint image B1) and the half-resolution viewpoint image C1 (hereinafter, 1/2 resolution viewpoint image C1) so that the 1/2 resolution viewpoint image B1 becomes the left half of the screen and the 1/2 resolution viewpoint image C1 becomes the right half of the screen.
- the image conversion unit 52 supplies the multiplexed image obtained as a result of the multiplexing to the encoder 58 and supplies information indicating the auxiliary image multiplexing method to the image information generation unit 54.
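A minimal sketch of this side-by-side packing, assuming single-channel images represented as 2D arrays and using simple column subsampling in place of a proper low-pass resize:

```python
import numpy as np

def side_by_side_multiplex(view_b, view_c):
    """Halve the horizontal resolution of two auxiliary images and
    pack them into one frame: B1 on the left half, C1 on the right."""
    assert view_b.shape == view_c.shape
    half_b = view_b[:, ::2]   # 1/2 resolution viewpoint image B1
    half_c = view_c[:, ::2]   # 1/2 resolution viewpoint image C1
    return np.hstack([half_b, half_c])
```

The multiplexed frame has the same dimensions as a single input image, which is what lets the pair travel through an encoder designed for one image.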
- the parallax image generation unit 53 detects the parallax value of each pixel of the viewpoint image A1 to the viewpoint image C1 using the viewpoint image A1 to the viewpoint image C1 supplied from the imaging unit 51A to the imaging unit 51C.
- The parallax image generation unit 53 generates a parallax image A1' representing the parallax values of the pixels of the viewpoint image A1, which is the compatible image, and supplies it to the encoder 58.
- the parallax image generation unit 53 multiplexes the parallax image B1 'and the parallax image C1' with a reduced resolution based on a predetermined multiplexing method, and supplies the resulting multiplexed image to the encoder 58.
- the parallax image generation unit 53 supplies information indicating the multiplexing method of the parallax images of the auxiliary image to the parallax image multiplexing information generation unit 57.
- Based on the information supplied from the image conversion unit 52, the image information generation unit 54 generates information indicating the auxiliary image multiplexing method as image information, which is information related to the compatible image and the auxiliary images, and supplies it to the encoder 58.
- Based on the information supplied from the image conversion unit 52, the compatibility information generation unit 55 generates information specifying the compatible image, the compatibility mode, and the like as compatibility information, which is information related to compatibility, and supplies it to the encoder 58.
- the compatibility mode is a mode representing a coding method of a compatible image.
- The compatibility mode represents, for example, a mono mode representing an encoding method that encodes a compatible image of a single viewpoint by the AVC method, a mode representing an encoding method that multiplexes compatible images of two viewpoints and encodes them by the AVC method, and the like.
- The shooting information generation unit 56 uses the viewpoint images A1 to C1 supplied from the shooting units 51A to 51C to detect the distance between the viewpoints of each pair of the viewpoint images A1 to C1 (hereinafter referred to as the inter-viewpoint distance). Specifically, the shooting information generation unit 56 detects, as inter-viewpoint distances, the horizontal distance Δd1_AB between the shooting unit 51A and the shooting unit 51B and the horizontal distance Δd1_AC between the shooting unit 51A and the shooting unit 51C.
- the shooting information generation unit 56 acquires the internal parameters of the shooting units 51A to 51C and the rotation matrix for the warping process from the shooting units 51A to 51C.
- the internal parameters include the focal length, the position of the principal point (the optical center of the lens) that is the center of the image, the radial distortion coefficient, and the like.
- The shooting information generation unit 56 generates the inter-viewpoint distances, the internal parameters, and the rotation matrix for warping processing as shooting information, and supplies it to the encoder 58.
- Based on the information supplied from the parallax image generation unit 53, the parallax image multiplexing information generation unit 57 generates information related to the multiplexing of the parallax images, such as information indicating the multiplexing method of the parallax images of the auxiliary images, as parallax image multiplexing information, and supplies it to the encoder 58.
- the encoder 58 includes a compatible encoder 61 and an auxiliary encoder 62.
- The compatible encoder 61 encodes the compatible image supplied from the image conversion unit 52 by the existing AVC method, adds various kinds of information, and supplies the resulting encoded stream to the multiplexing unit 59 as a compatible stream.
- The auxiliary encoder 62 encodes, by a predetermined method, the multiplexed image of the auxiliary images from the image conversion unit 52, the parallax image A1' of the compatible image from the parallax image generation unit 53, and the multiplexed image of the parallax images of the auxiliary images.
- As the encoding method in the auxiliary encoder 62, the AVC method, the MVC method, the MPEG-2 (Moving Picture Experts Group phase 2) method, or the like can be used.
- The auxiliary encoder 62 generates an encoded stream by adding, to the encoded images obtained as a result of the encoding, the image information from the image information generation unit 54, the compatibility information from the compatibility information generation unit 55, the shooting information from the shooting information generation unit 56, the parallax image multiplexing information from the parallax image multiplexing information generation unit 57, and the like.
- the auxiliary encoder 62 supplies the encoded stream to the multiplexing unit 59 as an auxiliary stream.
- The multiplexing unit 59 generates a TS (Transport Stream) from each of the compatible stream supplied from the compatible encoder 61 and the auxiliary stream supplied from the auxiliary encoder 62, multiplexes the TSs, and transmits the resulting multiplexed stream.
- FIGS. 2 and 3 are flowcharts explaining the encoding process by the encoding device 50 of FIG. 1. This encoding process is started, for example, when the viewpoint images A1 to C1 are output from the imaging units 51A to 51C.
- the shooting information generation unit 56 acquires the internal parameters of the shooting units 51A to 51C and the rotation matrix for the warping process from the shooting units 51A to 51C.
- In step S11, the shooting information generation unit 56 uses the viewpoint images A1 to C1 supplied from the shooting units 51A to 51C to detect the inter-viewpoint distance between each pair of the viewpoint images A1 to C1.
- In step S12, the shooting information generation unit 56 generates the inter-viewpoint distances, the internal parameters, and the rotation matrix for warping processing as shooting information, and supplies it to the encoder 58.
- In step S13, the image conversion unit 52 determines the viewpoint image A1, supplied from the imaging unit 51A whose horizontal position is the innermost among the imaging units 51A to 51C, to be the compatible image, and determines the multiplexing method for the auxiliary images.
- The image conversion unit 52 supplies information specifying the viewpoint image A1 as the compatible image to the compatible information generation unit 55, and supplies the auxiliary image multiplexing method to the image information generation unit 54.
- In step S15, the image information generation unit 54 generates information indicating the auxiliary image multiplexing method as image information based on the information supplied from the image conversion unit 52, and inputs it to the encoder 58.
- In step S16, the image conversion unit 52 takes the viewpoint image B1 and the viewpoint image C1, other than the viewpoint image A1, as auxiliary images, reduces their resolution based on the auxiliary image multiplexing method determined in step S13, and multiplexes them to obtain a multiplexed image of the auxiliary images.
- In step S17, the image conversion unit 52 inputs the viewpoint image A1, which is the compatible image, and the multiplexed image of the auxiliary images to the encoder 58.
- In step S18 of FIG. 3, the parallax image generation unit 53 detects the parallax value of each pixel of the viewpoint images A1 to C1 using the viewpoint images A1 to C1 supplied from the imaging units 51A to 51C, and generates the parallax images A1' to C1'.
- In step S19, the parallax image generation unit 53 determines the multiplexing method for the parallax images of the auxiliary images, and supplies information indicating the multiplexing method to the parallax image multiplexing information generation unit 57.
- In step S20, the parallax image multiplexing information generation unit 57 generates information indicating the multiplexing method of the parallax images of the auxiliary images as parallax image multiplexing information based on the information supplied from the parallax image generation unit 53, and inputs it to the encoder 58.
- In step S21, the parallax image generation unit 53 reduces the resolution of the parallax image B1' and the parallax image C1' of the auxiliary images and multiplexes them based on the multiplexing method of the parallax images of the auxiliary images determined in step S19, thereby obtaining a multiplexed image of the parallax images of the auxiliary images.
- In step S22, the parallax image generation unit 53 inputs the parallax image A1' of the compatible image and the multiplexed image of the parallax images of the auxiliary images to the encoder 58.
- In step S23, the compatible encoder 61 of the encoder 58 encodes the viewpoint image A1, which is the compatible image supplied from the image conversion unit 52, by the existing AVC method, and supplies the resulting encoded stream to the multiplexing unit 59 as a compatible stream.
- In step S24, the auxiliary encoder 62 encodes, by a predetermined method, the multiplexed image of the auxiliary images from the image conversion unit 52, the parallax image A1' of the compatible image from the parallax image generation unit 53, and the multiplexed image of the parallax images of the auxiliary images.
- In step S25, the auxiliary encoder 62 generates an encoded stream by adding, to the encoded images obtained as a result of the encoding in step S24, the image information from the image information generation unit 54, the compatibility information from the compatibility information generation unit 55, the shooting information from the shooting information generation unit 56, the parallax image multiplexing information from the parallax image multiplexing information generation unit 57, and the like.
- the auxiliary encoder 62 supplies the encoded stream to the multiplexing unit 59 as an auxiliary stream.
- In step S26, the multiplexing unit 59 generates a TS from each of the compatible stream supplied from the compatible encoder 61 and the auxiliary stream supplied from the auxiliary encoder 62, multiplexes them, and transmits the result. The process then ends.
- As described above, since the encoding device 50 reduces the resolution of the auxiliary images and the parallax images of the auxiliary images before encoding, the amount of information to be encoded can be reduced compared with encoding without resolution reduction, and the processing costs of the encoding and decoding processes can be reduced. As a result, the performance of the decoding process in the decoding device can be prevented from greatly affecting the image quality of the multi-viewpoint images.
- In addition, since the encoding device 50 encodes one viewpoint image among the multi-viewpoint images as a compatible image using an existing encoding method, compatibility with encoding devices that encode existing 2D images can be secured.
- Furthermore, because the parallax images are transmitted, the decoding device corresponding to the encoding device 50 does not need to detect parallax in order to generate the multi-viewpoint images. Therefore, the processing load of the decoding device can be reduced, and as a result, the cost of the decoding device can be reduced. In addition, the parallax detection performance of the decoding device can be prevented from significantly affecting the image quality of the multi-viewpoint images.
- FIG. 4 is a diagram illustrating a configuration example of a decoding device as an image processing device to which the present technology is applied, which decodes a multiplexed stream transmitted from the encoding device 50 in FIG. 1.
- the decoding device 120 decodes the multiplexed stream transmitted from the encoding device 50, performs a parallax image warping process, generates a multi-viewpoint image, and displays the image on a display device (not shown).
- The separation unit 121 of the decoding device 120 functions as a reception unit: it receives the multiplexed stream transmitted from the encoding device 50 and separates it into individual TSs.
- the separation unit 121 extracts a compatible stream and an auxiliary stream from the separated TS and supplies them to the decoder 122.
- The decoder 122 includes a compatible decoder 131 and an auxiliary decoder 132. Based on the compatibility information supplied from the auxiliary decoder 132, the compatible decoder 131 of the decoder 122 identifies the compatible stream among the compatible stream and the auxiliary stream supplied from the separation unit 121. Based on the compatibility information, the compatible decoder 131 decodes the encoded compatible image included in the compatible stream using a method corresponding to the AVC method. The compatible decoder 131 supplies the viewpoint image A1 obtained as a result of the decoding to the image generation unit 127.
- the auxiliary decoder 132 supplies compatibility information included in the auxiliary stream supplied from the separation unit 121 to the compatibility decoder 131.
- the auxiliary decoder 132 identifies an auxiliary stream among the compatible stream and the auxiliary stream supplied from the separation unit 121 based on the compatibility information.
- The auxiliary decoder 132 functions as a decoding unit, and decodes the encoded multiplexed image of the auxiliary images, the parallax image of the compatible image, and the multiplexed image of the parallax images of the auxiliary images included in the auxiliary stream supplied from the separation unit 121, using a method corresponding to the auxiliary encoder 62 of FIG. 1.
- the auxiliary decoder 132 supplies the image generation unit 127 with the multiplexed image of the auxiliary image, the parallax image of the compatible image, and the multiplexed image of the parallax image of the auxiliary image obtained as a result of the decoding.
- the auxiliary decoder 132 supplies the image information included in the auxiliary stream to the image information acquisition unit 123 and supplies the shooting information to the shooting information acquisition unit 124. Further, the auxiliary decoder 132 supplies the parallax image multiplexing information included in the auxiliary stream to the parallax image multiplexing information acquisition unit 125 and supplies the compatibility information to the compatibility information acquisition unit 126.
- the image information acquisition unit 123 acquires the image information supplied from the auxiliary decoder 132 and supplies it to the image generation unit 127.
- the shooting information acquisition unit 124 acquires the shooting information supplied from the auxiliary decoder 132 and supplies the acquired shooting information to the image generation unit 127.
- the parallax image multiplexing information acquisition unit 125 acquires the parallax image multiplexing information supplied from the auxiliary decoder 132 and supplies the parallax image multiplexing information to the image generation unit 127.
- the compatibility information acquisition unit 126 acquires the compatibility information supplied from the auxiliary decoder 132 and supplies it to the image generation unit 127.
- the image generation unit 127 includes a 2D image generation unit 141 and a 3D image generation unit 142.
- In response to a 2D image display command from the viewer, the 2D image generation unit 141 of the image generation unit 127 outputs the viewpoint image A1, which is the compatible image supplied from the compatible decoder 131, and displays it on a display device (not shown). Thereby, the viewer can see the 2D image.
- The 3D image generation unit 142 uses the viewpoint image A1, the multiplexed image of the auxiliary images, the parallax image A1′ of the compatible image, and the multiplexed image of the parallax images of the auxiliary images supplied from the decoder 122 to generate, based on the image information, the shooting information, the parallax image multiplexing information, the compatibility information, and the like, images of three or more viewpoints corresponding to a display device (not shown), each having the same resolution as the compatible image.
- The 3D image generation unit 142 then converts the resolution of the generated multi-viewpoint images to 1/(number of viewpoints) of the resolution of the compatible image or the auxiliary image, synthesizes the results, and displays them on a display device (not shown).
- the combined multi-viewpoint image is displayed so that the viewable angle differs for each viewpoint.
- a viewer can view a 3D image without wearing glasses by viewing each image of two arbitrary viewpoints with the left and right eyes.
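The synthesis just described, reducing each viewpoint image to 1/N of the resolution and combining them so that each viewpoint is visible from a different angle, can be sketched roughly as column interleaving. This is only an illustrative sketch with a hypothetical function name: real lenticular or parallax-barrier displays use device-specific subpixel mappings.

```python
def synthesize_multiview(views):
    """Interleave N viewpoint images column by column (N = number of views).

    Each output column carries one view's sample, so each view effectively
    keeps 1/N of the horizontal resolution and is shown toward a different
    viewing angle. views is a list of equally sized 2D lists of pixel values.
    """
    n = len(views)
    height, width = len(views[0]), len(views[0][0])
    out = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            out[y][x] = views[x % n][y][x]  # column x shows viewpoint x mod n
    return out
```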
- FIG. 5 is a block diagram illustrating a detailed configuration example of the 3D image generation unit 142 of FIG.
- The 3D image generation unit 142 includes a virtual viewpoint position determination unit 160, a parallax image separation unit 161, a parallax image resolution increasing unit 162, a parallax image warping processing unit 163, a parallax image warping processing unit 164, a smoothing processing unit 165, a smoothing processing unit 166, a viewpoint image warping processing unit 167, a viewpoint image warping processing unit 168, a viewpoint image separation unit 169, a viewpoint image resolution increasing unit 170, a multi-viewpoint image generation unit 171, and a multi-viewpoint image synthesis processing unit 172.
- The virtual viewpoint position determination unit 160 of the 3D image generation unit 142 determines the viewpoint positions of the multi-viewpoint images to be generated as virtual viewpoint positions, based on the inter-viewpoint distance included in the shooting information supplied from the shooting information acquisition unit 124 and the number of viewpoints corresponding to a display device (not shown).
- Based on the position of each virtual viewpoint, the virtual viewpoint position determination unit 160 generates, for each of the parallax image warping processing unit 163 and the parallax image warping processing unit 164, parallax image specifying information, that is, information for specifying the parallax image of the viewpoint outside the virtual viewpoint that is used to generate the image of that virtual viewpoint.
- The parallax image specifying information supplied to the parallax image warping processing unit 163 and the parallax image specifying information supplied to the parallax image warping processing unit 164 are generated so as to differ from each other.
- the virtual viewpoint position determination unit 160 supplies the position of each virtual viewpoint and the corresponding parallax image specifying information to the parallax image warping processing unit 163 and the parallax image warping processing unit 164.
- Based on the compatibility information supplied from the compatibility information acquisition unit 126, the parallax image separation unit 161 supplies the parallax image A1′ of the compatible image supplied from the decoder 122 to the parallax image resolution increasing unit 162 as it is. Further, the parallax image separation unit 161 separates the multiplexed image of the parallax images of the auxiliary images supplied from the decoder 122, based on the parallax image multiplexing information supplied from the parallax image multiplexing information acquisition unit 125. Then, the parallax image separation unit 161 supplies the resulting parallax images of the viewpoint image B1 and the viewpoint image C1, each having half the resolution of the compatible image, to the parallax image resolution increasing unit 162.
- The parallax image resolution increasing unit 162 functions as a resolution increasing unit, and increases the resolution of each of the parallax images of the viewpoint image B1 and the viewpoint image C1, which have half the resolution of the compatible image, supplied from the parallax image separation unit 161. Thereby, the parallax image resolution increasing unit 162 obtains parallax images of the viewpoint image B1 and the viewpoint image C1 having the same resolution as the compatible image. Then, the parallax image resolution increasing unit 162 supplies the obtained parallax images of the viewpoint image B1 and the viewpoint image C1, together with the parallax image A1′ supplied from the parallax image separation unit 161, to the parallax image warping processing unit 163 and the parallax image warping processing unit 164.
- The parallax image warping processing unit 163 functions as a parallax image warping processing unit. Specifically, for each virtual viewpoint, the parallax image warping processing unit 163 selects one of the parallax images of the viewpoint image B1 and the viewpoint image C1 supplied from the parallax image resolution increasing unit 162 and the parallax image A1′, based on the parallax image specifying information supplied from the virtual viewpoint position determination unit 160. For each virtual viewpoint, the parallax image warping processing unit 163 then performs a warping process on the selected parallax image, based on the shooting information from the shooting information acquisition unit 124 and the position of the virtual viewpoint from the virtual viewpoint position determination unit 160. The parallax image warping processing unit 163 supplies the parallax image of each virtual viewpoint generated by the warping process to the smoothing processing unit 165.
- The parallax image warping processing unit 164 also functions as a parallax image warping processing unit; it performs the same processing as the parallax image warping processing unit 163 and supplies the parallax images of the respective virtual viewpoints generated as a result to the smoothing processing unit 166.
- The smoothing processing unit 165 functions as a correction unit, and performs a smoothing process that removes parallax values other than 0 in occlusion areas (described later in detail) from the parallax images of each virtual viewpoint supplied from the parallax image warping processing unit 163. Specifically, the smoothing processing unit 165 detects parallax values other than 0 in the occlusion area of the parallax image of each virtual viewpoint and corrects those parallax values to 0. The smoothing processing unit 165 supplies the parallax images of each virtual viewpoint after the smoothing process to the viewpoint image warping processing unit 167.
- An occlusion area is an area that exists in the image of a virtual viewpoint, arising when the virtual viewpoint differs from the viewpoint of an actually captured viewpoint image, but does not exist in the viewpoint image corresponding to the parallax image used to generate the parallax image of that virtual viewpoint. In the parallax image of the virtual viewpoint after the warping process, the parallax values of pixels that do not correspond to any pixel of the parallax image before the warping process are set to 0. Accordingly, the parallax value of an occlusion area should originally be 0, and the occlusion area is represented as a black image.
- the smoothing processing unit 166 functions as a correction unit, and performs a smoothing process on the parallax images of each virtual viewpoint supplied from the parallax image warping processing unit 164 in the same manner as the smoothing processing unit 165. Then, the smoothing processing unit 166 supplies the parallax images of each virtual viewpoint after the smoothing processing to the viewpoint image warping processing unit 168.
- The viewpoint image warping processing unit 167 functions as a viewpoint image warping processing unit. Specifically, for each virtual viewpoint, the viewpoint image warping processing unit 167 performs a warping process, based on the parallax image of that virtual viewpoint supplied from the smoothing processing unit 165, on the corresponding viewpoint image supplied from the viewpoint image resolution increasing unit 170. The viewpoint image warping processing unit 167 supplies the image of each virtual viewpoint, which has an occlusion area, obtained as a result of the warping process to the multi-viewpoint image generation unit 171.
- The viewpoint image warping processing unit 168 also functions as a viewpoint image warping processing unit, and performs the same processing as the viewpoint image warping processing unit 167 based on the parallax images of each virtual viewpoint supplied from the smoothing processing unit 166, supplying the resulting image of each virtual viewpoint to the multi-viewpoint image generation unit 171.
- the viewpoint image separation unit 169 supplies the viewpoint image A1, which is a compatible image supplied from the decoder 122, to the viewpoint image resolution increasing unit 170 as it is based on the compatibility information supplied from the compatibility information acquisition unit 126.
- the viewpoint image separation unit 169 separates the multiplexed image of the auxiliary image supplied from the decoder 122 based on the image information supplied from the image information acquisition unit 123.
- The viewpoint image separation unit 169 supplies the resulting viewpoint image B1 and viewpoint image C1, each having half the resolution of the compatible image, to the viewpoint image resolution increasing unit 170.
- The viewpoint image resolution increasing unit 170 increases the resolution by performing interpolation processing on each of the viewpoint image B1 and the viewpoint image C1, which have half the resolution of the compatible image, supplied from the viewpoint image separation unit 169. Thereby, the viewpoint image resolution increasing unit 170 obtains the viewpoint image B1 and the viewpoint image C1 having the same resolution as the compatible image. Then, the viewpoint image resolution increasing unit 170 supplies the obtained viewpoint image B1 and viewpoint image C1, together with the viewpoint image A1 supplied from the viewpoint image separation unit 169, to the viewpoint image warping processing unit 167 and the viewpoint image warping processing unit 168.
- The multi-viewpoint image generation unit 171 functions as an interpolation unit: for each virtual viewpoint, it interpolates the occlusion area of the image of that virtual viewpoint supplied from one of the viewpoint image warping processing unit 167 and the viewpoint image warping processing unit 168 with the image of the same virtual viewpoint supplied from the other.
- the multi-viewpoint image generation unit 171 supplies the image of each virtual viewpoint obtained as a result to the multi-viewpoint image composition processing unit 172 as a multi-viewpoint image.
- The multi-viewpoint image synthesis processing unit 172 converts the resolution of the multi-viewpoint images supplied from the multi-viewpoint image generation unit 171 to 1/(number of virtual viewpoints) of the resolution of the compatible image or the auxiliary image, synthesizes the results, and displays them on a display device (not shown).
- FIG. 6 is a diagram illustrating generation of a virtual viewpoint image by the 3D image generation unit 142 of FIG.
- the parallax image warping processing unit 163 and the parallax image warping processing unit 164 select different parallax images based on the parallax image specifying information.
- In the example of FIG. 6, the parallax image warping processing unit 163 selects a parallax image (hereinafter referred to as parallax image #1) consisting of a circular area located on the left side of the screen whose parallax value is a predetermined value other than 0, and a surrounding area whose parallax value is 0.
- The parallax image warping processing unit 164 selects a parallax image (hereinafter referred to as parallax image #2) consisting of a circular area located on the right side of the screen whose parallax value is a predetermined value other than 0, and a surrounding area whose parallax value is 0.
- the parallax image warping processing unit 163 and the parallax image warping processing unit 164 each perform a warping process on the selected parallax image based on the position of the virtual viewpoint and the shooting information to generate a parallax image of the virtual viewpoint. Then, the smoothing processing unit 165 and the smoothing processing unit 166 each perform a smoothing process on the generated parallax image of the virtual viewpoint.
- As a result, the parallax image #1 of the virtual viewpoint becomes, for example, a parallax image in which the circular area of parallax image #1 has moved to the right side and in which all parallax values of the occlusion area, painted black in the figure, generated on the left side of the circular area are 0.
- Similarly, the parallax image #2 of the virtual viewpoint becomes, for example, a parallax image in which the circular area of parallax image #2 has moved to the left side and in which all parallax values of the occlusion area, painted black in the figure, generated on the right side of the circular area are 0. Note that the position of the circular area in the parallax image #1 of the virtual viewpoint and the position of the circular area in the parallax image #2 of the virtual viewpoint are the same.
- the viewpoint image warping processing unit 167 performs a warping process on the viewpoint image # 1 corresponding to the parallax image # 1 based on the parallax image # 1 of the virtual viewpoint.
- The viewpoint image #1 is an image in which the circular area, whose parallax value is a predetermined value other than 0, has a color different from that of the surrounding area, whose parallax value is 0. Therefore, in the viewpoint image #1 after the warping process, the circular area whose color differs from its surroundings has moved to the right side compared with the viewpoint image #1 before the warping process, and an occlusion area exists on the left side of that area.
- the viewpoint image warping processing unit 168 performs the warping process of the viewpoint image # 2 corresponding to the parallax image # 2 based on the parallax image # 2 of the virtual viewpoint.
- Likewise, the viewpoint image #2 is an image in which the circular area, whose parallax value is a predetermined value other than 0, has a color different from that of the surrounding area, whose parallax value is 0. Therefore, in the viewpoint image #2 after the warping process, the circular area whose color differs from its surroundings has moved to the left side compared with the viewpoint image #2 before the warping process, and an occlusion area exists on the right side of that area.
- the multi-viewpoint image generation unit 171 interpolates the occlusion area of one of the viewpoint images # 1 and # 2 after the warping process with the other viewpoint image.
- Note that the parallax image specifying information is information for specifying the parallax image of the viewpoint outside the virtual viewpoint.
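The interpolation performed by the multi-viewpoint image generation unit 171 can be illustrated as follows. The sketch assumes occluded pixels are marked with a sentinel value of 0 (in the text the occlusion area appears as a black image); a real implementation would carry an explicit occlusion map, and the function name is hypothetical.

```python
def fill_occlusion(primary, secondary, occluded=0):
    """Fill the occlusion area of one warped viewpoint image with the other.

    A pixel equal to `occluded` in `primary` is treated as belonging to the
    occlusion area and is replaced by the co-located pixel of `secondary`.
    """
    return [[q if p == occluded else p
             for p, q in zip(prow, qrow)]
            for prow, qrow in zip(primary, secondary)]
```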
- FIG. 7 is a diagram illustrating a parallax image warping process.
- In Equation (1), R is the rotation matrix used in the warping process for the photographing units 51A to 51C that photograph the parallax images, and is represented by the following Equation (2).
- r_11 to r_13, r_21 to r_23, and r_31 to r_33 are predetermined values.
- A is a matrix containing the internal parameters of the photographing units 51A to 51C that photograph the parallax images, and is represented by the following Equation (3).
- focal_length_x and focal_length_y represent the focal lengths in the x and y directions included in the internal parameters, respectively.
- principal_point_x and principal_point_y represent the positions of the principal points included in the internal parameters in the x and y directions, respectively.
- radial_distortion represents a radial distortion coefficient included in the internal parameter.
- R′ is the rotation matrix, expressed in the same manner as R, used in the warping process for the virtual photographing unit that would photograph the parallax image of the virtual viewpoint, and A′ is a matrix, expressed in the same manner as A, containing the internal parameters of that virtual photographing unit.
- By Equation (1), the position m′(x′, y′, z′) on the parallax image of the virtual viewpoint corresponding to the position m(x, y, z) of each pixel of the selected parallax image is obtained, and the pixel of the parallax image of the virtual viewpoint corresponding to each pixel of the selected parallax image is thereby determined.
- Note that when a plurality of pixels of the selected parallax image correspond to the same pixel of the parallax image of the virtual viewpoint, the pixel having the largest parallax value among those pixels, that is, the pixel corresponding to the subject on the near side, is made to correspond to that pixel of the parallax image of the virtual viewpoint.
- Then, the parallax value of each pixel of the selected parallax image is set as the parallax value of the corresponding pixel of the parallax image of the virtual viewpoint; as a result, the parallax image of the virtual viewpoint is generated.
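The per-pixel mapping described above can be sketched as a forward warp. Equation (1) itself is not reproduced in this excerpt, so the sketch below replaces the full R/A camera model with a horizontal shift proportional to the parallax value (an assumption, valid only for a rectified horizontal camera arrangement), while keeping the rule described above that the largest parallax value, i.e. the nearest subject, wins when several pixels map to the same target.

```python
def warp_parallax_row(parallax_row, shift_scale=1.0):
    """Forward-warp one row of a parallax image to a virtual viewpoint.

    The true mapping m -> m' uses Equation (1) with R, A, R', A'; here it is
    approximated as x' = x + shift_scale * d. Unmapped target pixels keep
    parallax 0 (they form the occlusion area).
    """
    width = len(parallax_row)
    out = [0] * width
    for x, d in enumerate(parallax_row):
        xp = x + int(round(shift_scale * d))   # target position for this pixel
        if 0 <= xp < width and d > out[xp]:
            out[xp] = d                        # largest parallax (near side) wins
    return out
```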
- FIG. 8 is a diagram illustrating a parallax image before resolution reduction by the parallax image generation unit 53 of the encoding device 50 and the result of warping that parallax image.
- In FIG. 8, a small circle represents a pixel, and the pattern of the circle represents a parallax value. The same applies to FIGS. 9 and 10 described later.
- As shown in A of FIG. 8, the parallax image before resolution reduction is a parallax image in which the parallax value of a circular area located at the center of the screen is a predetermined value significantly different from 0, and the parallax value of the other area is 0.
- In this case, in the parallax image of the virtual viewpoint obtained as a result of warping the parallax image before resolution reduction, the circular area moves to the left by, for example, 10 pixels, as shown in B of FIG. 8.
- FIG. 9 is a diagram illustrating the parallax image reduced in resolution by the parallax image generation unit 53 of the encoding device 50, the parallax image increased in resolution by the parallax image resolution increasing unit 162, and the result of warping the resolution-increased parallax image.
- When the parallax image generation unit 53 uses the average of the parallax values of two horizontally adjacent pixels of the parallax image before resolution reduction in A of FIG. 8 as the parallax value of the corresponding pixel of the reduced-resolution parallax image, the reduced-resolution parallax image is as shown in A of FIG. 9. In A of FIG. 9, the parallax value of a pixel represented by a circle with vertical lines is the average of the parallax value of a pixel represented by a circle with a check pattern and the parallax value of a pixel represented by a gray circle.
- In A of FIG. 9, for convenience of explanation, the pixels before resolution reduction are represented by circles, and the pixels thinned out by the resolution reduction are represented by circles without a pattern.
- The parallax image resolution increasing unit 162 increases the resolution by, for example, linearly interpolating the parallax value of each pixel of the resolution-increased parallax image using the parallax values of two horizontally adjacent pixels of the reduced-resolution parallax image in A of FIG. 9. As a result, the high-resolution parallax image in B of FIG. 9 is generated.
- In B of FIG. 9, the parallax value of a pixel represented by a circle with horizontal lines is a value within the range between the parallax value of the pixel represented by the circle with vertical lines and the parallax value of the pixel represented by the circle with a check pattern.
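The pairing of the encoder-side averaging and the decoder-side linear interpolation described for FIGS. 8 and 9 can be sketched on a single row of parallax values. The helper names are hypothetical, and the document does not fix the exact filter; this is a minimal sketch.

```python
def reduce_resolution(row):
    """Encoder side: halve the width, each output sample being the average of
    two horizontally adjacent parallax values (parallax image generation unit 53)."""
    return [(row[2 * i] + row[2 * i + 1]) / 2 for i in range(len(row) // 2)]

def increase_resolution(row):
    """Decoder side: double the width by linear interpolation between adjacent
    low-resolution samples (parallax image resolution increasing unit 162).
    Interpolated samples lie within the range of their two neighbours, as
    noted for B of FIG. 9; the last sample is repeated to restore the width."""
    out = []
    for i, v in enumerate(row):
        out.append(v)
        if i + 1 < len(row):
            out.append((v + row[i + 1]) / 2)
    out.append(row[-1])
    return out
```

Note that the round trip is lossy: a sharp parallax edge such as [0, 0, 4, 4] reduces to [0.0, 4.0] and comes back as [0.0, 2.0, 4.0, 4.0], which is why warping the resolution-restored parallax image can leave non-zero parallax values in the occlusion area.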
- Therefore, the smoothing processing unit 165 and the smoothing processing unit 166 perform the smoothing process to correct parallax values other than 0 in the occlusion area to 0.
- FIG. 10 is a diagram for explaining the smoothing processing by the smoothing processing unit 165 (166) of FIG.
- As shown on the left side of FIG. 10, the smoothing processing unit 165 (166) detects pixels having parallax values other than 0 in the occlusion area. Then, as shown on the right side of FIG. 10, the smoothing processing unit 165 (166) corrects the parallax values of those pixels to 0.
- As described above, the smoothing processing unit 165 (166) detects pixels having parallax values other than 0 in the occlusion area and corrects their parallax values to 0. It is therefore possible to prevent pixel values from being erroneously placed in the occlusion area of the image of the virtual viewpoint, and as a result, to prevent corruption of the virtual viewpoint image.
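The correction in FIG. 10 can be sketched directly. The occlusion mask is passed in explicitly here, an assumption made for illustration; in the text the occlusion area is the set of pixels left unmapped by the warping process, and the function name is hypothetical.

```python
def smooth_occlusion(parallax, occlusion_mask):
    """Correct parallax values other than 0 inside the occlusion area to 0,
    as the smoothing processing unit 165 (166) does, so that no pixel value
    is erroneously placed there during the later viewpoint image warping.
    occlusion_mask[y][x] is True where the pixel belongs to an occlusion area."""
    return [[0 if occ and d != 0 else d
             for d, occ in zip(drow, mrow)]
            for drow, mrow in zip(parallax, occlusion_mask)]
```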
- FIG. 11 is a flowchart illustrating a decoding process performed by the decoding device 120 in FIG. This decoding process is started, for example, when a multiplexed stream transmitted from the encoding device 50 in FIG. 1 is input to the decoding device 120.
- In step S61, the separation unit 121 of the decoding device 120 receives the multiplexed stream transmitted from the encoding device 50 and separates it into individual TSs.
- the separation unit 121 extracts a compatible stream and an auxiliary stream from the separated TS and supplies them to the decoder 122.
- the auxiliary decoder 132 of the decoder 122 supplies the compatibility information included in the auxiliary stream supplied from the separator 121 to the compatibility decoder 131.
- In step S62, the compatible decoder 131 identifies the compatible stream among the compatible stream and the auxiliary stream supplied from the separation unit 121, based on the compatibility information supplied from the auxiliary decoder 132.
- In step S63, based on the compatibility information, the compatible decoder 131 decodes the compatible image included in the compatible stream using a method corresponding to the AVC method, and supplies the resulting viewpoint image A1 to the image generation unit 127.
- In step S64, the image generation unit 127 determines whether display of a 2D image has been instructed by the viewer. If it is determined in step S64 that display of a 2D image has not been instructed, that is, if display of a glasses-free 3D image has been instructed, the auxiliary decoder 132 identifies the auxiliary stream among the compatible stream and the auxiliary stream supplied from the separation unit 121, based on the compatibility information.
- In step S65, the auxiliary decoder 132 decodes the encoded multiplexed image of the auxiliary images, the parallax image A1′ of the compatible image, and the multiplexed image of the parallax images of the auxiliary images included in the auxiliary stream, using a method corresponding to the auxiliary encoder 62 in FIG. 1.
- the auxiliary decoder 132 supplies the image generation unit 127 with the multiplexed image of the auxiliary image, the parallax image A1 ′ of the compatible image, and the multiplexed image of the parallax image of the auxiliary image obtained as a result of the decoding.
- the auxiliary decoder 132 supplies the image information included in the auxiliary stream to the image information acquisition unit 123 and supplies the shooting information to the shooting information acquisition unit 124. Further, the auxiliary decoder 132 supplies the parallax image multiplexing information included in the auxiliary stream to the parallax image multiplexing information acquisition unit 125 and supplies the compatibility information to the compatibility information acquisition unit 126.
- In step S66, the image information acquisition unit 123 acquires the image information supplied from the auxiliary decoder 132 and inputs it to the image generation unit 127.
- In step S67, the shooting information acquisition unit 124 acquires the shooting information supplied from the auxiliary decoder 132 and inputs it to the image generation unit 127.
- In step S68, the parallax image multiplexing information acquisition unit 125 acquires the parallax image multiplexing information supplied from the auxiliary decoder 132 and inputs it to the image generation unit 127.
- In step S69, the compatibility information acquisition unit 126 acquires the compatibility information supplied from the auxiliary decoder 132 and inputs it to the image generation unit 127.
- In step S70, the 3D image generation unit 142 of the image generation unit 127 performs the multi-viewpoint image generation process that generates a composite image of the multi-viewpoint images. Details of the multi-viewpoint image generation process will be described with reference to FIG. 12.
- In step S71, the image generation unit 127 outputs the composite image of the multi-viewpoint images generated by the process in step S70 to a display device (not shown) and displays it so that the viewable angle differs for each viewpoint. Then, the process ends.
- On the other hand, if it is determined in step S64 that display of a 2D image has been instructed by the viewer, in step S72 the 2D image generation unit 141 of the image generation unit 127 outputs the viewpoint image A1, which is the compatible image supplied from the compatible decoder 131, to a display device (not shown) and displays it. Then, the process ends.
- FIG. 12 is a flowchart for explaining the details of the multi-viewpoint image generation process in step S70 of FIG.
- In step S90, the virtual viewpoint position determination unit 160 (FIG. 5) of the 3D image generation unit 142 determines the position of each virtual viewpoint based on the inter-viewpoint distance included in the shooting information supplied from the shooting information acquisition unit 124 and the number of viewpoints corresponding to a display device (not shown). Then, based on the position of each virtual viewpoint, the virtual viewpoint position determination unit 160 generates parallax image specifying information for each of the parallax image warping processing unit 163 and the parallax image warping processing unit 164. Further, the virtual viewpoint position determination unit 160 supplies the position of each virtual viewpoint and the corresponding parallax image specifying information to the parallax image warping processing unit 163 and the parallax image warping processing unit 164.
- In step S91, the parallax image separation unit 161 acquires the parallax image A1′ of the compatible image and the multiplexed image of the parallax images of the auxiliary images supplied from the auxiliary decoder 132.
- In step S92, the parallax image separation unit 161 separates the multiplexed image of the parallax images of the auxiliary images based on the parallax image multiplexing information supplied from the parallax image multiplexing information acquisition unit 125.
- The parallax image separation unit 161 supplies the resulting parallax images of the viewpoint image B1 and the viewpoint image C1, which are the auxiliary images and have half the resolution of the compatible image, to the parallax image resolution increasing unit 162. Further, based on the compatibility information supplied from the compatibility information acquisition unit 126, the parallax image separation unit 161 supplies the parallax image A1′ as it is to the parallax image resolution increasing unit 162.
- the parallax image resolution enhancement unit 162 obtains the parallax images of the viewpoint image B and the viewpoint image C, which are auxiliary images, having a resolution half that of the compatible image supplied from the parallax image separation unit 161. Increase the resolution respectively. Thereby, the parallax image resolution increasing unit 162 obtains the parallax images of the viewpoint image B1 and the viewpoint image C1 having the same resolution as the compatible image. Then, the parallax image resolution enhancement unit 162 converts the obtained parallax images of the viewpoint image B1 and the viewpoint image C1 and the parallax image A1 ′ supplied from the parallax image separation unit 161 into the parallax image warping processing unit 163 and the parallax. The image warping processing unit 164 is supplied.
- In step S94, for each virtual viewpoint, the parallax image warping processing unit 163 selects one of the parallax images of the viewpoint image B1 and the viewpoint image C1 and the parallax image A1′ supplied from the parallax image resolution enhancement unit 162, based on the parallax image specifying information from the virtual viewpoint position determination unit 160.
- the parallax image warping processing unit 164 performs the same processing as the parallax image warping processing unit 163.
- In steps S95 and S96, the parallax image warping processing unit 163 and the parallax image warping processing unit 164 perform warping processing on the selected parallax images.
- In step S95, for each virtual viewpoint, the parallax image warping processing unit 163 (164) determines, based on the position of the virtual viewpoint, the shooting information, and the selected parallax image, the pixel on the parallax image of the virtual viewpoint corresponding to each pixel of the selected parallax image, in accordance with Equation (1) described above.
- In step S96, for each virtual viewpoint, the parallax image warping processing unit 163 (164) generates the parallax image of the virtual viewpoint by setting the parallax value of each pixel of the selected parallax image as the parallax value of the corresponding pixel determined in step S95.
- The parallax image warping processing unit 163 (164) supplies the parallax image of each virtual viewpoint generated as a result to the smoothing processing unit 165 (166).
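As a rough illustration of the warping processing above, the following sketch forward-warps one scanline of a parallax image to a virtual viewpoint. The scale factor `alpha` (derived in practice from the virtual viewpoint position and the shooting information) and the foreground-wins collision rule are assumptions for illustration only; unmapped destination pixels keep parallax 0, which models the occlusion areas that appear after warping.

```python
def warp_parallax_scanline(parallax, alpha):
    """Forward-warp one scanline of parallax values to a virtual viewpoint.

    parallax : list of int parallax values for one scanline
    alpha    : hypothetical scale factor standing in for the mapping derived
               from the virtual viewpoint position and the shooting information
    """
    width = len(parallax)
    warped = [0] * width                   # unmapped pixels stay 0 (occlusion)
    for x, d in enumerate(parallax):
        tx = x + int(round(alpha * d))     # destination pixel in the virtual view
        if 0 <= tx < width and warped[tx] < d:
            warped[tx] = d                 # foreground (larger parallax) wins
    return warped
```

With a foreground object of parallax 5 on a zero-parallax background, the object shifts toward the virtual viewpoint and leaves a hole of parallax 0 behind it.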
- After the processing in step S96, in steps S97 and S98, the smoothing processing unit 165 and the smoothing processing unit 166 perform smoothing processing on the parallax images of each virtual viewpoint generated in step S96.
- In step S97, the smoothing processing unit 165 (166) detects, among the pixels having a parallax value other than 0 in the parallax image of each virtual viewpoint, the pixels whose peripheral pixels have a parallax value of 0, as pixels having a parallax value other than 0 in the occlusion area.
- In step S98, the smoothing processing unit 165 (166) corrects to 0 the parallax values of the pixels detected in step S97 in the parallax image of each virtual viewpoint, and supplies the corrected parallax image of each virtual viewpoint to the viewpoint image warping processing unit 167 (168).
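The smoothing of steps S97 and S98 can be sketched as follows. This is a simplified 1-D illustration in which "peripheral pixels" is assumed to mean the two horizontal neighbours; the text does not fix the exact neighbourhood.

```python
def smooth_virtual_parallax(parallax):
    """Zero out isolated non-zero parallax values (steps S97-S98 analogue).

    A pixel whose parallax is non-zero but whose horizontal neighbours are
    both 0 is treated as a stray value inside an occlusion area and is
    corrected to 0.
    """
    out = list(parallax)
    for x, d in enumerate(parallax):
        if d == 0:
            continue
        left = parallax[x - 1] if x > 0 else 0
        right = parallax[x + 1] if x + 1 < len(parallax) else 0
        if left == 0 and right == 0:     # surrounded by parallax-0 pixels
            out[x] = 0                   # correct the stray parallax value
    return out
```

A lone non-zero value surrounded by zeros is removed, while a contiguous run of non-zero parallax (an actual object) is preserved.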
- The viewpoint image separation unit 169 separates the multiplexed image of the auxiliary images supplied from the auxiliary decoder 132 based on the image information supplied from the image information acquisition unit 123. Then, the viewpoint image separation unit 169 supplies the resulting viewpoint image B1 and viewpoint image C1, each having half the resolution of the compatible image, to the viewpoint image resolution increasing unit 170. In addition, based on the compatibility information supplied from the compatibility information acquisition unit 126, the viewpoint image separation unit 169 supplies the viewpoint image A1, the compatible image supplied from the compatibility decoder 131, to the viewpoint image resolution increasing unit 170 as it is.
- The viewpoint image resolution increasing unit 170 increases the resolution of each of the viewpoint image B1 and the viewpoint image C1, the auxiliary images, supplied from the viewpoint image separation unit 169 at half the resolution of the compatible image. The viewpoint image resolution increasing unit 170 thereby obtains the viewpoint image B1 and the viewpoint image C1 having the same resolution as the compatible image, and supplies them, together with the viewpoint image A1 supplied from the viewpoint image separation unit 169, to the viewpoint image warping processing unit 167 and the viewpoint image warping processing unit 168.
- In step S102, for each virtual viewpoint, the multi-viewpoint image generation unit 171 interpolates the occlusion area of the virtual viewpoint image supplied from one of the viewpoint image warping processing unit 167 and the viewpoint image warping processing unit 168 with the virtual viewpoint image supplied from the other. The multi-viewpoint image generation unit 171 supplies the resulting image of each virtual viewpoint to the multi-viewpoint image composition processing unit 172 as a multi-viewpoint image.
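The interpolation of step S102 can be illustrated with a minimal sketch. The sentinel value `None`, marking occlusion pixels that received no value during warping, is an assumption for illustration; the two inputs are the same virtual viewpoint rendered from two different reference views.

```python
def interpolate_occlusion(img_a, img_b, hole=None):
    """Fill occlusion holes of one warped view with pixels from the other.

    img_a, img_b : lists of pixel values for the same virtual viewpoint,
                   produced from two different reference views
    hole         : hypothetical sentinel marking pixels with no value
    """
    # wherever img_a has a hole, take the co-located pixel of img_b
    return [b if a is hole else a for a, b in zip(img_a, img_b)]
```

Because the two reference views occlude different regions, a hole in one warped image usually has a valid pixel at the same position in the other.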
- In step S103, the multi-viewpoint image composition processing unit 172 converts the resolution of the multi-viewpoint images supplied from the multi-viewpoint image generation unit 171 to 1/(the number of virtual viewpoints) of the resolution of the compatible image or the auxiliary image, and combines them to generate a composite image of the multi-viewpoint images. The process then returns to step S70 of FIG. 11 and proceeds to step S71.
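A minimal sketch of the composition in step S103, assuming simple column decimation and interleaving as one plausible way to reduce each view to 1/(number of viewpoints) of its resolution and combine them for a lenticular-style display; the patent does not specify the exact combining rule.

```python
def synthesize_multiview(views):
    """Combine N same-width views into one composite by column interleaving.

    Taking every N-th column from each view reduces each view to 1/N of its
    horizontal resolution while keeping the composite at full width; a
    lenticular sheet can then show a different view per angle.
    """
    n = len(views)
    width = len(views[0])
    # output column x is owned by view (x mod n)
    return [views[x % n][x] for x in range(width)]
```

For two views, even columns come from view 0 and odd columns from view 1.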
- As described above, since the decoding device 120 performs the smoothing process on the virtual-viewpoint parallax images obtained as a result of the warping process, it can prevent pixel values from appearing in the occlusion areas of the virtual viewpoint images. As a result, the occlusion area of one of the two images of a virtual viewpoint can be interpolated with the other image, and a highly accurate virtual viewpoint image can be generated without breakdown.
- FIG. 13 is a block diagram illustrating a configuration example of a second embodiment of an encoding device corresponding to an image processing device to which the present technology is applied.
- The configuration of the encoding device 200 of FIG. 13 differs from the configuration of FIG. 1 mainly in that a boundary information generation unit 201 is newly provided and that an encoder 202 is provided instead of the encoder 58.
- The encoding device 200 adds boundary information (is_depth_edge) to the encoded stream for transmission. The boundary information indicates, for each pixel of the reduced-resolution parallax image, whether or not the pixel is adjacent to a boundary position, that is, a position where the parallax value of the parallax image changes greatly.
- the boundary information generation unit 201 of the encoding device 200 detects the boundary position from the parallax image B1 ′ and the parallax image C1 ′ generated by the parallax image generation unit 53, respectively.
- the boundary information generation unit 201 generates boundary information in units of pixels or macroblocks based on the detected boundary position and supplies the boundary information to the encoder 202.
- a macroblock is a unit of encoding.
- the encoder 202 includes a compatible encoder 61 and an auxiliary encoder 211.
- The auxiliary encoder 211 of the encoder 202 encodes, by a predetermined method, the multiplexed image of the auxiliary images from the image conversion unit 52, and the parallax image A1′ of the compatible image and the multiplexed image of the parallax images of the auxiliary images from the parallax image generation unit 53.
- The auxiliary encoder 211 generates an encoded stream by adding, to the encoded image obtained as a result of the encoding, the image information from the image information generation unit 54, the compatibility information from the compatibility information generation unit 55, the shooting information from the shooting information generation unit 56, the parallax image multiplexing information from the parallax image multiplexing information generation unit 57, and the boundary information from the boundary information generation unit 201.
- the auxiliary encoder 211 supplies the encoded stream to the multiplexing unit 59 as an auxiliary stream.
- FIG. 14 is a diagram illustrating an example of boundary information.
- In FIG. 14, the small circles represent pixels, and the pattern of each circle represents its parallax value.
- When boundary information is generated in units of pixels, the boundary information of the two pixels adjacent to the boundary position is 1, indicating that the pixel is adjacent to the boundary position, and the boundary information of the other pixels is 0, indicating that the pixel is not adjacent to the boundary position.
- When boundary information is generated in units of macroblocks, the boundary information of a macroblock (MB) including the two pixels adjacent to the boundary position is 1, indicating that the macroblock contains pixels adjacent to the boundary position, and the boundary information of the other macroblocks is 0.
- FIGS. 15 and 16 are flowcharts for explaining the encoding process by the encoding device 200 of FIG. 13. This encoding process is started, for example, when the viewpoint image A1 to the viewpoint image C1 are output from the imaging unit 51A to the imaging unit 51C.
- The processing of FIG. 15 is the same as the processing from step S10 of FIG. 2 to step S18 of FIG. 3, and thus its description is omitted.
- In step S129 of FIG. 16, the boundary information generation unit 201 performs a boundary information generation process that generates the boundary information of the parallax image B1′ and the parallax image C1′ supplied from the parallax image generation unit 53. Details of this boundary information generation process will be described with reference to FIG. 17.
- In step S130, the boundary information generation unit 201 inputs the boundary information generated in step S129 to the encoder 202.
- The processing in steps S131 to S136 is the same as the processing in steps S19 to S24 of FIG. 3, and thus its description is omitted.
- In step S137, the auxiliary encoder 211 generates an encoded stream by adding, to the encoded image obtained as a result of the encoding in step S136, the image information from the image information generation unit 54, the compatibility information from the compatibility information generation unit 55, the shooting information from the shooting information generation unit 56, the parallax image multiplexing information from the parallax image multiplexing information generation unit 57, the boundary information from the boundary information generation unit 201, and the like.
- the auxiliary encoder 211 supplies the encoded stream to the multiplexing unit 59 as an auxiliary stream.
- In step S138, the multiplexing unit 59 generates TSs from the compatible stream supplied from the compatible encoder 61 and from the auxiliary stream supplied from the auxiliary encoder 211, multiplexes them, and transmits the result. The process then ends.
- FIG. 17 is a flowchart for explaining the details of the boundary information generation process in step S129 of FIG. 16. Note that FIG. 17 describes the boundary information generation process when boundary information is generated in units of pixels. The boundary information generation process of FIG. 17 is performed for each parallax image.
- In step S141 of FIG. 17, the boundary information generation unit 201 acquires a parallax image supplied from the parallax image generation unit 53.
- In step S142, the boundary information generation unit 201 obtains the difference between the parallax values of a pair of horizontally adjacent pixels of the parallax image that has not yet been processed in step S142.
- In step S143, the boundary information generation unit 201 determines whether the difference obtained in step S142 is greater than a predetermined threshold.
- If the difference is determined to be greater than the threshold, in step S144 the boundary information generation unit 201 detects a boundary position between the two adjacent pixels targeted by the process in step S142, and sets the boundary information of those two pixels to 1.
- If the difference is determined to be equal to or less than the threshold, in step S145 the boundary information generation unit 201 sets to 0 the boundary information of the two adjacent pixels targeted by the process in step S142, except where it has already been set to 1.
- In step S146, the boundary information generation unit 201 determines whether the differences between the parallax values of all pairs of horizontally adjacent pixels of the parallax image have been obtained.
- When it is determined in step S146 that the differences have not yet been obtained for all pairs, the process returns to step S142, and steps S142 to S146 are repeated until the differences between the parallax values of all pairs of horizontally adjacent pixels of the parallax image have been obtained.
- When it is determined in step S146 that the differences have been obtained for all pairs, the process returns to step S129 of FIG. 16 and proceeds to step S130.
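The pixel-level flow of steps S142 to S146 can be sketched as follows for one scanline; the threshold value passed by the caller is an assumption, as the patent only calls it a predetermined threshold.

```python
def generate_boundary_info(parallax, threshold):
    """Pixel-level boundary information (steps S142-S146 analogue).

    Where the parallax difference of two horizontally adjacent pixels
    exceeds `threshold`, both pixels are flagged 1 (adjacent to a boundary
    position); all other pixels stay 0.
    """
    info = [0] * len(parallax)
    for x in range(len(parallax) - 1):
        if abs(parallax[x] - parallax[x + 1]) > threshold:
            info[x] = 1          # left pixel of the boundary position
            info[x + 1] = 1      # right pixel of the boundary position
    return info
```

A step from 0 to 9 with a threshold of 3 flags the two pixels on either side of each jump, matching the example of FIG. 14.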
- When boundary information is generated in units of macroblocks, the boundary information generation unit 201 performs the processing of steps S142 to S146 in units of macroblocks in the boundary information generation process of FIG. 17.
- Specifically, in step S142, the boundary information generation unit 201 obtains the differences between the parallax values of horizontally adjacent pixels within a predetermined macroblock of the parallax image, and in step S143 determines whether at least one of those differences is greater than the predetermined threshold. If at least one difference is determined to be greater than the predetermined threshold, in step S144 the boundary information generation unit 201 sets the boundary information of the corresponding macroblock to 1. On the other hand, if all the differences are determined to be equal to or less than the predetermined threshold, in step S145 the boundary information generation unit 201 sets to 0 the boundary information of the corresponding macroblock, unless it has been set to 1.
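The macroblock-unit variant can be sketched analogously; the 1-D block size `mb_size` is a stand-in for the two-dimensional 16x16 macroblock used in actual encoding.

```python
def generate_mb_boundary_info(parallax, threshold, mb_size):
    """Macroblock-level boundary information sketch.

    A block is flagged 1 if any horizontally adjacent pixel pair inside it
    differs by more than `threshold`; otherwise its flag stays 0.
    """
    n_blocks = (len(parallax) + mb_size - 1) // mb_size
    info = [0] * n_blocks
    for b in range(n_blocks):
        start = b * mb_size
        end = min(start + mb_size, len(parallax))
        for x in range(start, end - 1):
            if abs(parallax[x] - parallax[x + 1]) > threshold:
                info[b] = 1      # block contains a boundary position
                break
    return info
```

A block containing a large parallax jump is flagged as a whole, while homogeneous blocks are not.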
- As described above, the encoding device 200 reduces the resolution of the parallax images and transmits the reduced-resolution parallax images together with the boundary information. Therefore, in the decoding device described later, the smoothing process can be performed on the virtual-viewpoint parallax images based on the boundary information. As a result, as will be described later, the decoding device can generate virtual-viewpoint images with high accuracy.
- FIG. 18 is a diagram illustrating a configuration example of a decoding device, as an image processing device to which the present technology is applied, that decodes the multiplexed stream transmitted from the encoding device 200 of FIG. 13.
- The configuration of the decoding device 220 of FIG. 18 differs from the configuration of FIG. 4 mainly in that a boundary information acquisition unit 222 is newly provided and that a decoder 221 and an image generation unit 223 are provided instead of the decoder 122 and the image generation unit 127.
- The decoding device 220 decodes the multiplexed stream transmitted from the encoding device 200, smooths the parallax images after the warping process based on the boundary information, generates a multi-viewpoint image, and displays it on a display device (not shown).
- the decoder 221 of the decoding device 220 includes a compatible decoder 131 and an auxiliary decoder 231.
- the auxiliary decoder 231 of the decoder 221 supplies compatible information included in the auxiliary stream supplied from the separation unit 121 to the compatible decoder 131, similarly to the auxiliary decoder 132 of FIG. 4. Similar to the auxiliary decoder 132, the auxiliary decoder 231 identifies an auxiliary stream among the compatible stream and the auxiliary stream supplied from the separation unit 121 based on the compatibility information.
- The auxiliary decoder 231 functions as a decoding unit, and decodes, by a method corresponding to the auxiliary encoder 211 of FIG. 13, the encoded multiplexed image of the auxiliary images, the parallax image of the compatible image, and the multiplexed image of the parallax images of the auxiliary images included in the auxiliary stream supplied from the separation unit 121.
- the auxiliary decoder 231 supplies the image generation unit 223 with the multiplexed image of the auxiliary image, the parallax image of the compatible image, and the multiplexed image of the parallax image of the auxiliary image obtained as a result of the decoding.
- the auxiliary decoder 231 supplies the image information included in the auxiliary stream to the image information acquisition unit 123 and supplies the shooting information to the shooting information acquisition unit 124.
- the auxiliary decoder 231 supplies the parallax image multiplexing information included in the auxiliary stream to the parallax image multiplexing information acquisition unit 125 and supplies the compatibility information to the compatibility information acquisition unit 126.
- the auxiliary decoder 231 supplies boundary information included in the auxiliary stream to the boundary information acquisition unit 222.
- the boundary information acquisition unit 222 acquires boundary information supplied from the auxiliary decoder 231 and supplies it to the image generation unit 223.
- the image generation unit 223 includes a 2D image generation unit 141 and a 3D image generation unit 241.
- The 3D image generation unit 241 of the image generation unit 223 uses the viewpoint image A1, the multiplexed image of the auxiliary images, the parallax image A1′ of the compatible image, and the multiplexed image of the parallax images of the auxiliary images supplied from the decoder 221 to generate, based on the image information, the shooting information, the parallax image multiplexing information, the compatibility information, the boundary information, and the like, images of three or more viewpoints corresponding to a display device (not shown), each having the same resolution as the compatible image.
- Similarly to the 3D image generation unit 142 of FIG. 4, the 3D image generation unit 241 converts the resolution of the generated multi-viewpoint images to 1/(the number of viewpoints) of the resolution of the compatible image or the auxiliary image, combines them, and displays the result on a display device (not shown).
- the combined multi-viewpoint image is displayed so that the viewable angle differs for each viewpoint.
- a viewer can view a 3D image without wearing glasses by viewing each image of two arbitrary viewpoints with the left and right eyes.
- FIG. 19 is a block diagram illustrating a detailed configuration example of the 3D image generation unit 241 in FIG.
- the configuration of the 3D image generation unit 241 in FIG. 19 is different from the configuration in FIG. 5 in that a smoothing processing unit 251 and a smoothing processing unit 252 are provided instead of the smoothing processing unit 165 and the smoothing processing unit 166.
- The smoothing processing unit 251 of the 3D image generation unit 241 functions as a correction unit, and performs a smoothing process on the parallax image of each virtual viewpoint supplied from the parallax image warping processing unit 163, based on the boundary information supplied from the boundary information acquisition unit 222. Specifically, the smoothing processing unit 251 detects, in the parallax image of each virtual viewpoint, pixels having a parallax value other than 0 in the occlusion area.
- Among the detected pixels, the smoothing processing unit 251 corrects to 0 the parallax values of the pixels whose boundary information indicates adjacency to a boundary position.
- the smoothing processing unit 251 supplies the parallax images of each virtual viewpoint after the smoothing process to the viewpoint image warping processing unit 167.
- the smoothing processing unit 252 functions as a correction unit, and performs a smoothing process on the parallax images of each virtual viewpoint supplied from the parallax image warping processing unit 164 in the same manner as the smoothing processing unit 251. Then, the smoothing processing unit 252 supplies the parallax images of each virtual viewpoint after the smoothing process to the viewpoint image warping processing unit 168.
- FIG. 20 is a diagram for explaining the smoothing process based on the boundary information performed by the smoothing processing unit 251 (252) of FIG. 19.
- In FIG. 20, each small circle represents a pixel, and the pattern of each circle represents its parallax value.
- As shown on the left side of FIG. 20, when the parallax image warping processing unit 163 (164) of FIG. 19 generates the parallax image of the virtual viewpoint of C of FIG. 9, the smoothing processing unit 251 (252) detects the pixels having a parallax value other than 0 in the occlusion area. Then, for each detected pixel whose boundary information is 1, the smoothing processing unit 251 (252) corrects the parallax value of that pixel to 0, as shown on the right side of FIG. 20.
- In this way, among the pixels detected as having a parallax value other than 0 in the occlusion area, the smoothing processing unit 251 (252) corrects to 0 only the parallax values of pixels adjacent to a boundary position, which may have been erroneously placed in the occlusion area by the resolution-increasing process. This prevents a pixel outside the occlusion area from being detected as a pixel having a parallax value other than 0 in the occlusion area and having its parallax value corrected to 0. As a result, virtual viewpoint images can be generated with higher accuracy.
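A sketch of this boundary-aware correction, combining the occlusion-area detection with the boundary information; this is a simplified 1-D illustration in which the peripheral pixels are assumed to be the two horizontal neighbours.

```python
def smooth_with_boundary_info(parallax, boundary):
    """Boundary-aware smoothing (smoothing unit 251/252 analogue).

    Only a pixel that (a) carries a non-zero parallax value surrounded by
    zero-parallax neighbours and (b) is flagged as boundary-adjacent
    (boundary info 1) is corrected to 0; stray values away from a boundary
    position are left untouched.
    """
    out = list(parallax)
    for x, d in enumerate(parallax):
        if d == 0 or boundary[x] == 0:
            continue                     # not a boundary-adjacent stray value
        left = parallax[x - 1] if x > 0 else 0
        right = parallax[x + 1] if x + 1 < len(parallax) else 0
        if left == 0 and right == 0:
            out[x] = 0                   # correct only boundary-adjacent pixels
    return out
```

In the example below, the first isolated value sits on a flagged boundary pixel and is removed, while the second isolated value has boundary information 0 and survives.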
- FIG. 21 is a flowchart for explaining the decoding process by the decoding device 220 of FIG. 18. This decoding process is started, for example, when the multiplexed stream transmitted from the encoding device 200 of FIG. 13 is input to the decoding device 220.
- In step S210, the boundary information acquisition unit 222 acquires the boundary information supplied from the auxiliary decoder 231 and inputs it to the image generation unit 223.
- In step S211, the 3D image generation unit 241 of the image generation unit 223 performs a multi-viewpoint image generation process. Details of this multi-viewpoint image generation process will be described with reference to FIG. 22.
- FIG. 22 is a flowchart for explaining the details of the multi-viewpoint image generation process in step S211 of FIG. 21.
- In step S237, the smoothing processing unit 251 (252) detects, among the pixels having a parallax value other than 0 in the parallax image of each virtual viewpoint, the pixels whose boundary information is 1 and whose peripheral pixels have a parallax value of 0, as pixels having a parallax value other than 0 in the occlusion area.
- In step S238, the smoothing processing unit 251 (252) corrects to 0 the parallax values of the pixels detected in step S237 in the parallax image of each virtual viewpoint, and supplies the corrected parallax image of each virtual viewpoint to the viewpoint image warping processing unit 167 (168).
- The processing in steps S239 to S243 is the same as the processing in steps S99 to S103 of FIG. 12, and thus its description is omitted.
- As described above, based on the boundary information transmitted from the encoding device 200, the decoding device 220 performs the smoothing process more accurately on the virtual-viewpoint parallax images obtained as a result of the warping process. This prevents a pixel outside the occlusion area from being detected as a pixel having a parallax value other than 0 in the occlusion area and having its parallax value corrected to 0. As a result, virtual viewpoint images can be generated with higher accuracy.
- the image processing apparatus of the present technology can be applied to, for example, a display apparatus and a reproduction apparatus such as a television receiver.
- The number of pixels regarded as adjacent to the boundary position is not limited to two; the pixels adjacent to the boundary position may be a plurality of pixels to the left of the boundary position and a plurality of pixels to the right of it.
- the image information, the shooting information, the parallax image multiplexing information, the compatibility information, and the boundary information may be transmitted in a system different from the encoded stream without being encoded. Further, the image information, the shooting information, the parallax image multiplexing information, the compatibility information, and the boundary information may be encoded and transmitted by a system different from the encoded stream.
- the image information, the shooting information, the parallax image multiplexing information, the compatibility information, and the boundary information can be described in a predetermined region of the encoded stream without being encoded, or can be encoded and encoded. It can also be described in a predetermined area of the stream.
- FIG. 24 shows a configuration example of an embodiment of a computer in which a program for executing the series of processes described above is installed.
- the program can be recorded in advance in a storage unit 808 or a ROM (Read Only Memory) 802 as a recording medium built in the computer.
- the program can be stored (recorded) in the removable medium 811.
- Such a removable medium 811 can be provided as so-called packaged software.
- Examples of the removable medium 811 include a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a magnetic disk, and a semiconductor memory.
- The program can be installed on the computer from the removable medium 811 described above via the drive 810, or can be downloaded to the computer via a communication network or a broadcast network and installed in the built-in storage unit 808. That is, the program can be transferred to the computer wirelessly from a download site via an artificial satellite for digital satellite broadcasting, or transferred to the computer by wire via a network such as a LAN (Local Area Network) or the Internet.
- the computer incorporates a CPU (Central Processing Unit) 801, and an input / output interface 805 is connected to the CPU 801 via a bus 804.
- the CPU 801 executes a program stored in the ROM 802 according to an instruction input by the user operating the input unit 806 via the input / output interface 805. Alternatively, the CPU 801 loads a program stored in the storage unit 808 to a RAM (Random Access Memory) 803 and executes it.
- the CPU 801 performs processing according to the flowchart described above or processing performed by the configuration of the block diagram described above. Then, for example, the CPU 801 outputs the processing result from the output unit 807 via the input / output interface 805, transmits it from the communication unit 809, or records it in the storage unit 808 as necessary.
- the input unit 806 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 807 includes an LCD (Liquid Crystal Display), a speaker, and the like.
- The processing performed by the computer according to the program does not necessarily have to be performed in time series in the order described in the flowcharts. That is, the processing performed by the computer according to the program also includes processing executed in parallel or individually (for example, parallel processing or object-based processing).
- the program may be processed by one computer (processor), or may be distributedly processed by a plurality of computers. Furthermore, the program may be transferred to a remote computer and executed.
- The present technology can be applied to an encoding device and a decoding device used when receiving data via network media such as satellite broadcasting, cable TV (television), the Internet, and mobile phones, or when processing data on storage media such as optical discs, magnetic disks, and flash memory.
- the above-described encoding device and decoding device can be applied to any electronic device. Examples thereof will be described below.
- FIG. 25 illustrates a schematic configuration of a television apparatus to which the present technology is applied.
- the television apparatus 900 includes an antenna 901, a tuner 902, a demultiplexer 903, a decoder 904, a video signal processing unit 905, a display unit 906, an audio signal processing unit 907, a speaker 908, and an external interface unit 909. Furthermore, the television apparatus 900 includes a control unit 910, a user interface unit 911, and the like.
- the tuner 902 selects a desired channel from the broadcast wave signal received by the antenna 901, demodulates it, and outputs the obtained encoded bit stream to the demultiplexer 903.
- the demultiplexer 903 extracts video and audio packets of the program to be viewed from the encoded bit stream, and outputs the extracted packet data to the decoder 904.
- The demultiplexer 903 supplies packets of data such as an EPG (Electronic Program Guide) to the control unit 910. If the stream is scrambled, descrambling is performed by the demultiplexer 903 or the like.
- the decoder 904 performs packet decoding processing, and outputs video data generated by the decoding processing to the video signal processing unit 905 and audio data to the audio signal processing unit 907.
- the video signal processing unit 905 performs noise removal, video processing according to user settings, and the like on the video data.
- The video signal processing unit 905 also generates video data of a program to be displayed on the display unit 906, image data produced by processing based on an application supplied via a network, and the like.
- the video signal processing unit 905 generates video data for displaying a menu screen for selecting an item and the like, and superimposes the video data on the video data of the program.
- the video signal processing unit 905 generates a drive signal based on the video data generated in this way, and drives the display unit 906.
- the display unit 906 drives a display device (for example, a liquid crystal display element or the like) based on a drive signal from the video signal processing unit 905 to display a program video or the like.
- the audio signal processing unit 907 performs predetermined processing such as noise removal on the audio data, performs D / A conversion processing and amplification processing on the processed audio data, and outputs the audio data to the speaker 908.
- the external interface unit 909 is an interface for connecting to an external device or a network, and transmits and receives data such as video data and audio data.
- a user interface unit 911 is connected to the control unit 910.
- the user interface unit 911 includes an operation switch, a remote control signal receiving unit, and the like, and supplies an operation signal corresponding to a user operation to the control unit 910.
- the control unit 910 is configured using a CPU (Central Processing Unit), a memory, and the like.
- the memory stores a program executed by the CPU, various data necessary for the CPU to perform processing, EPG data, data acquired via a network, and the like.
- the program stored in the memory is read and executed by the CPU at a predetermined timing such as when the television device 900 is activated.
- the CPU executes the program to control each unit so that the television device 900 operates according to user operations.
- the television device 900 is provided with a bus 912 for connecting the tuner 902, the demultiplexer 903, the video signal processing unit 905, the audio signal processing unit 907, the external interface unit 909, and the control unit 910.
- the decoder 904 is provided with the function of the image processing device (image processing method) of the present application. It can therefore decode an encoded bit stream that includes a reduced-resolution parallax image, and generate an image of a predetermined viewpoint with high accuracy using that parallax image.
- FIG. 26 illustrates a schematic configuration of a mobile phone to which the present technology is applied.
- the mobile phone 920 includes a communication unit 922, an audio codec 923, a camera unit 926, an image processing unit 927, a demultiplexing unit 928, a recording/reproducing unit 929, a display unit 930, and a control unit 931. These are connected to each other via a bus 933.
- an antenna 921 is connected to the communication unit 922, and a speaker 924 and a microphone 925 are connected to the audio codec 923. Further, an operation unit 932 is connected to the control unit 931.
- the mobile phone 920 performs various operations such as transmission/reception of audio signals, transmission/reception of e-mail and image data, image shooting, and data recording in various modes such as a voice call mode and a data communication mode.
- the audio signal generated by the microphone 925 is converted into audio data and compressed by the audio codec 923, and the result is supplied to the communication unit 922.
- the communication unit 922 performs audio data modulation processing, frequency conversion processing, and the like to generate a transmission signal.
- the communication unit 922 supplies a transmission signal to the antenna 921 and transmits it to a base station (not shown).
- the communication unit 922 performs amplification, frequency conversion processing, demodulation processing, and the like of the reception signal received by the antenna 921, and supplies the obtained audio data to the audio codec 923.
- the audio codec 923 decompresses the audio data, converts it into an analog audio signal, and outputs the result to the speaker 924.
- the control unit 931 receives character data input by operating the operation unit 932 and displays the input characters on the display unit 930.
- the control unit 931 generates mail data based on a user instruction or the like in the operation unit 932 and supplies the mail data to the communication unit 922.
- the communication unit 922 performs mail data modulation processing, frequency conversion processing, and the like, and transmits the obtained transmission signal from the antenna 921.
- the communication unit 922 performs amplification, frequency conversion processing, demodulation processing, and the like of the reception signal received by the antenna 921, and restores mail data. This mail data is supplied to the display unit 930 to display the mail contents.
- the mobile phone 920 can also store the received mail data in a storage medium using the recording/reproducing unit 929.
- the storage medium is any rewritable storage medium, for example a semiconductor memory such as a RAM or built-in flash memory, or a removable medium such as a hard disk, magnetic disk, magneto-optical disk, optical disk, USB memory, or memory card.
- the image data generated by the camera unit 926 is supplied to the image processing unit 927.
- the image processing unit 927 performs encoding processing of image data and generates encoded data.
- the demultiplexing unit 928 multiplexes the encoded data generated by the image processing unit 927 and the audio data supplied from the audio codec 923 by a predetermined method, and supplies the multiplexed data to the communication unit 922.
- the communication unit 922 performs modulation processing and frequency conversion processing of multiplexed data, and transmits the obtained transmission signal from the antenna 921.
- the communication unit 922 performs amplification, frequency conversion processing, demodulation processing, and the like of the reception signal received by the antenna 921, and restores multiplexed data. This multiplexed data is supplied to the demultiplexing unit 928.
- the demultiplexing unit 928 performs demultiplexing of the multiplexed data, and supplies the encoded data to the image processing unit 927 and the audio data to the audio codec 923.
- the image processing unit 927 performs a decoding process on the encoded data to generate image data.
- the image data is supplied to the display unit 930 and the received image is displayed.
- the audio codec 923 converts the audio data into an analog audio signal, supplies the analog audio signal to the speaker 924, and outputs the received audio.
- the image processing unit 927 is provided with the function of the image processing device (image processing method) of the present application. For this reason, in image data communication, encoded data including a parallax image with a reduced resolution can be decoded, and an image of a predetermined viewpoint can be generated with high accuracy using the parallax image.
- FIG. 27 illustrates a schematic configuration of a recording/reproducing device to which the present technology is applied.
- the recording/reproducing device 940 records, for example, the audio data and video data of a received broadcast program on a recording medium, and provides the recorded data to the user at a timing according to a user instruction.
- the recording/reproducing device 940 can also acquire audio data and video data from another device, for example, and record them on a recording medium. Furthermore, by decoding and outputting the audio data and video data recorded on the recording medium, the recording/reproducing device 940 enables image display and audio output on a monitor device or the like.
- the recording/reproducing device 940 includes a tuner 941, an external interface unit 942, an encoder 943, an HDD (Hard Disk Drive) unit 944, a disk drive 945, a selector 946, a decoder 947, an OSD (On-Screen Display) unit 948, a control unit 949, and a user interface unit 950.
- the tuner 941 selects a desired channel from broadcast signals received by an antenna (not shown), and outputs an encoded bit stream obtained by demodulating the received signal of the desired channel to the selector 946.
- the external interface unit 942 includes at least one of an IEEE 1394 interface, a network interface unit, a USB interface, a flash memory interface, and the like.
- the external interface unit 942 is an interface for connecting to an external device, a network, a memory card, and the like, and receives data such as video data and audio data to be recorded.
- the encoder 943 performs encoding by a predetermined method when the video data and audio data supplied from the external interface unit 942 are not encoded, and outputs an encoded bit stream to the selector 946.
- the HDD unit 944 records content data such as video and audio, various programs, and other data on a built-in hard disk, and reads them from the hard disk during playback.
- the disk drive 945 records and reproduces signals on the mounted optical disk, for example a DVD disc (DVD-Video, DVD-RAM, DVD-R, DVD-RW, DVD+R, DVD+RW, etc.) or a Blu-ray disc.
- the selector 946 selects one of the encoded bit streams from the tuner 941 or the encoder 943 and supplies it to either the HDD unit 944 or the disk drive 945 when recording video or audio. Further, the selector 946 supplies the encoded bit stream output from the HDD unit 944 or the disk drive 945 to the decoder 947 at the time of reproduction of video and audio.
- the decoder 947 performs a decoding process on the encoded bit stream.
- the decoder 947 supplies the video data generated by performing the decoding process to the OSD unit 948.
- the decoder 947 outputs audio data generated by performing the decoding process.
- the OSD unit 948 generates video data for displaying a menu screen for selecting an item and the like, and superimposes it on the video data output from the decoder 947 and outputs the video data.
- a user interface unit 950 is connected to the control unit 949.
- the user interface unit 950 includes an operation switch, a remote control signal receiving unit, and the like, and supplies an operation signal corresponding to a user operation to the control unit 949.
- the control unit 949 is configured using a CPU, a memory, and the like.
- the memory stores programs executed by the CPU and various data necessary for the CPU to perform processing.
- the program stored in the memory is read and executed by the CPU at a predetermined timing such as when the recording/reproducing device 940 is activated.
- the CPU executes the program to control each unit so that the recording/reproducing device 940 operates according to user operations.
- FIG. 28 illustrates a schematic configuration of an imaging apparatus to which the present technology is applied.
- the imaging device 960 images a subject, displays an image of the subject on a display unit, and records it on a recording medium as image data.
- the imaging device 960 includes an optical block 961, an imaging unit 962, a camera signal processing unit 963, an image data processing unit 964, a display unit 965, an external interface unit 966, a memory unit 967, a media drive 968, an OSD unit 969, and a control unit 970. In addition, a user interface unit 971 is connected to the control unit 970. Furthermore, the image data processing unit 964, the external interface unit 966, the memory unit 967, the media drive 968, the OSD unit 969, the control unit 970, and the like are connected via a bus 972.
- the optical block 961 is configured using a focus lens, a diaphragm mechanism, and the like.
- the optical block 961 forms an optical image of the subject on the imaging surface of the imaging unit 962.
- the imaging unit 962 is configured using a CCD or CMOS image sensor, generates an electrical signal corresponding to the optical image by photoelectric conversion, and supplies the electrical signal to the camera signal processing unit 963.
- the camera signal processing unit 963 performs various camera signal processing such as knee correction, gamma correction, and color correction on the electrical signal supplied from the imaging unit 962.
- the camera signal processing unit 963 supplies the image data after the camera signal processing to the image data processing unit 964.
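The camera signal processing described above (knee correction, gamma correction, color correction) can be pictured with a minimal gamma-correction sketch. The power-law curve and the `gamma=2.2` default below are illustrative assumptions chosen for the example; the patent does not specify the correction curve.

```python
def gamma_correct(pixel, gamma=2.2, max_val=255):
    """Apply a simple power-law gamma curve to one 8-bit sample.

    gamma=2.2 is a common display-oriented default used here for
    illustration only.
    """
    normalized = pixel / max_val           # map to [0, 1]
    corrected = normalized ** (1.0 / gamma)
    return round(corrected * max_val)      # back to the 8-bit range

# Mid-range inputs are brightened by the 1/2.2 power curve,
# while black and white endpoints are unchanged.
row = [0, 64, 128, 255]
print([gamma_correct(p) for p in row])
```

The same per-pixel structure applies to knee correction (a piecewise curve compressing highlights) and color correction (a matrix applied across channels).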
- the image data processing unit 964 performs an encoding process on the image data supplied from the camera signal processing unit 963.
- the image data processing unit 964 supplies the encoded data generated by performing the encoding process to the external interface unit 966 and the media drive 968. Further, the image data processing unit 964 performs a decoding process on the encoded data supplied from the external interface unit 966 and the media drive 968.
- the image data processing unit 964 supplies the image data generated by the decoding process to the display unit 965. It also supplies the image data from the camera signal processing unit 963 to the display unit 965, and superimposes the display data acquired from the OSD unit 969 on that image data before supplying it to the display unit 965.
- the OSD unit 969 generates display data such as a menu screen and icons made up of symbols, characters, or figures and outputs them to the image data processing unit 964.
- the external interface unit 966 includes, for example, a USB input / output terminal, and is connected to a printer when printing an image.
- a drive is connected to the external interface unit 966 as necessary, a removable medium such as a magnetic disk or an optical disk is appropriately mounted, and a computer program read from them is installed as necessary.
- the external interface unit 966 has a network interface connected to a predetermined network such as a LAN or the Internet.
- the control unit 970 can read encoded data from the memory unit 967 in accordance with an instruction from the user interface unit 971 and supply it from the external interface unit 966 to another device connected via the network.
- the control unit 970 can also acquire, via the external interface unit 966, encoded data and image data supplied from another device over the network, and supply the acquired data to the image data processing unit 964.
- as the recording medium driven by the media drive 968, any readable/writable removable medium such as a magnetic disk, a magneto-optical disk, an optical disk, or a semiconductor memory is used.
- the recording medium may be any type of removable medium, such as a tape device, a disk, or a memory card. Of course, a non-contact IC card or the like may also be used.
- the media drive 968 and the recording medium may also be integrated and configured as a non-portable storage medium such as a built-in hard disk drive or an SSD (Solid State Drive).
- the control unit 970 is configured using a CPU, a memory, and the like.
- the memory stores programs executed by the CPU, various data necessary for the CPU to perform processing, and the like.
- the program stored in the memory is read and executed by the CPU at a predetermined timing such as when the imaging device 960 is activated.
- the CPU executes the program to control each unit so that the imaging device 960 operates according to user operations.
- the image data processing unit 964 is provided with the function of the image processing device (image processing method) of the present application. It can therefore decode encoded data that includes a reduced-resolution parallax image recorded in the memory unit 967 or on the recording medium, and generate an image of a predetermined viewpoint with high accuracy using that parallax image.
- 120 decoding device, 121 separation unit, 132 auxiliary decoder, 162 parallax image resolution enhancement unit, 163, 164 parallax image warping processing unit, 165, 166 smoothing processing unit, 167, 168 viewpoint image warping processing unit, 171 multi-viewpoint image generation unit, 220 decoding device, 231 auxiliary decoder, 251, 252 smoothing processing unit
Abstract
Description
FIG. 23 is a diagram explaining parallax and depth.
[Configuration example of the first embodiment of the encoding device]
FIG. 1 is a block diagram showing a configuration example of a first embodiment of an encoding device corresponding to an image processing device to which the present technology is applied.
FIGS. 2 and 3 are flowcharts explaining the encoding process performed by the encoding device 50 of FIG. 1. This encoding process starts, for example, when the viewpoint images A1 to C1 are output from the imaging units 51A to 51C.
FIG. 4 is a diagram showing a configuration example of a decoding device, serving as an image processing device to which the present technology is applied, that decodes the multiplexed stream transmitted from the encoding device 50 of FIG. 1.
FIG. 5 is a block diagram showing a detailed configuration example of the 3D image generation unit 142 of FIG. 4.
FIG. 7 is a diagram explaining the warping process for a parallax image.
s(x', y', 1)^T = A'R'^(-1)[(X, Y, Z)^T - (t'_x, t'_y, t'_z)^T]  ... (1)
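Equation (1) projects a world point (X, Y, Z) onto the image plane of the virtual viewpoint using the intrinsic matrix A', rotation R', and translation (t'_x, t'_y, t'_z), with s the projective scale. A minimal numeric sketch follows; the focal length, principal point, and camera shift below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def warp_to_viewpoint(world_point, A, R, t):
    """Project a 3D world point into a camera per equation (1):
        s * (x', y', 1)^T = A R^{-1} [ (X, Y, Z)^T - t ]
    Returns the pixel coordinates (x', y')."""
    p = A @ np.linalg.inv(R) @ (np.asarray(world_point, float) - np.asarray(t, float))
    return p[:2] / p[2]   # divide out the projective scale s

# Illustrative parameters: focal length 500 px, principal point (320, 240),
# identity rotation, and a camera shifted 0.1 along X (a second viewpoint).
A = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([0.1, 0.0, 0.0])
x, y = warp_to_viewpoint((0.0, 0.0, 2.0), A, R, t)
print(x, y)
```

Applying this mapping to every pixel of a depth (parallax) image, using the depth value to recover (X, Y, Z), is the warping process the figures describe.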
FIG. 10 is a diagram explaining the smoothing process performed by the smoothing processing unit 165 (166) of FIG. 5.
FIG. 11 is a flowchart explaining the decoding process performed by the decoding device 120 of FIG. 4. This decoding process starts, for example, when the multiplexed stream transmitted from the encoding device 50 of FIG. 1 is input to the decoding device 120.
[Configuration example of the second embodiment of the encoding device]
FIG. 13 is a block diagram showing a configuration example of a second embodiment of an encoding device corresponding to an image processing device to which the present technology is applied.
FIG. 14 is a diagram showing an example of boundary information.
FIGS. 15 and 16 are flowcharts explaining the encoding process performed by the encoding device 200 of FIG. 13. This encoding process starts, for example, when the viewpoint images A1 to C1 are output from the imaging units 51A to 51C.
FIG. 18 is a diagram showing a configuration example of a decoding device, serving as an image processing device to which the present technology is applied, that decodes the multiplexed stream transmitted from the encoding device 200 of FIG. 13.
FIG. 19 is a block diagram showing a detailed configuration example of the 3D image generation unit 241 of FIG. 18.
FIG. 20 is a diagram explaining the smoothing process based on boundary information performed by the smoothing processing unit 251 (252) of FIG. 19. In FIG. 20, the small circles represent pixels, and the patterns of the circles represent parallax values.
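The boundary-information-based smoothing illustrated in FIG. 20 can be pictured as averaging parallax values only at pixels flagged as adjacent to a boundary position. The three-pixel averaging window below is an illustrative assumption; the description only establishes that smoothing targets the pixels indicated by the boundary information.

```python
def smooth_at_boundaries(parallax, boundary):
    """Smooth a 1-D row of parallax values only at pixels flagged
    by the boundary information (True = adjacent to a boundary).

    A flagged pixel is replaced by the mean of itself and its
    immediate neighbors; all other pixels pass through unchanged."""
    out = list(parallax)
    for i, flagged in enumerate(boundary):
        if not flagged:
            continue
        lo, hi = max(0, i - 1), min(len(parallax), i + 2)
        window = parallax[lo:hi]
        out[i] = sum(window) / len(window)
    return out

# A sharp parallax step between indices 2 and 3; only the two
# boundary-adjacent pixels are smoothed, the flat regions are untouched.
row      = [10, 10, 10, 40, 40, 40]
boundary = [False, False, True, True, False, False]
print(smooth_at_boundaries(row, boundary))
```

Restricting the filter to boundary-flagged pixels preserves parallax detail inside flat regions while softening the artifacts that resolution reduction introduces at object edges.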
FIG. 21 is a flowchart explaining the decoding process performed by the decoding device 220 of FIG. 18. This decoding process starts, for example, when the multiplexed stream transmitted from the encoding device 200 of FIG. 13 is input to the decoding device 220.
[Description of a computer to which the present technology is applied]
The series of processes described above can be performed by hardware or by software. When the series of processes is performed by software, a program constituting the software is installed in a general-purpose computer or the like.
[Configuration example of a television device]
FIG. 25 illustrates a schematic configuration of a television device to which the present technology is applied. The television device 900 includes an antenna 901, a tuner 902, a demultiplexer 903, a decoder 904, a video signal processing unit 905, a display unit 906, an audio signal processing unit 907, a speaker 908, and an external interface unit 909. The television device 900 further includes a control unit 910, a user interface unit 911, and the like.
[Configuration example of a mobile phone]
FIG. 26 illustrates a schematic configuration of a mobile phone to which the present technology is applied. The mobile phone 920 includes a communication unit 922, an audio codec 923, a camera unit 926, an image processing unit 927, a demultiplexing unit 928, a recording/reproducing unit 929, a display unit 930, and a control unit 931. These are connected to each other via a bus 933.
[Configuration example of a recording/reproducing device]
FIG. 27 illustrates a schematic configuration of a recording/reproducing device to which the present technology is applied. The recording/reproducing device 940 records, for example, the audio data and video data of a received broadcast program on a recording medium, and provides the recorded data to the user at a timing according to a user instruction. The recording/reproducing device 940 can also acquire audio data and video data from another device, for example, and record them on a recording medium. Furthermore, by decoding and outputting the audio data and video data recorded on the recording medium, the recording/reproducing device 940 enables image display and audio output on a monitor device or the like.
[Configuration example of an imaging device]
FIG. 28 illustrates a schematic configuration of an imaging device to which the present technology is applied. The imaging device 960 images a subject, displays the image of the subject on a display unit, and records it as image data on a recording medium.
Claims (9)
- An image processing device comprising:
a receiving unit that receives a depth image whose resolution has been reduced;
a resolution enhancement unit that increases the resolution of the depth image received by the receiving unit;
a depth image warping processing unit that generates a depth image of a virtual viewpoint by performing a warping process, based on the position of the virtual viewpoint, on the depth image whose resolution has been increased by the resolution enhancement unit; and
a correction unit that, for the depth image of the virtual viewpoint generated by the depth image warping processing unit, corrects the pixel values of an occlusion region, which is a region that exists in the viewpoint image of the virtual viewpoint but does not exist in the viewpoint image corresponding to the depth image used to generate the depth image of the virtual viewpoint.
- The image processing device according to claim 1, wherein the correction unit detects pixels corresponding to pixel values other than the predetermined value in the occlusion region in the depth image of the virtual viewpoint, and corrects the pixel values of the detected pixels to the predetermined value.
- The image processing device according to claim 2, wherein the correction unit detects, among the pixels corresponding to pixel values other than the predetermined value in the depth image of the virtual viewpoint, pixels whose peripheral pixels have the predetermined value, as pixels corresponding to pixel values other than the predetermined value in the occlusion region.
- The image processing device according to claim 2, wherein
the receiving unit receives boundary information indicating pixels adjacent to a boundary position, which is a position where the pixel values of the depth image change, and
the correction unit detects, among the pixels corresponding to pixel values other than the predetermined value in the depth image of the virtual viewpoint, pixels whose peripheral pixels have the predetermined value and which are indicated by the boundary information, as pixels corresponding to pixel values other than the predetermined value in the occlusion region.
- The image processing device according to claim 1, further comprising a decoding unit that decodes an encoded depth image, which is the depth image that has been reduced in resolution and encoded, wherein
the receiving unit receives the encoded depth image,
the decoding unit decodes the encoded depth image received by the receiving unit, and
the resolution enhancement unit increases the resolution of the encoded depth image decoded by the decoding unit.
- The image processing device according to claim 1, further comprising a viewpoint image warping processing unit that generates a viewpoint image of the virtual viewpoint by performing a warping process on a viewpoint image based on the depth image of the virtual viewpoint corrected by the correction unit.
- The image processing device according to claim 6, further comprising an interpolation unit that interpolates the occlusion region of the viewpoint image generated by the viewpoint image warping processing unit.
- The image processing device according to claim 7, wherein
the receiving unit receives the viewpoint image of a first viewpoint, the viewpoint image of a second viewpoint, the reduced-resolution depth image of the first viewpoint, and the reduced-resolution depth image of the second viewpoint,
the depth image warping processing unit generates a first depth image of the virtual viewpoint by performing a warping process, based on the position of the virtual viewpoint, on the resolution-enhanced depth image of the first viewpoint, and generates a second depth image of the virtual viewpoint by performing a warping process, based on the position of the virtual viewpoint, on the resolution-enhanced depth image of the second viewpoint,
the viewpoint image warping processing unit generates a first viewpoint image of the virtual viewpoint by performing a warping process on the viewpoint image of the first viewpoint based on the first depth image of the virtual viewpoint corrected by the correction unit, and generates a second viewpoint image of the virtual viewpoint by performing a warping process on the viewpoint image of the second viewpoint based on the second depth image, and
the interpolation unit interpolates the occlusion region of one of the first viewpoint image and the second viewpoint image of the virtual viewpoint generated by the viewpoint image warping processing unit, using the other.
- An image processing method performed by an image processing device, comprising:
a receiving step of receiving a depth image whose resolution has been reduced;
a resolution enhancement step of increasing the resolution of the depth image received in the receiving step;
a depth image warping processing step of generating a depth image of a virtual viewpoint by performing a warping process, based on the position of the virtual viewpoint, on the depth image whose resolution has been increased in the resolution enhancement step; and
a correction step of, for the depth image of the virtual viewpoint generated in the depth image warping processing step, correcting the pixel values of an occlusion region, which is a region that exists in the viewpoint image of the virtual viewpoint but does not exist in the viewpoint image corresponding to the depth image used to generate the depth image of the virtual viewpoint.
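The method steps of claim 9 (receive a reduced-resolution depth image, enhance its resolution, warp it to the virtual viewpoint, correct the occlusion region) can be sketched end-to-end. Everything below is a schematic reading of the claims: nearest-neighbor upsampling, a 1-D horizontal warp proportional to the depth value, and 0 as the predetermined value marking the occlusion region are all illustrative assumptions, not the patent's specified implementations.

```python
def upsample(depth_row, factor=2):
    """Resolution enhancement step: nearest-neighbor upsampling (assumed)."""
    return [v for v in depth_row for _ in range(factor)]

def warp(depth_row, shift_scale=0.1):
    """Depth-image warping step: shift each pixel horizontally in
    proportion to its depth value; pixels that receive no value stay 0,
    the assumed predetermined value marking the occlusion region."""
    out = [0] * len(depth_row)
    for x, d in enumerate(depth_row):
        nx = x + int(d * shift_scale)
        if 0 <= nx < len(out):
            out[nx] = max(out[nx], d)  # larger (nearer) value wins a collision
    return out

def correct_occlusion(depth_row, predetermined=0):
    """Correction step (cf. claims 2-3): a non-predetermined pixel whose
    peripheral pixels are all the predetermined value is treated as a
    stray value inside the occlusion region and reset."""
    out = list(depth_row)
    for x in range(1, len(depth_row) - 1):
        if depth_row[x] != predetermined and \
           depth_row[x - 1] == predetermined and depth_row[x + 1] == predetermined:
            out[x] = predetermined
    return out

row = upsample([10, 20])                               # resolution enhancement
warped = warp(row)                                     # unfilled pixels form the occlusion region
fixed = correct_occlusion([0, 0, 25, 0, 0, 40, 40])    # stray 25 inside the hole is reset
print(row, warped, fixed)
```

The corrected virtual-viewpoint depth image then drives the viewpoint-image warping of claims 6-8, where the occlusion region of one warped viewpoint image is interpolated from the other.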
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012558012A JPWO2012111756A1 (ja) | 2011-02-18 | 2012-02-16 | 画像処理装置および画像処理方法 |
CN2012800082706A CN103348689A (zh) | 2011-02-18 | 2012-02-16 | 图像处理装置和图像处理方法 |
US13/982,815 US9361734B2 (en) | 2011-02-18 | 2012-02-16 | Image processing device and image processing method |
US15/146,173 US9716873B2 (en) | 2011-02-18 | 2016-05-04 | Image processing device and image processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-033848 | 2011-02-18 | ||
JP2011033848 | 2011-02-18 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/982,815 A-371-Of-International US9361734B2 (en) | 2011-02-18 | 2012-02-16 | Image processing device and image processing method |
US15/146,173 Continuation US9716873B2 (en) | 2011-02-18 | 2016-05-04 | Image processing device and image processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012111756A1 true WO2012111756A1 (ja) | 2012-08-23 |
Family
ID=46672668
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/053676 WO2012111756A1 (ja) | 2011-02-18 | 2012-02-16 | 画像処理装置および画像処理方法 |
Country Status (4)
Country | Link |
---|---|
US (2) | US9361734B2 (ja) |
JP (2) | JP2012186781A (ja) |
CN (1) | CN103348689A (ja) |
WO (1) | WO2012111756A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014050830A1 (ja) * | 2012-09-25 | 2014-04-03 | 日本電信電話株式会社 | 画像符号化方法、画像復号方法、画像符号化装置、画像復号装置、画像符号化プログラム、画像復号プログラム及び記録媒体 |
JP2015005978A (ja) * | 2013-06-18 | 2015-01-08 | シズベル テクノロジー エス.アール.エル. | 3次元ビデオストリームに属する画像のカラーコンポーネントを用いることにより、深度マップを生成、格納、送信、受信および再生する方法およびデバイス |
CN104992416A (zh) * | 2015-06-30 | 2015-10-21 | 小米科技有限责任公司 | 图像增强方法和装置、智能设备 |
JP2019159886A (ja) * | 2018-03-14 | 2019-09-19 | 凸版印刷株式会社 | 画像生成方法および画像生成装置 |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011115142A1 (ja) * | 2010-03-19 | 2011-09-22 | Okiセミコンダクタ株式会社 | 画像処理装置、方法、プログラム及び記録媒体 |
BR112013004454A2 (pt) * | 2010-09-03 | 2016-06-07 | Sony Corp | dispositivos e métodos de codificação e de decodificação |
US10122986B2 (en) * | 2011-02-18 | 2018-11-06 | Sony Corporation | Image processing device and image processing method |
JP2012186781A (ja) | 2011-02-18 | 2012-09-27 | Sony Corp | 画像処理装置および画像処理方法 |
JPWO2012111757A1 (ja) | 2011-02-18 | 2014-07-07 | ソニー株式会社 | 画像処理装置および画像処理方法 |
JP6021541B2 (ja) * | 2012-09-13 | 2016-11-09 | キヤノン株式会社 | 画像処理装置及び方法 |
US9578300B2 (en) | 2012-10-25 | 2017-02-21 | Lg Electronics Inc. | Method and apparatus for processing edge violation phenomenon in multi-view 3DTV service |
US9348495B2 (en) | 2014-03-07 | 2016-05-24 | Sony Corporation | Control of large screen display using wireless portable computer and facilitating selection of audio on a headphone |
CN104978554B (zh) * | 2014-04-08 | 2019-02-05 | 联想(北京)有限公司 | 信息的处理方法及电子设备 |
JP6476833B2 (ja) * | 2014-12-19 | 2019-03-06 | 富士通株式会社 | 管理システム |
JP6494402B2 (ja) * | 2015-04-30 | 2019-04-03 | キヤノン株式会社 | 画像処理装置、撮像装置、画像処理方法、プログラム |
US10748264B2 (en) | 2015-09-09 | 2020-08-18 | Sony Corporation | Image processing apparatus and image processing method |
EP3367326A4 (en) * | 2015-12-01 | 2018-10-03 | Sony Corporation | Image-processing device and image-processing method |
US20180047177A1 (en) * | 2016-08-15 | 2018-02-15 | Raptor Maps, Inc. | Systems, devices, and methods for monitoring and assessing characteristics of harvested specialty crops |
EP3422708A1 (en) * | 2017-06-29 | 2019-01-02 | Koninklijke Philips N.V. | Apparatus and method for generating an image |
JP2020178307A (ja) * | 2019-04-22 | 2020-10-29 | 株式会社ジャパンディスプレイ | 表示装置 |
BR102020027013A2 (pt) * | 2020-12-30 | 2022-07-12 | Samsung Eletrônica da Amazônia Ltda. | Método para gerar uma imagem multiplano adaptativa a partir de uma única imagem de alta resolução |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07282259A (ja) * | 1994-04-13 | 1995-10-27 | Matsushita Electric Ind Co Ltd | 視差演算装置及び画像合成装置 |
JP2001184497A (ja) * | 1999-10-14 | 2001-07-06 | Komatsu Ltd | ステレオ画像処理装置及び記録媒体 |
JP2004525437A (ja) * | 2000-12-01 | 2004-08-19 | サーノフ コーポレイション | 一群の実際のビデオおよび/または静止画像から新規ビデオおよび/または静止画像を合成する方法および装置 |
JP2008182669A (ja) * | 2006-10-13 | 2008-08-07 | Victor Co Of Japan Ltd | 多視点画像符号化装置、多視点画像符号化方法、多視点画像符号化プログラム、多視点画像復号装置、多視点画像復号方法、及び多視点画像復号プログラム |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5768404A (en) | 1994-04-13 | 1998-06-16 | Matsushita Electric Industrial Co., Ltd. | Motion and disparity estimation method, image synthesis method, and apparatus for implementing same methods |
US7257272B2 (en) * | 2004-04-16 | 2007-08-14 | Microsoft Corporation | Virtual image generation |
CA2553473A1 (en) * | 2005-07-26 | 2007-01-26 | Wa James Tam | Generating a depth map from a tw0-dimensional source image for stereoscopic and multiview imaging |
JP2009238117A (ja) * | 2008-03-28 | 2009-10-15 | Toshiba Corp | 多視差画像生成装置および方法 |
KR101506926B1 (ko) * | 2008-12-04 | 2015-03-30 | 삼성전자주식회사 | 깊이 추정 장치 및 방법, 및 3d 영상 변환 장치 및 방법 |
US8780172B2 (en) * | 2009-01-27 | 2014-07-15 | Telefonaktiebolaget L M Ericsson (Publ) | Depth and video co-processing |
US8744121B2 (en) * | 2009-05-29 | 2014-06-03 | Microsoft Corporation | Device for identifying and tracking multiple humans over time |
JP4966431B2 (ja) * | 2009-09-18 | 2012-07-04 | 株式会社東芝 | 画像処理装置 |
US9445072B2 (en) * | 2009-11-11 | 2016-09-13 | Disney Enterprises, Inc. | Synthesizing views based on image domain warping |
WO2011123174A1 (en) * | 2010-04-01 | 2011-10-06 | Thomson Licensing | Disparity value indications |
JP5556394B2 (ja) * | 2010-06-07 | 2014-07-23 | ソニー株式会社 | 立体画像表示システム、視差変換装置、視差変換方法およびプログラム |
US8488870B2 (en) * | 2010-06-25 | 2013-07-16 | Qualcomm Incorporated | Multi-resolution, multi-window disparity estimation in 3D video processing |
US8774267B2 (en) * | 2010-07-07 | 2014-07-08 | Spinella Ip Holdings, Inc. | System and method for transmission, processing, and rendering of stereoscopic and multi-view images |
US8428342B2 (en) * | 2010-08-12 | 2013-04-23 | At&T Intellectual Property I, L.P. | Apparatus and method for providing three dimensional media content |
EP2451164B1 (en) * | 2010-11-08 | 2017-05-03 | Telefonaktiebolaget LM Ericsson (publ) | Improved view synthesis |
JPWO2012111757A1 (ja) | 2011-02-18 | 2014-07-07 | ソニー株式会社 | 画像処理装置および画像処理方法 |
JP2012186781A (ja) | 2011-02-18 | 2012-09-27 | Sony Corp | 画像処理装置および画像処理方法 |
US8711141B2 (en) * | 2011-08-28 | 2014-04-29 | Arcsoft Hangzhou Co., Ltd. | 3D image generating method, 3D animation generating method, and both 3D image generating module and 3D animation generating module thereof |
AU2011224051B2 (en) * | 2011-09-14 | 2014-05-01 | Canon Kabushiki Kaisha | Determining a depth map from images of a scene |
WO2013068457A1 (en) * | 2011-11-11 | 2013-05-16 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Concept for determining a measure for a distortion change in a synthesized view due to depth map modifications |
-
2011
- 2011-08-31 JP JP2011188992A patent/JP2012186781A/ja not_active Abandoned
-
2012
- 2012-02-16 JP JP2012558012A patent/JPWO2012111756A1/ja active Pending
- 2012-02-16 US US13/982,815 patent/US9361734B2/en active Active
- 2012-02-16 WO PCT/JP2012/053676 patent/WO2012111756A1/ja active Application Filing
- 2012-02-16 CN CN2012800082706A patent/CN103348689A/zh active Pending
-
2016
- 2016-05-04 US US15/146,173 patent/US9716873B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07282259A (ja) * | 1994-04-13 | 1995-10-27 | Matsushita Electric Ind Co Ltd | 視差演算装置及び画像合成装置 |
JP2001184497A (ja) * | 1999-10-14 | 2001-07-06 | Komatsu Ltd | ステレオ画像処理装置及び記録媒体 |
JP2004525437A (ja) * | 2000-12-01 | 2004-08-19 | サーノフ コーポレイション | 一群の実際のビデオおよび/または静止画像から新規ビデオおよび/または静止画像を合成する方法および装置 |
JP2008182669A (ja) * | 2006-10-13 | 2008-08-07 | Victor Co Of Japan Ltd | 多視点画像符号化装置、多視点画像符号化方法、多視点画像符号化プログラム、多視点画像復号装置、多視点画像復号方法、及び多視点画像復号プログラム |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014050830A1 (ja) * | 2012-09-25 | 2014-04-03 | 日本電信電話株式会社 | 画像符号化方法、画像復号方法、画像符号化装置、画像復号装置、画像符号化プログラム、画像復号プログラム及び記録媒体 |
JP5934375B2 (ja) * | 2012-09-25 | 2016-06-15 | 日本電信電話株式会社 | 画像符号化方法、画像復号方法、画像符号化装置、画像復号装置、画像符号化プログラム、画像復号プログラム及び記録媒体 |
JP2015005978A (ja) * | 2013-06-18 | 2015-01-08 | シズベル テクノロジー エス.アール.エル. | 3次元ビデオストリームに属する画像のカラーコンポーネントを用いることにより、深度マップを生成、格納、送信、受信および再生する方法およびデバイス |
CN104992416A (zh) * | 2015-06-30 | 2015-10-21 | 小米科技有限责任公司 | 图像增强方法和装置、智能设备 |
CN104992416B (zh) * | 2015-06-30 | 2018-04-27 | 小米科技有限责任公司 | 图像增强方法和装置、智能设备 |
JP2019159886A (ja) * | 2018-03-14 | 2019-09-19 | 凸版印刷株式会社 | 画像生成方法および画像生成装置 |
Also Published As
Publication number | Publication date |
---|---|
JP2012186781A (ja) | 2012-09-27 |
US20160249035A1 (en) | 2016-08-25 |
CN103348689A (zh) | 2013-10-09 |
US20130315472A1 (en) | 2013-11-28 |
JPWO2012111756A1 (ja) | 2014-07-07 |
US9361734B2 (en) | 2016-06-07 |
US9716873B2 (en) | 2017-07-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012111756A1 (ja) | 画像処理装置および画像処理方法 | |
WO2012111757A1 (ja) | 画像処理装置および画像処理方法 | |
JP6365635B2 (ja) | 画像処理装置および画像処理方法 | |
US9235749B2 (en) | Image processing device and image processing method | |
WO2012111755A1 (ja) | 画像処理装置および画像処理方法 | |
WO2012147621A1 (ja) | 符号化装置および符号化方法、並びに、復号装置および復号方法 | |
JP6206559B2 (ja) | 復号装置、復号方法、プログラム、および記録媒体 | |
US9338430B2 (en) | Encoding device, encoding method, decoding device, and decoding method | |
WO2012128069A1 (ja) | 画像処理装置および画像処理方法 | |
WO2013031575A1 (ja) | 画像処理装置および画像処理方法 | |
WO2013115024A1 (ja) | 画像処理装置および画像処理方法 | |
US9762884B2 (en) | Encoding device, encoding method, decoding device, and decoding method for encoding multiple viewpoints for compatibility with existing mode allowing fewer viewpoints |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12747249 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2012558012 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13982815 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12747249 Country of ref document: EP Kind code of ref document: A1 |