CN104767986A - Depth of Field (DOF) image correction method and system - Google Patents


Info

Publication number
CN104767986A
CN104767986A (application CN201410008359.3A / CN201410008359A)
Authority
CN
China
Prior art keywords
pixel
depth
depth map
image
displacement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410008359.3A
Other languages
Chinese (zh)
Inventor
涂日升
高荣扬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial Technology Research Institute ITRI filed Critical Industrial Technology Research Institute ITRI
Priority to CN201410008359.3A priority Critical patent/CN104767986A/en
Publication of CN104767986A publication Critical patent/CN104767986A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

Provided are a depth map correction method and system. A left-view image and a right-view image of a real-view image are view-warped according to their corresponding depth maps to obtain a left virtual-view image, a right virtual-view image, left-side hole information, and right-side hole information. A residual difference is obtained point-to-point for pixels that are not holes. If the obtained residual difference is greater than a first threshold, reverse warping is performed to obtain the coordinate of at least one pixel in the real-view image, and the depth value of the pixel is corrected according to the depth value(s) of one or more pixels within a range near that coordinate.

Description

Depth map correction method and system
Technical field
The present application relates generally to a depth map correction method and system.
Background technology
The current stereoscopic video standard (3D Video Coding, abbreviated 3DVC) formulated by the Moving Picture Experts Group (MPEG) aims to present multi-view stereoscopic visual effects while meeting the bandwidth restrictions of today's digital transmission environments. Compared with the earlier multi-view video coding (MVC) method, 3DVC does not need to record a large amount of view information; instead, it reconstructs those views by view synthesis, which saves a large amount of data.
The overall 3DVC framework is intended to use, among multiple texture images, only the texture images of a few frames (real-view images) together with their corresponding scene depth maps, and to synthesize multiple virtual-view images by view synthesis. For example, a Depth Image Based Rendering (DIBR) algorithm can take three real-view images plus their three corresponding depth maps and produce nine images at different views (including the real-view images and virtual-view images); no matter from which angle and position viewers watch, as long as the left and right eyes receive the corresponding view images, a 3D stereoscopic image can be perceived.
A texture image is the real image of a scene captured by a camera, and the depth map can be regarded as an 8-bit grayscale image corresponding to the real image. The pixel values of the depth map (between 0 and 255) represent the distances of objects in the scene from the camera; that is, the depth map represents the correspondence between objects in the spatial coordinate system and is independent of the actual texture information of the objects themselves.
For example, define in a texture image that a pixel with a larger corresponding depth value (brighter color) belongs to a foreground object, and a pixel with a smaller depth value (darker color) belongs to the background. To explain in simplified form, the process of view synthesis can be regarded as warping each pixel of the real-view image by some distance onto the virtual-view image. How far each texture-image pixel is shifted is determined by the pixel value of the depth map at the coordinate corresponding to that pixel, referred to as the depth value. According to view-synthesis theory, the larger the depth value corresponding to a texture-image pixel, the larger that pixel's displacement.
During view synthesis, pixels with larger depth values are warped a greater distance, and pixels with smaller depth values are shifted a smaller distance. Because the shift distances differ, some pixel positions on the virtual-view image may end up with no value; these empty pixels are called "holes". Generally, for example, the hole information can be marked at the corresponding pixel coordinates in a so-called hole mask, for reference by downstream programs, which fill the holes with a hole-filling algorithm.
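The warping and hole-marking behaviour just described can be illustrated with a small sketch. This is a hedged, simplified 1-D example, not the patent's implementation: the function name `warp_view`, the linear `scale` mapping from depth value to displacement, and the toy rows are assumptions made for illustration.

```python
def warp_view(texture_row, depth_row, scale=0.02):
    """Warp one image row onto a virtual view; larger depth values shift further.

    Illustrative sketch: displacement is assumed linear in the depth value,
    and pixels warped to the same target simply overwrite (no z-ordering).
    """
    width = len(texture_row)
    virtual = [None] * width                  # None marks an unassigned pixel
    for x in range(width):
        shift = int(depth_row[x] * scale)     # displacement from depth value
        tx = x + shift
        if 0 <= tx < width:
            virtual[tx] = texture_row[x]
    hole_mask = [v is None for v in virtual]  # True where a hole remains
    return virtual, hole_mask

# Foreground pixels (depth 100) shift by 2; positions they vacate become holes.
row, mask = warp_view([10, 20, 30, 40, 50, 60], [0, 0, 0, 100, 100, 100])
```

Here the foreground pixels move right while the background stays put, leaving holes at the positions the foreground vacated, which is the situation the hole mask records.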
Generally, if the foreground and background of the depth map match those of the texture image, the synthesized image will not produce noise; if they do not match, the synthesized image will exhibit small fragmented noise.
Summary of the invention
According to an embodiment of the present disclosure, a depth map correction method is provided. The method comprises the following steps. A left-view image and a right-view image of a real-view image are view-warped according to their respective corresponding depth maps, to obtain a left virtual-view image, a right virtual-view image, left-side hole information, and right-side hole information. The pixel values at the same coordinate of the left and right virtual-view images are subtracted point-to-point for at least one pixel that is not a hole, to obtain a residual difference of the pixel. If the obtained residual difference is greater than a first threshold, reverse warping is performed to obtain the coordinate of the pixel in the corresponding real-view image, and the depth value of the pixel is corrected with the depth value(s) of one or more pixels in a range near the obtained coordinate, to perform a depth correction.
According to an embodiment of the present disclosure, a depth map correction system is provided. The depth map correction system comprises a view warping unit, a residual difference computation unit, a first determination unit, a reverse warping unit, and a depth correction unit. The view warping unit warps a left-view image and a right-view image of a real-view image according to their respective corresponding depth maps, to obtain a left virtual-view image, a right virtual-view image, left-side hole information, and right-side hole information. The residual difference computation unit subtracts, point-to-point, the pixel values at the same coordinate of the left and right virtual-view images for at least one pixel that is not a hole, to obtain a residual difference of the at least one pixel. The first determination unit determines whether the obtained residual difference is greater than a first threshold and, if so, triggers a reverse warping. The reverse warping unit performs the reverse warping to obtain the coordinate of the at least one pixel in the corresponding real-view image. The depth correction unit performs a depth correction, correcting the depth value of the at least one pixel with the depth value(s) of one or more pixels in a range near the obtained coordinate. For a better understanding of the above and other aspects of the invention, embodiments are described in detail below with reference to the accompanying drawings:
Accompanying drawing explanation
The accompanying drawings illustrate exemplary embodiments of the application and, together with the description, serve to explain the principles of the application.
Fig. 1A, Fig. 1B, and Fig. 1C illustrate examples of noise produced at the edge where a foreground object and the background meet.
Fig. 2 illustrates a flowchart of an embodiment of a depth map correction method.
Fig. 3 illustrates a detailed flowchart of the depth map correction embodiment of Fig. 2.
Fig. 4 illustrates a schematic diagram of an embodiment of checking left-side holes with the method of Fig. 3.
Fig. 5 illustrates a schematic diagram of an embodiment of checking right-side holes with the method of Fig. 3.
Fig. 6 illustrates a schematic diagram of an embodiment of a depth map correction system.
For a better understanding of the above and other features and advantages of the application, embodiments are described below with reference to the accompanying drawings. It should be understood that both the foregoing general description and the following detailed description are exemplary and are intended to provide a detailed explanation of the application as claimed.
[symbol description]
301: left virtual-view image
302: right virtual-view image
303: residual difference
304: left-side hole
305: left-view depth map
306: corrected left-view depth map
401: left virtual-view image
402: right virtual-view image
403: residual difference
404: right-side hole
405: right-view depth map
406: corrected right-view depth map
500: depth map correction system
510: view warping unit
520: residual difference computation unit
530: first determination unit
540: second determination unit
550: reverse warping unit
560: third determination unit
570: depth correction unit
BI: background
FI: foreground object
NS: noise
S210, S220, S230, S240, S250, S260, S270: process steps
Embodiment
In an image compression method such as the stereoscopic video standard (3D Video Coding, abbreviated 3DVC), the presented image quality depends to a certain degree on the correctness of the depth map. If the depth information of an object in the depth map is not accurate, then, for example, a pixel that should originally belong to the foreground may be given a background depth value because of the inaccurate depth map. In that event, the pixel will not be displaced to the position it should have gone to and instead becomes part of the background. Please refer to Fig. 1A, Fig. 1B, and Fig. 1C, which illustrate examples of noise NS produced at the edge where foreground object FI and background BI meet. Reflected in the synthesized virtual-view picture, this may cause the synthesized virtual image to produce noise NS at the edge where foreground object FI and background BI meet, as illustrated in Fig. 1A, Fig. 1B, or Fig. 1C.
Therefore, the application proposes a depth map correction method and system compatible with the current video compression standards H.264/AVC and H.265/HEVC.
Please refer to Fig. 2 and Fig. 3. Fig. 2 illustrates a flowchart of an embodiment of a depth map correction method, and Fig. 3 illustrates its detailed flowchart. Virtual-view image noise is eliminated by correcting the depth map. In step S210, a left-view image (real-view image) and a right-view image (real-view image) of a real-view image (S211) are each view-warped (S212) according to their corresponding left-view and right-view depth maps (S211), to obtain a left virtual-view image, a right virtual-view image, left-side hole information, and right-side hole information (S213). In step S220, the pixel values at the same coordinate of the left and right virtual-view images are subtracted point-to-point for at least one pixel that is not a hole (S221), to obtain the residual difference DiffLRwarp of the at least one pixel (S222). In step S230, it is determined whether the residual difference of a pixel is greater than the first threshold. If so, then in step S250 a reverse warping operation is performed on the pixel to obtain the coordinate of the pixel in the real-view image. That is, when the region near this pixel produces noise, it is back-calculated from which coordinate in the real-view image the pixel was originally warped. In step S270, the depth value of the pixel is corrected with the depth value(s) of one or more pixels in a range near that coordinate, and corrected depth maps are obtained for the left-view image and the right-view image respectively.
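Steps S220 and S230 can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: the function name `flag_noise_pixels`, the list-based 1-D rows, and the toy values are assumptions. Pixels where either virtual view has a hole are skipped, and the remaining residual differences are compared against the first threshold.

```python
def flag_noise_pixels(left_virtual, right_virtual, left_holes, right_holes, th1):
    """Return indices of non-hole pixels whose residual difference exceeds th1."""
    flagged = []
    for i, (l, r) in enumerate(zip(left_virtual, right_virtual)):
        if left_holes[i] or right_holes[i]:
            continue                 # holes carry no valid pixel value: skip
        if abs(l - r) > th1:         # residual difference DiffLRwarp
            flagged.append(i)        # candidate noise pixel (step S230)
    return flagged

# Pixel 1 differs by 10 (> threshold 5); pixel 2 is a hole and is skipped.
candidates = flag_noise_pixels([100, 100, 50, 70], [100, 90, 50, 70],
                               [False] * 4, [False, False, True, False], th1=5)
```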
In actual operation, for example in one embodiment, all pixels whose residual difference is greater than the first threshold, i.e. pixels that may carry noise, can be collected first; the real-view image coordinates before view warping are then back-calculated, and the depth values at those coordinates corrected. In another embodiment, this can instead be done pointwise: each time a pixel is determined to be a noise-carrying pixel, the real-view image coordinate before view warping is back-calculated and that pixel's depth value corrected. All of these selectable ways of carrying out the implementation are practicable in this application. The term "nearby range" in this application refers generally to one or more pixels included within a set distance, in any set direction around the pixel, within a set range. The direction can be set to, for example, the horizontal direction, a diagonal direction, the vertical direction, and so on, but is not limited thereto; the distance can be set to, for example, a range of XB pixels in length, XB being a positive integer greater than or equal to 1. A corrected value is then computed with a statistical algorithm to replace the former depth value of the pixel; it can be, for example, the arithmetic mean, the median, or a similar statistical value, but is not limited thereto.
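The statistical correction over a "nearby range" can be sketched as below. The function name `corrected_depth`, the symmetric horizontal window, and the choice of the median are illustrative assumptions; the text above also allows the arithmetic mean or similar statistics, and other directions and distances.

```python
import statistics

def corrected_depth(depth_row, x, xb=2):
    """Median depth of up to xb pixels on either side of x (x itself excluded).

    Illustrative sketch of the 'nearby range' correction: a horizontal window
    of XB pixels and a median statistic are assumed here.
    """
    lo, hi = max(0, x - xb), min(len(depth_row), x + xb + 1)
    neighbours = depth_row[lo:x] + depth_row[x + 1:hi]
    return statistics.median(neighbours)

# A stray background value (30) inside a foreground region is pulled to 200.
fixed = corrected_depth([200, 200, 30, 200, 200], 2, xb=2)
```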
When synthesizing the left and right virtual-view images, other left and right virtual-view information can be synthesized as well. For example, in one embodiment, the depth map of the left-view image and/or right-view image can be used to synthesize the depth map of the left virtual view and/or right virtual view; the reverse warping of pixels producing noise on the left and/or right side can then be performed with the depth map of the left and/or right virtual-view image as its basis. In another embodiment, a left look-up table and/or a right look-up table is produced instead; the look-up table records, for each pixel on the left or right virtual-view image, by how great a distance it was projected by warping from the real-view picture. When reverse warping is performed with reference to the look-up table, it can be deduced from which pixel of the real-view image a given pixel on the virtual view was warped.
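The look-up-table variant can be sketched as follows: a hedged 1-D illustration in which `build_lut` records, for each virtual-view position, the source coordinate it was warped from, so that reverse warping becomes a simple table lookup. The names, the `scale` factor, and the toy depth row are assumptions, not from the patent.

```python
def build_lut(depth_row, scale=0.02):
    """Forward-warp a row and record, per virtual position, its source x."""
    width = len(depth_row)
    lut = [None] * width
    for x in range(width):
        tx = x + int(depth_row[x] * scale)  # displacement from depth value
        if 0 <= tx < width:
            lut[tx] = x                     # virtual pixel tx came from source x
    return lut

def reverse_warp(lut, tx):
    """Reverse warping via the look-up table: source coordinate of tx, if any."""
    return lut[tx]

# Foreground pixel 3 (depth 100) lands at virtual position 5; lut[5] points back.
lut = build_lut([0, 0, 0, 100, 100, 100])
```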
Please refer to the detailed flow of Fig. 3. In one embodiment, after at least one pixel that may produce noise is found in step S230, step S240 can optionally determine whether the number of holes near the left side or right side of the pixel is greater than a second threshold; that is, whether the width (i.e., the number) of the left-side holes or right-side holes of the pixel is not less than the second threshold. The left-side hole information and right-side hole information described in Fig. 2 include, respectively, the left-side hole count and the right-side hole count of the at least one pixel. When the determination holds, a reverse warping operation is performed on the pixel in step S250, to calculate from which coordinate of the real-view image the pixel was originally warped; the depth correction of the pixel is then performed in step S270 with the depth value(s) of one or more pixels in a range near that coordinate.
Please refer to Fig. 4, which illustrates an embodiment of checking left-side holes with the method of Fig. 3. The pixel values of the left virtual-view image 301 and the right virtual-view image 302 are subtracted point-to-point for at least one pixel that is not a hole, to obtain the residual difference (DiffLRwarp) 303 of the pixel, and at least one pixel that may produce noise is found in step S230. When checking the left-side holes 304, if step S240 determines that the width of the left-side holes 304 of a pixel is greater than the second threshold, a reverse warping operation is performed on the pixel in step S250 according to the left-view image information, calculating from which coordinate of the left-view image the pixel was originally warped. Then, in step S270, the depth value(s) of one or more pixels in a range near that coordinate are found from the left-view depth map 305, the depth value of the pixel is corrected accordingly, and a corrected left-view depth map 306 is obtained.
Please refer to Fig. 5, which illustrates an embodiment of checking right-side holes with the method of Fig. 3. The pixel values of the left virtual-view image 401 and the right virtual-view image 402 are subtracted point-to-point for at least one pixel that is not a hole, to obtain the residual difference (DiffLRwarp) 403 of the pixel, and at least one pixel that may produce noise is found in step S230. When checking the right-side holes 404, if the width of the right-side holes 404 of a pixel is determined to be greater than the second threshold (S240), a reverse warping operation is performed on the pixel in step S250 according to the right-view image information, calculating from which coordinate of the right-view image the pixel was originally warped. Then, in step S270, the depth value(s) of one or more pixels in a range near that coordinate are found from the right-view depth map 405, the depth value of the pixel is corrected accordingly, and a corrected right-view depth map 406 is obtained.
As shown in Fig. 3, in another embodiment, step S260 can optionally be performed after the reverse warping step S250. In step S260, from the current pixel position in the left-view image and/or right-view image, one or more pixels, for example XB pixels (XB being a positive integer greater than or equal to 1), are examined in a range to the left and/or to the right, and it is determined whether the depth value of the current pixel is smaller than the depth values of all of these XB pixels. If so, the depth correction of step S270 is performed: the current pixel is considered to be noise, and its depth value is corrected with the depth value of the foreground part accordingly.
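The optional check of step S260 can be sketched like this. `is_noise_candidate`, the `xb` window, and the toy depth rows are illustrative assumptions; the idea shown is only that a pixel whose depth is smaller than all XB neighbouring depths is treated as noise.

```python
def is_noise_candidate(depth_row, x, xb=2, look_left=True):
    """True if the current depth is smaller than all xb neighbour depths.

    Illustrative sketch of step S260: look xb pixels to the left (or right)
    of the current position and compare depth values.
    """
    if look_left:
        neighbours = depth_row[max(0, x - xb):x]
    else:
        neighbours = depth_row[x + 1:x + 1 + xb]
    return bool(neighbours) and all(depth_row[x] < d for d in neighbours)

# Depth 30 surrounded by 200s on the left: treated as a noise candidate.
noisy = is_noise_candidate([200, 200, 30, 40], 2, xb=2, look_left=True)
```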
After the depth correction of every pixel that may produce noise has been performed, corrected depth maps are obtained for the left-eye view and the right-eye view individually; they can serve as the source for subsequent re-compression and encoding or view synthesis.
A depth map correction system 500 according to the present disclosure can eliminate virtual-view image noise by correcting the depth map; please refer to the embodiment of Fig. 6, which illustrates a schematic diagram of an embodiment of the depth map correction system. The view warping unit 510 warps the left-view image and the right-view image according to their respective corresponding left and right depth maps, and outputs the left virtual-view image, the right virtual-view image, the left-side hole information, and the right-side hole information. The left-side hole information includes the left-side hole count of at least one pixel, and the right-side hole information includes the right-side hole count of at least one pixel. The residual difference computation unit 520 subtracts, point-to-point, the pixel values at the same coordinate of the left and right virtual-view images for at least one pixel that is not a hole, to obtain the residual difference (DiffLRwarp) of the pixel. When the first determination unit 530 determines that the residual difference (DiffLRwarp) of a pixel exceeds the first threshold, the pixel is handed to the reverse warping unit 550 for a reverse warping operation, obtaining the coordinate of the pixel in the real-view image. The depth correction unit 570 then corrects the depth value of the pixel with the depth value(s) of one or more pixels in a range near that coordinate, obtaining corrected depth maps for the left-view image and the right-view image respectively, to perform a depth correction. The view warping unit 510, residual difference computation unit 520, first determination unit 530, reverse warping unit 550, and depth correction unit 570 can each be, for example, a chip, a firmware circuit, a circuit board, or a recording medium storing array program code, but are not limited thereto.
In one embodiment, when at least one pixel that may produce noise is obtained, the second determination unit 540 can optionally determine whether the width (i.e., the number) of the left-side holes or right-side holes of the pixel is greater than the second threshold. The second determination unit 540 is, for example, a chip, a firmware circuit, a circuit board, or a recording medium storing array program code, but is not limited thereto. When the determination holds, the pixel is handed to the reverse warping unit 550 for reverse warping, obtaining the coordinate of the pixel in the real-view image; the depth correction unit 570 then performs the depth correction with the depth value(s) of one or more pixels in a range near that coordinate.
In one embodiment, after the reverse warping unit 550 performs the reverse warping and obtains the coordinate of the pixel in the real-view image, the third determination unit 560 can optionally, from the current pixel position in the left-view image and/or right-view image given by the reverse warping, examine one or more pixels, for example XB pixels (XB being a positive integer greater than or equal to 1), in a nearby range to the left and/or to the right, and determine whether the depth value of the current pixel is smaller than the depth values of all of these XB pixels. When the determination holds, the depth correction unit 570 performs the depth correction with the depth value(s) in the range near that coordinate. The third determination unit 560 can be, for example, a chip, a firmware circuit, a circuit board, or a recording medium storing array program code, but is not limited thereto.
In addition, the depth map correction system 500 of the present disclosure can be electrically connected (or coupled) to a processor and at least one memory (not shown). The units of the depth map correction system 500 can cooperate with the processor by way of software or firmware to send signals, messages, or data, and can also cooperate with the processor to execute processing programs.
After the depth map correction system 500 has performed the depth correction of every pixel that may produce noise, corrected depth maps are obtained for the left-eye view and the right-eye view individually; they can serve as the source for subsequent re-compression and encoding or view synthesis.
How is the first threshold determined? In some embodiments, the first threshold can be an integer greater than 0. When the first threshold is larger, fewer pixels are treated as possibly erroneous points, and pixels that really are noise are more likely to be missed and left uncorrected. If the first threshold is set smaller, the method becomes more sensitive to noise, and correct pixels that originally had no problem are more likely to be turned into erroneous results. The first threshold may also be called the noise sensitivity threshold.
Generally, the simplest method is to set the first threshold to a set value, for example a constant such as 3, 4, 5, 6, 7, and so on. Alternatively, the first threshold can be computed mathematically: for example, the residual differences (DiffLRwarp) of all the pixels are cut into MxN blocks (M and N being positive integers), and the arithmetic mean, the median, or a similar statistical value of each block is obtained and used as the first threshold when judging the pixels of that block. Other methods can use means similar to machine learning to gradually and automatically adjust the first threshold to a suitable value.
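The block-wise way of computing the first threshold can be sketched as follows: a hedged illustration that takes the median of each block of the residual-difference map. The function name and the dictionary-of-blocks representation are assumptions made for the sketch.

```python
import statistics

def blockwise_thresholds(diff_map, block=2):
    """Median of each block x block tile of the residual-difference map.

    Returns a dict keyed by (block_row, block_col); each value is the first
    threshold to use when judging pixels inside that block.
    """
    h, w = len(diff_map), len(diff_map[0])
    thresholds = {}
    for by in range(0, h, block):
        for bx in range(0, w, block):
            vals = [diff_map[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            thresholds[(by // block, bx // block)] = statistics.median(vals)
    return thresholds

# One 2x2 block of residual differences; its median becomes its threshold.
th = blockwise_thresholds([[1, 3], [5, 7]], block=2)
```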
When the virtual view to be synthesized is far away (for example, when the 2nd virtual-view image is synthesized from the information of the 5th real image), some holes will occur at the foreground-background boundary, and the size of these holes will not be too large; therefore a suitable threshold value, namely the second threshold, is chosen to screen the holes and avoid situations of misjudgment. The second threshold may also be called the object-edge broken-hole threshold. In some embodiments, the second threshold can be set to a set value, such as a constant. In other embodiments, the second threshold can be computed by equation (1), making the second threshold a positive integer greater than or equal to 1.
Th2 = (1/n) x f x b x (1/Z_near - 1/Z_far)    (1)
Here, Th2 is the second threshold and is related to the following variables. f is the focal length of the camera. b is the baseline distance, generally the distance between two views; here it refers to the distance between the virtual view and the real view. n is a self-chosen positive integer. When the view warping computation is performed, the two-dimensional image is in fact transformed into three-dimensional space; the extent of this three-dimensional space is bounded by two planes determining the nearest and farthest distances, namely the near clipping distance Z_near and the far clipping distance Z_far. Generally, all of the above variables take absolute values.
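Equation (1) can be evaluated as in the sketch below. The function name is an assumption; taking absolute values and rounding the result up to a positive integer of at least 1 follow the text above, which requires the second threshold to be a positive integer greater than or equal to 1.

```python
import math

def second_threshold(f, b, z_near, z_far, n=4):
    """Second threshold Th2 per equation (1), as a positive integer >= 1."""
    th2 = (1.0 / n) * abs(f) * abs(b) * abs(1.0 / z_near - 1.0 / z_far)
    return max(1, math.ceil(th2))       # enforce a positive integer >= 1

# Example values (assumed, for illustration): f=1000, b=0.2, Znear=10, Zfar=100.
th2 = second_threshold(f=1000, b=0.2, z_near=10, z_far=100, n=4)
```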
In summary, although the application is disclosed above by way of embodiments, they are not intended to limit the application. Those of ordinary skill in the art to which the application pertains can make various modifications and variations without departing from the spirit and scope of the application. Therefore, the protection scope of the application shall be determined by the scope defined in the appended claims.

Claims (20)

1. A depth map correction method, comprising:
warping a left-view image and a right-view image of a real-view image according to their respective corresponding depth maps, to obtain a left virtual-view image, a right virtual-view image, left-side hole information, and right-side hole information;
subtracting, point-to-point, the pixel values at the same coordinate of the left and right virtual-view images for at least one pixel that is not a hole, to obtain a residual difference of the at least one pixel;
if the residual difference of the at least one pixel is determined to be greater than a first threshold, performing a reverse warping to obtain the coordinate of the at least one pixel in the real-view image; and
correcting the depth value of the at least one pixel with the depth value(s) of one or more pixels in a nearby range of the obtained coordinate, to perform a depth correction.
2. The depth map correction method as claimed in claim 1, wherein the left-side hole information comprises a left-side hole count of the at least one pixel, and wherein the step of determining whether the residual difference is greater than the first threshold further comprises:
if the left-side hole count of the at least one pixel is determined to be greater than a second threshold, performing the reverse warping to calculate from which coordinate of the left-view image the at least one pixel was originally warped.
3. The depth map correction method as claimed in claim 1, wherein the right-side hole information comprises a right-side hole count of the at least one pixel, and wherein the step of determining whether the residual difference is greater than the first threshold further comprises:
if the right-side hole count of the at least one pixel is determined to be greater than a second threshold, performing the reverse warping to calculate from which coordinate of the right-view image the at least one pixel was originally warped.
4. The depth map correction method as claimed in claim 1, wherein the neighborhood of the coordinate is a preset distance in a preset direction around the at least one pixel.
5. The depth map correction method as claimed in claim 1, wherein the first threshold is a preset value.
6. The depth map correction method as claimed in claim 1, further comprising:
partitioning the redundancy differences of all of the at least one pixels into M×N blocks, where M and N are positive integers; and
taking an arithmetic mean or a median of each block as the first threshold used when judging the pixels of that block.
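The adaptive variant of claim 6 replaces the single preset threshold with one value per block. A minimal sketch, assuming the difference map divides evenly into M×N blocks (the claims do not say how uneven sizes are handled):

```python
from statistics import mean, median

def block_thresholds(diff_map, m, n, use_median=False):
    """Partition the redundancy-difference map into m x n blocks and
    return each block's arithmetic mean (or median), to serve as the
    first threshold for the pixels of that block."""
    rows, cols = len(diff_map), len(diff_map[0])
    bh, bw = rows // m, cols // n  # block height and width
    out = []
    for bi in range(m):
        row = []
        for bj in range(n):
            vals = [diff_map[y][x]
                    for y in range(bi * bh, (bi + 1) * bh)
                    for x in range(bj * bw, (bj + 1) * bw)]
            row.append(median(vals) if use_median else mean(vals))
        out.append(row)
    return out
```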
7. The depth map correction method as claimed in claim 1 or 2, wherein performing the displacement back-calculation further comprises:
at a current pixel position of the left-view image, searching a neighborhood to the left for the one or more pixels; and
if the depth value of the current pixel is judged to be smaller than all of the depth values of the one or more pixels, performing the depth correction.
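The guard condition of claim 7 (look leftward from the current pixel, correct only if its depth undercuts every neighbour in range) can be sketched on a single depth row. The one-row layout and the search range are assumptions for illustration:

```python
def should_correct_left(depth_row, x, search_range):
    """Return True when the current pixel's depth is smaller than the
    depth of every pixel found within search_range to its left."""
    neighbours = depth_row[max(0, x - search_range):x]
    return bool(neighbours) and all(depth_row[x] < d for d in neighbours)
```

A suspiciously small depth surrounded by larger ones is the typical signature of a warping artifact, which is why the correction fires only in that case.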
8. The depth map correction method as claimed in claim 2, further comprising:
synthesizing a depth map of the left virtual multi-view image from the depth map of the left-view image, and performing the displacement back-calculation accordingly.
9. The depth map correction method as claimed in claim 2, further comprising:
generating a left lookup table that records, for each pixel of the left virtual multi-view image, the distance by which it was projected by displacement from the real-view image, and performing the displacement back-calculation accordingly.
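The lookup table of claim 9 makes the back-calculation a constant-time lookup: each virtual-view pixel stores the shift that produced it. A one-row sketch; the `depth_to_shift` mapping is an assumption, since the claims do not give the warp formula:

```python
def build_shift_table(depth_row, depth_to_shift):
    """Record, for each pixel of the virtual view, how far it was
    displaced from the real view; None marks a hole (no source pixel
    landed there)."""
    width = len(depth_row)
    table = [None] * width
    for x_real, depth in enumerate(depth_row):
        shift = depth_to_shift(depth)
        x_virtual = x_real + shift
        if 0 <= x_virtual < width:
            table[x_virtual] = shift
    return table

def back_calculate(table, x_virtual):
    """Invert the shift: recover the real-view coordinate, or None if
    the virtual-view pixel is a hole."""
    shift = table[x_virtual]
    return None if shift is None else x_virtual - shift

# toy example: the depth-2 pixel shifts out of frame, leaving a hole
table = build_shift_table([0, 0, 2, 0], lambda d: d)
```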
10. The depth map correction method as claimed in claim 3, further comprising:
synthesizing a depth map of the right virtual multi-view image from the depth map of the right-view image, and performing the displacement back-calculation accordingly.
11. The depth map correction method as claimed in claim 3, further comprising:
generating a right lookup table that records, for each pixel of the right virtual multi-view image, the distance by which it was projected by displacement from the real-view image, and performing the displacement back-calculation accordingly.
12. The depth map correction method as claimed in claim 1 or 3, wherein performing the displacement back-calculation further comprises:
at a current pixel position of the right-view image, searching a neighborhood to the right for the one or more pixels; and
if the depth value of the current pixel is judged to be smaller than all of the depth values of the one or more pixels, performing the depth correction.
13. The depth map correction method as claimed in claim 2 or 3, wherein the second threshold is a preset value.
14. The depth map correction method as claimed in claim 2 or 3, wherein the second threshold is a positive integer greater than or equal to 1, where f is the focal length of a camera, b is a baseline distance, n is a positive integer, Z_near is a near-scene distance, and Z_far is a far-scene distance.
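The formula tying the second threshold to f, b, n, Z_near and Z_far is not reproduced in this text. Purely as an illustration (an assumption, not the claim's equation), a DIBR-style bound on the disparity spread between the nearest and farthest scene points could play this role:

```python
import math

def second_threshold(f, b, n, z_near, z_far):
    """Illustrative only: bound the threshold by the maximum disparity
    spread f*b*(1/Z_near - 1/Z_far), divide by the positive integer n,
    and clamp to at least 1 as claim 14 requires."""
    spread = f * b * (1.0 / z_near - 1.0 / z_far)
    return max(1, math.ceil(spread / n))
```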
15. The depth map correction method as claimed in claim 1 or 4, wherein an arithmetic mean or a median of the depth values of the one or more pixels in the neighborhood of the coordinate is taken to correct the depth value of the at least one pixel.
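The correction of claim 15 is a one-liner over the neighbourhood depths gathered at the back-calculated coordinate; how that neighbourhood is gathered follows claims 4, 7 and 12. A sketch:

```python
from statistics import mean, median

def corrected_depth(neighbour_depths, use_median=False):
    """Replace the suspect pixel's depth with the arithmetic mean or
    the median of the neighbourhood's depth values."""
    return (median(neighbour_depths) if use_median
            else mean(neighbour_depths))
```

The median variant is the more robust choice when the neighbourhood itself may contain a stray outlier depth.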
16. A depth map correction system, comprising:
a view-shift unit configured to shift a left-view image and a right-view image of a real-view image according to their respective depth maps, to obtain a left virtual multi-view image, a right virtual multi-view image, left-side hole information, and right-side hole information;
a redundancy-difference computation unit configured to subtract, point by point, the pixel values at the coordinates of the left and right virtual multi-view images for at least one pixel that is not a hole, to obtain a redundancy difference for the at least one pixel;
a first judging unit configured to trigger a displacement back-calculation if the redundancy difference is judged to be greater than a first threshold;
a displacement back-calculation unit configured to perform the displacement back-calculation, to obtain the coordinate of the at least one pixel in the real-view image; and
a depth correction unit configured to perform a depth correction, correcting the depth value of the at least one pixel according to the depth values of one or more pixels within a neighborhood of the obtained coordinate.
17. The depth map correction system as claimed in claim 16, wherein the left-side hole information comprises a left-side hole count of the at least one pixel, and the right-side hole information comprises a right-side hole count of the at least one pixel, the system further comprising:
a second judging unit configured, when the redundancy difference is greater than the first threshold, to trigger the displacement back-calculation if the left-side hole count and/or the right-side hole count of the at least one pixel is judged to be greater than a second threshold.
18. The depth map correction system as claimed in claim 16 or 17, further comprising:
a third judging unit configured, for the displacement back-calculation, to search for the one or more pixels in a neighborhood to the left and/or to the right, correspondingly, of a current pixel position of the left-view image and/or the right-view image, and, if the depth value of the current pixel is judged to be smaller than all of the depth values of the one or more pixels, to perform the depth correction.
19. The depth map correction system as claimed in claim 16, wherein the neighborhood is a preset distance in a preset direction around the at least one pixel.
20. The depth map correction system as claimed in claim 16 or 19, wherein an arithmetic mean or a median of the depth values of the one or more pixels in the neighborhood of the coordinate is taken to correct the depth value of the at least one pixel.
CN201410008359.3A 2014-01-02 2014-01-02 Depth of Field (DOF) image correction method and system Pending CN104767986A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410008359.3A CN104767986A (en) 2014-01-02 2014-01-02 Depth of Field (DOF) image correction method and system


Publications (1)

Publication Number Publication Date
CN104767986A true CN104767986A (en) 2015-07-08

Family

ID=53649557

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410008359.3A Pending CN104767986A (en) 2014-01-02 2014-01-02 Depth of Field (DOF) image correction method and system

Country Status (1)

Country Link
CN (1) CN104767986A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107248137A (en) * 2017-04-27 2017-10-13 努比亚技术有限公司 Method and mobile terminal for realizing image processing

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101630408A (en) * 2009-08-14 2010-01-20 清华大学 Depth map treatment method and device
JP2012194751A (en) * 2011-03-16 2012-10-11 Nippon Telegr & Teleph Corp <Ntt> Image processing method, image processing system and computer program
CN102760234A (en) * 2011-04-14 2012-10-31 财团法人工业技术研究院 Depth image acquisition device, system and method
CN102957923A (en) * 2011-08-24 2013-03-06 陈良基 Three-dimensional image depth map correction system and method




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150708
