WO2014135400A1 - Multi frame motion and disparity estimator - Google Patents

Multi frame motion and disparity estimator

Info

Publication number
WO2014135400A1
Authority
WO
WIPO (PCT)
Prior art keywords
vector candidates
roundtrip
estimation
picture
picture frame
Prior art date
Application number
PCT/EP2014/053634
Other languages
French (fr)
Inventor
Piergiorgio Sartor
Original Assignee
Sony Corporation
Sony Deutschland Gmbh
Priority date
Filing date
Publication date
Application filed by Sony Corporation and Sony Deutschland GmbH
Publication of WO2014135400A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/285 - Analysis of motion using a sequence of stereo image pairs
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/55 - Depth or shape recovery from multiple images
    • G06T7/579 - Depth or shape recovery from multiple images from motion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence
    • G06T2207/10021 - Stereoscopic video; Stereoscopic image sequence


Abstract

The invention relates to a method comprising the steps of: providing a number n x m of picture frames; providing for each picture frame an estimation vector candidates array containing a plurality of vector candidates, said estimation vector candidates array indicating at least one motion and/or disparity relation between two of said picture frames; determining roundtrip paths from a starting block in a starting picture frame to the next picture frame and ending at an ending block in the starting picture frame, a path being formed by a concatenation of vector candidates of the respective picture frames; selecting one of said roundtrip paths according to a predetermined criterion, and using the respective vector candidates of the selected roundtrip path.

Description

Multiframe motion and disparity estimator
BACKGROUND
Field of the Disclosure
[0001] The present disclosure relates to a method, particularly for multiframe motion and disparity estimation. The present disclosure also relates to a device for image processing of a number of spatially and/or temporally separated picture frames, a computer program and a non-transitory computer-readable recording medium.
Description of Related Art
[0002] There is an increasing demand for 3D and multiple view applications, all of them requiring image processing, like motion estimation, disparity estimation or picture frame interpolation. In the art several methods for estimating motion and disparity are known. Most of them work independently of each other and do not use spatial and temporal information between multiple picture frames captured by, e.g., two or more cameras.
[0003] In the paper "Edge-preserving joint motion disparity estimation in stereo image sequences", Dongbo Min et al., Proceedings of the 6th IASTED International Conference, Signal and Image Processing, August 23 to 25, 2004, Honolulu, Hawaii, USA, an approach for motion and disparity estimation using a constraint between motion and disparity in stereo image sequences is described.
[0004] Nevertheless there is still a demand for providing better motion and disparity estimations in 3D applications.
[0005] The "background" description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventor(s), to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly or impliedly admitted as prior art against the present invention.
SUMMARY
[0006] It is an object to provide a method which achieves improved motion and disparity estimations. It is a further object to provide a device for image processing which achieves improved motion and disparity estimations, as well as a corresponding computer program for implementing the method and a non-transitory computer-readable recording medium for implementing the method.
[0007] According to an aspect there is provided a method comprising:
providing a number n x m of picture frames;
providing for each picture frame an estimation vector candidates array containing a plurality of vector candidates, said estimation vector candidates array indicating at least one motion and/or disparity relation between two of said picture frames;
determining roundtrip paths from a starting block in a starting picture frame to the next picture frame and ending at an ending block in the starting picture frame, a path being formed by a concatenation of vector candidates of the respective picture frames;
selecting one of said roundtrip paths according to a predetermined criterion, and using the respective vector candidates of the selected roundtrip path.
[0008] According to a further aspect there is provided a device for image processing of a number of spatially and/or temporally separated picture frames, comprising a generating unit adapted to generate vector candidates for each estimation vector of an estimation vector array of a picture frame,
a roundtrip path determining unit adapted to concatenate the vector candidates of said picture frames, and
a selection unit adapted to select one of the roundtrip paths according to a predetermined criterion.
[0009] According to still further aspects a computer program comprising program code means for causing a computer to perform the steps of the method disclosed herein when said computer program is carried out on a computer is provided. Further, a non-transitory computer-readable recording medium that stores therein a computer program product, which when executed by a processor causes the method disclosed herein to be performed is provided as well.
[0010] Preferred embodiments are defined in the dependent claims. It shall be understood that the claimed device, the claimed computer program and the claimed computer-readable recording medium have similar and/or identical preferred embodiments as the claimed method and as defined in the dependent claims.
[0011] One of the aspects of the present disclosure is to implement a generic motion and disparity estimation using multiple frames in time and space. The disclosed approach takes advantage of the spatial-temporal redundancy between multiple picture frames in order to estimate correspondences between two frames (motion or disparity). The estimations across the multiple picture frames are "chained"/concatenated, and it is then evaluated whether the estimation chain which goes across different picture frames and comes back to the first picture frame ends where it started. In an optimum case the chain should end where it starts, provided that the respective areas in the picture frame are non-occluded areas. This evaluation makes it possible to find the "best" vector candidates in the estimation vector candidates array, which in turn allows a better estimation vector array to be formed.
[0012] It is to be understood that both the foregoing general description of the invention and the following detailed description are exemplary, but are not restrictive of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Fig. 1 shows a block diagram of multiple, preferably four picture frames separated in time and space;
Fig. 2 shows a block diagram illustrating the concatenation of estimation vectors between several picture frames;
Fig. 3 shows a decision tree to illustrate how to find a best roundtrip path;
Fig. 4 shows two block diagrams illustrating how to relate the picture frames of multiple cameras and multiple time instances; and
Fig. 5 shows a block diagram of a device for image processing.
DESCRIPTION OF THE EMBODIMENTS
[0014] Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, Fig. 1 shows an example of four picture frames (in the following only called "frames") captured by two cameras as left and right frames in two time instances t, t+1. The frames in Fig. 1 are indicated with reference numerals 10, 12, 14 and 16. As generally known, in a digital electronics environment, a frame is built up of pixels, each pixel carrying information for example on color, etc. In Fig. 1 a pixel block 18 is schematically shown in a first position within the frame 10 captured by a left camera. In the right frame 12 captured by a right camera, the corresponding pixel block 18 is located in a different position, wherein the difference between both positions characterizes the disparity between both corresponding pixel blocks 18. The disparity is indicated by the arrow d(t).
[0015] In Fig. 1 it is also shown that the position of the pixel block 18 has changed from frame 10 to frame 16 as well as from frame 12 to frame 14. This change of position is caused by movement of the pixel block 18 between time instances t and t+1. This change of position of the pixel block 18 is called motion and is indicated by arrows vl(t, t+1) and vr(t, t+1), respectively.
[0016] Disparity and motion between two frames are each represented by a two-dimensional vector indicating the difference between the positions of corresponding matching pixel blocks.
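As a worked illustration of this representation (a sketch with made-up positions, not code from the patent), motion and disparity can both be modelled as 2-D displacements between matching pixel-block positions:

```python
# Illustrative sketch only: motion and disparity as 2-D displacement vectors
# between matching pixel-block positions. The positions used are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Vec2:
    x: float
    y: float

def displacement(pos_a: Vec2, pos_b: Vec2) -> Vec2:
    """Vector from a block position in one frame to its match in another frame.

    The same form covers disparity (left/right frames at the same time instance)
    and motion (same camera at consecutive time instances).
    """
    return Vec2(pos_b.x - pos_a.x, pos_b.y - pos_a.y)

# e.g. disparity d(t) of pixel block 18 between left frame 10 and right frame 12
d_t = displacement(Vec2(40.0, 32.0), Vec2(52.0, 32.0))  # -> Vec2(x=12.0, y=0.0)
```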
[0017] In many image processing applications, disparity and motion vectors are necessary, for example for interpolating frames or interpolating areas of a frame. However, disparity vectors and motion vectors have to be estimated by using motion and disparity estimator circuits. The main object of such estimators is to find corresponding/matching pixel blocks in two frames. As mentioned before, motion estimation and disparity estimation are carried out independently of each other in prior art solutions.
[0018] However, there is a so-called motion-disparity constraint: a path running from pixel block 18 of frame 10 to the corresponding pixel block of frame 12, then to the corresponding pixel block of frame 14, then to the corresponding pixel block of frame 16 and back to pixel block 18 of the first frame 10 forms, at least in theory, a closed roundtrip. This roundtrip, which is indicated by reference numeral 20, is built up of four vectors 21, 22, 23 and 24 in the present embodiment, wherein the vectors 21 and 23 are disparity vectors and vectors 22 and 24 are motion vectors.
[0019] Hence, in an optimal motion and disparity estimation, the vector chain/concatenation starting from pixel block 18 of frame 10 should end at pixel block 18 again. In other words, the sum of the four vectors 21, 22, 23 and 24 should be 0 in an ideal estimation.
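The constraint can be turned into a simple numeric check. The following sketch, using assumed vector values rather than data from the patent, computes the closure error of a roundtrip as the magnitude of the vector sum:

```python
import math

def closure_error(vectors):
    """Distance between start and end of a roundtrip formed by concatenating
    2-D vectors; 0 means the roundtrip is perfectly closed."""
    sx = sum(v[0] for v in vectors)
    sy = sum(v[1] for v in vectors)
    return math.hypot(sx, sy)

# hypothetical values for vectors 21 (disparity), 22 (motion), 23 (disparity), 24 (motion)
roundtrip_20 = [(12.0, 0.0), (3.0, -1.0), (-11.5, 0.5), (-3.0, 1.0)]
print(closure_error(roundtrip_20))  # ~0.7 pixels of residual closure error
```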
[0020] Considering this constraint, motion and disparity estimations of multiple frames can be evaluated and the estimations with the "best" roundtrip path can then be used. The "best" roundtrip path may be, for example, the path ending closest to the starting pixel block 18 of frame 10.
[0021] The above mentioned roundtrip path may now be advantageously used, for example, for evaluating the result of a motion and disparity estimation and, as a supplement thereof, for selecting the "best" motion and disparity estimations out of a plurality of different estimations.
[0022] In the following and with reference to Figs. 2, 3 and 4, it is described in detail how to use the roundtrip path approach for improving the motion and disparity estimations.
[0023] In Fig. 2, three frames (out of four frames for example) 10, 12 and 14 are shown. In the frames 10, 12, 14 different pixel blocks 18 are shown as well.
[0024] With respect to pixel block 18 of frame 10, there are two disparity vectors 21.1 and 21.2 pointing to different pixel blocks 18 within frame 12 of the same time instance. One of both vectors 21.1, 21.2 may be the result of a disparity estimation. The other vector may be the result of a certain modification of the other vector. The first vector may also be the result of a former estimation or any other prediction/estimation step.
[0025] In the following, such vectors are called vector candidates and are provided as an estimation vector candidates array. Such an estimation vector candidates array may be considered as a common estimation vector array which additionally contains estimation vector candidates for at least one of said estimation vectors. In this example, there are two vector candidates assigned to a pixel block and pointing to two different pixel blocks of the next frame 12. It should be understood that it is also possible to increase the number of vector candidates, e.g. to 10 vector candidates or more.
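A minimal sketch of how such a candidates array might be populated for one pixel block, assuming the simple scheme described here (one estimated vector plus predetermined modifications of it); the offsets and the candidate count are illustrative assumptions:

```python
def candidate_vectors(estimated, offsets=((0, 0), (1, 0))):
    """Return vector candidates for one pixel block: the estimated vector plus
    modified versions of it (e.g. candidates 21.1 and 21.2 of Fig. 2)."""
    ex, ey = estimated
    return [(ex + dx, ey + dy) for dx, dy in offsets]

# two disparity vector candidates derived from an estimated disparity of (12, 0)
print(candidate_vectors((12, 0)))  # [(12, 0), (13, 0)]
```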
[0026] As a result, there are at least two disparity vector candidates for different pixel blocks of frame 10.
[0027] The same applies to the other frames 12, 14 and 16 when using four picture frames as a basis for estimation (as shown in Fig. 1). Consequently, each pixel block of frames 12, 14, 16 is assigned at least two vector candidates indicating motion or disparity.
[0028] In other words, two motion estimation vector candidates are provided for the pixel blocks of frame 12, two disparity vector candidates are provided for the pixel blocks of frame 14 and two motion vector candidates are provided for the pixel blocks of frame 16.
[0029] As a result, there are 2^4 = 16 possibilities to form a roundtrip. These possibilities are illustrated in Fig. 3 by a tree 30. In this example using two candidates and four frames, 16 roundtrip paths are possible, indicated by the 16 leaf nodes 32.
[0030] All possible roundtrip paths are determined for at least some of the pixel blocks of frame 10. This means that starting from frame 10 two paths 21.1, 21.2 go to two pixel blocks of frame 12. Then there are again two paths from the respective pixel blocks to frame 14, so that four paths in total are going to frame 14. Then eight paths from four pixel blocks of frame 14 are going to frame 16. Finally, sixteen paths are going back to different pixel blocks of frame 10.
[0031] After having processed these different roundtrip paths, it is then determined which of the roundtrip paths ends closest to the starting pixel block 18 of frame 10. This best roundtrip path comprises four vector candidates which are then used for generating the respective disparity or motion estimation array for each of the four frames 10, 12, 14 and 16.
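The selection step can be sketched as follows, with the simplifying assumption that the candidates are stored as one list per frame transition (in the patent the candidates belong to the pixel blocks actually reached, so the tree of Fig. 3 branches per block); all combinations are enumerated and the roundtrip with the smallest closure error is kept:

```python
import itertools
import math

def best_roundtrip(candidates_per_transition):
    """Enumerate all concatenations of vector candidates (2**4 = 16 for two
    candidates per transition and four frames) and return the roundtrip whose
    end lies closest to the starting block."""
    best_err, best_combo = None, None
    for combo in itertools.product(*candidates_per_transition):
        err = math.hypot(sum(v[0] for v in combo), sum(v[1] for v in combo))
        if best_err is None or err < best_err:
            best_err, best_combo = err, combo
    return best_err, best_combo

# hypothetical candidate values for the four transitions of Fig. 1
transitions = [
    [(12, 0), (13, 0)],    # disparity candidates, frame 10 -> 12
    [(3, -1), (2, -1)],    # motion candidates,    frame 12 -> 14
    [(-12, 0), (-11, 1)],  # disparity candidates, frame 14 -> 16
    [(-3, 1), (-4, 1)],    # motion candidates,    frame 16 -> 10
]
err, chosen = best_roundtrip(transitions)  # err == 0.0: this roundtrip closes exactly
```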
[0032] These steps are then repeated for the other pixel blocks of the four frames 10, 12, 14, and 16.
[0033] If multiple roundtrip paths end at the starting pixel block, another criterion can be used for selecting one of these roundtrip paths. One criterion is, for example, the minimum error between the respective pixel blocks of the roundtrip path. The minimum error may be calculated by the sum of absolute differences (SAD), for example. It shall be understood that the above mentioned method is repeated for every pixel block so as to generate a complete disparity and/or motion estimation array.
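A sketch of this tie-break criterion, assuming the pixel blocks are available as equally sized greyscale arrays (an illustrative choice, not a representation prescribed by the patent):

```python
import numpy as np

def sad(block_a: np.ndarray, block_b: np.ndarray) -> int:
    """Sum of absolute differences between two equally sized pixel blocks."""
    return int(np.abs(block_a.astype(np.int32) - block_b.astype(np.int32)).sum())

def roundtrip_sad(blocks):
    """Accumulated SAD along the consecutive pixel blocks of a roundtrip path;
    among paths with equal closure error, the one with minimum SAD is preferred."""
    return sum(sad(a, b) for a, b in zip(blocks, blocks[1:]))
```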
[0034] As already mentioned before, one of the vector candidates may be determined by an estimation step or may be the result of a previous estimation in time. The other vector candidates are generated by modifying the first vector candidate in a predetermined manner. The amount of modification or variation of the vectors could be fixed or variable. Also, the number of vector candidates (two in the present example) can be fixed or variable.
[0035] In order to achieve a better quality of the roundtrip path evaluation, frames that are close to each other (temporally or spatially) should be used. This means that a Hamiltonian circuit has to exist in the graph representing the frames. This graph is illustrated in Fig. 4, for example. This also means that either the number of cameras or the number of time instances (or both) must be even (as a consequence of Grinberg's theorem) in a typical n x m setup with n cameras and m time instances.
[0036] In Fig. 4, the first graph shows the sequence of frames for forming the roundtrip path in an approach with four cameras and two time instances. In the second graph of Fig. 4, the approach uses four cameras and three time instances.
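To illustrate the evenness condition and frame orderings like those of Fig. 4, the sketch below builds one possible roundtrip order over an n x m grid of frames using a standard boustrophedon construction; this particular construction is an assumption chosen for illustration, not an ordering mandated by the patent.

```python
def roundtrip_order(n_cameras: int, m_times: int):
    """One Hamiltonian circuit over the n x m grid of frames, visiting each
    frame once and moving only between spatially or temporally adjacent frames.
    Requires n >= 2, m >= 2 and at least one of n, m even."""
    if n_cameras < 2 or m_times < 2:
        raise ValueError("need at least two cameras and two time instances")
    if n_cameras % 2 and m_times % 2:
        raise ValueError("either the number of cameras or of time instances must be even")
    if n_cameras % 2:  # make the camera dimension the even one, swap coordinates back
        return [(cam, t) for (t, cam) in roundtrip_order(m_times, n_cameras)]
    path = [(0, 0)]
    for cam in range(n_cameras):  # snake through time instances 1..m-1, camera by camera
        times = range(1, m_times) if cam % 2 == 0 else range(m_times - 1, 0, -1)
        path.extend((cam, t) for t in times)
    path.extend((cam, 0) for cam in range(n_cameras - 1, 0, -1))  # return along t = 0
    return path

# 4 cameras, 2 time instances (first graph of Fig. 4):
# [(0, 0), (0, 1), (1, 1), (2, 1), (3, 1), (3, 0), (2, 0), (1, 0)] and back to (0, 0)
print(roundtrip_order(4, 2))
```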
[0037] The above described method is preferably used in offline image processing, and the described method could be implemented in hardware as well as in software. For example, a method using two frames could readily be implemented in hardware.
[0038] As shown in Fig. 5, a hardware circuit 40 comprises at least a generating unit 42 adapted to generate vector candidates for each estimation vector, a roundtrip path determining unit 44 adapted to concatenate vector candidates of said picture frames and a selection unit 46 adapted to select one of the roundtrip paths according to a predetermined criterion.
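A rough structural sketch of such a circuit in software form is given below; the unit names follow the description, while the internal logic is a simplified assumption that reuses the ideas sketched above.

```python
import itertools
import math

class MultiFrameEstimatorCircuit:
    """Software mock-up of the device of Fig. 5 (units 42, 44 and 46)."""

    def generate_candidates(self, estimated, offsets=((0, 0), (1, 0))):
        # generating unit 42: vector candidates for each estimation vector
        return [(estimated[0] + dx, estimated[1] + dy) for dx, dy in offsets]

    def roundtrip_paths(self, candidates_per_transition):
        # roundtrip path determining unit 44: concatenate the vector candidates
        return list(itertools.product(*candidates_per_transition))

    def select(self, paths):
        # selection unit 46: predetermined criterion = minimum closure error
        return min(paths, key=lambda p: math.hypot(sum(v[0] for v in p),
                                                   sum(v[1] for v in p)))
```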
[0039] One of the advantages of the method described above is that it exploits the spatial-temporal dependency of the motion-disparity constraint. Hence, one of the main aspects is to take advantage of the spatial-temporal redundancy in order to estimate correspondences between two frames (motion or disparity).
[0040] Obviously, numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.
[0041] In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single element or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
[0042] In so far as embodiments of the invention have been described as being implemented, at least in part, by software-controlled data processing apparatus, it will be appreciated that a non-transitory machine-readable medium carrying such software, such as an optical disk, a magnetic disk, semiconductor memory or the like, is also considered to represent an embodiment of the present invention. Further, such software may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.
[0043] A circuit is a structural assemblage of electronic components including conventional circuit elements, integrated circuits including application specific integrated circuits, standard integrated circuits, application specific standard products, and field programmable gate arrays. Further, a circuit includes central processing units, graphics processing units, and microprocessors which are programmed or configured according to software code. A circuit does not include pure software, although a circuit includes the above-described hardware executing software.

Claims

1. Method comprising
providing a number n x m of picture frames;
providing for each picture frame an estimation vector candidates array containing a plurality of vector candidates, said estimation vector candidates array indicating at least one motion and/or disparity relation between two of said picture frames;
determining roundtrip paths from a starting block in a starting picture frame to the next picture frame and ending at an ending block in the starting picture frame, a path being formed by a concatenation of vector candidates of the respective picture frames;
selecting one of said roundtrip paths according to a predetermined criterion, and using the respective vector candidates of the selected roundtrip path.
2. Method of claim 1, wherein said number n indicates the number of cameras used for capturing the picture frames.
3. Method of claim 1, wherein said number m indicates a number of time instances (t, t+1).
4. Method of claim 2 or 3, wherein said number n and/or said number m are even.
5. Method of claim 1, wherein said criterion is a minimum error between the starting block and the ending block.
6. Method of claim 5, wherein said criterion is the sum of absolute differences.
7. Method of claim 1, wherein said number of picture frames contains at least two spatially separated (left and right; number n) and at least two temporally separated (t and t+1; number m) picture frames.
8. Method of claim 6, wherein said roundtrip is defined from one picture frame to the subsequent within the same time instance t, then from picture frame to the subsequent within the next time instance t+1 and then back to the time instance t.
9. Method of claim 1, wherein providing for each picture frame an estimation vector candidates array comprises
providing at least one estimation vector array, and
generating a number of p vector candidates for each estimation vector of the at least one estimation vector array.
10. Method of claim 9, wherein a vector candidate is determined by modifying an estimation vector of said at least one estimation vector array.
11. Method of claim 9, wherein said number p is variable.
12. Device for image processing of a number of spatially and/or temporally separated picture frames, comprising
a generating unit adapted to generate vector candidates for each estimation vector of an estimation vector array of a picture frame,
a roundtrip path determining unit adapted to concatenate vector candidates of said picture frames,
a selection unit adapted to select one of the roundtrip paths according to a predetermined criterion, and
a correction unit adapted to correct the estimation vector in response to the selected roundtrip.
13. A computer program comprising program code means for causing a computer to perform the steps of said method as claimed in claim 1 when said computer program is carried out on a computer.
14. A non-transitory computer-readable recording medium that stores therein a computer program product, which, when executed by a processor, causes the method according to claim 1 to be performed.
PCT/EP2014/053634 2013-03-05 2014-02-25 Multi frame motion and disparity estimator WO2014135400A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP13157717.3 2013-03-05
EP13157717 2013-03-05

Publications (1)

Publication Number Publication Date
WO2014135400A1 true WO2014135400A1 (en) 2014-09-12

Family

ID=47900632

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2014/053634 WO2014135400A1 (en) 2013-03-05 2014-02-25 Multi frame motion and disparity estimator

Country Status (1)

Country Link
WO (1) WO2014135400A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3043316A1 (en) * 2015-01-08 2016-07-13 Thomson Licensing Method and apparatus for generating superpixels for multi-view images
WO2017143572A1 (en) * 2016-02-25 2017-08-31 Intel Corporation Calculation of temporally coherent disparity from sequence of video frames

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ANITA SELLENT ET AL: "A loop-consistency measure for dense correspondences in multi-view video", IMAGE AND VISION COMPUTING, vol. 30, no. 9, September 2012 (2012-09-01), pages 641 - 654, XP055115409, ISSN: 0262-8856, DOI: 10.1016/j.imavis.2012.06.011 *
DARIBO I ET AL: "Joint depth-motion dense estimation for multiview video coding", JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, ACADEMIC PRESS, INC, US, vol. 21, no. 5-6, 6 January 2010 (2010-01-06), pages 487 - 497, XP027067825, ISSN: 1047-3203, [retrieved on 20100106], DOI: 10.1016/J.JVCIR.2009.12.004 *
DONGBO MIN ET AL.: "Edge-preserving joint motion disparity estimation in stereo image sequences", PROCEEDINGS OF THE 6TH IASTED INTERNATIONAL CONFERENCE, SIGNAL AND IMAGE PROCESSING, 23 August 2004 (2004-08-23)
ZHI-PIN DENG ET AL: "Iterative search strategy with selective bi-directional prediction for low complexity multiview video coding", JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, vol. 23, no. 3, 3 February 2012 (2012-02-03), pages 522 - 534, XP055115333, ISSN: 1047-3203, DOI: 10.1016/j.jvcir.2012.01.016 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3043316A1 (en) * 2015-01-08 2016-07-13 Thomson Licensing Method and apparatus for generating superpixels for multi-view images
EP3043315A1 (en) * 2015-01-08 2016-07-13 Thomson Licensing Method and apparatus for generating superpixels for multi-view images
WO2017143572A1 (en) * 2016-02-25 2017-08-31 Intel Corporation Calculation of temporally coherent disparity from sequence of video frames
US10701335B2 (en) 2016-02-25 2020-06-30 Intel Corporation Calculation of temporally coherent disparity from sequence of video frames

Similar Documents

Publication Publication Date Title
CN106254885B (en) Data processing system, method of performing motion estimation
US9661227B2 (en) Method, circuit and system for stabilizing digital image
JP6998388B2 (en) Methods and equipment for processing image property maps
JP6562197B2 (en) Image processing method and image processing system
US9148622B2 (en) Halo reduction in frame-rate-conversion using hybrid bi-directional motion vectors for occlusion/disocclusion detection
KR101938205B1 (en) Method for depth video filtering and apparatus thereof
US20160080770A1 (en) Encoding system using motion estimation and encoding method using motion estimation
US9158994B2 (en) Apparatus and method for real-time capable disparity estimation for virtual view rendering suitable for multi-threaded execution
WO2013079660A9 (en) Disparity map generation including reliability estimation
KR20120032560A (en) Techniques to perform video stabilization and detect video shot boundaries based on common processing elements
US20100232509A1 (en) Method and apparatus to improve the convergence speed of a recursive motion estimator
WO2014135400A1 (en) Multi frame motion and disparity estimator
US20170085912A1 (en) Video sequence processing
CN106303545B (en) Data processing system and method for performing motion estimation in a sequence of frames
JP2009213161A (en) Video coding method, video decoding method, video coding program, video decoding program, and computer-readable recording medium with the programs recorded thereon
WO2014023641A1 (en) Refinement of a disparity or motion map based on user interaction
KR101050135B1 (en) Intermediate image generation method using optical information
EP2237559B1 (en) Background motion estimate based halo reduction
WO2014135401A1 (en) System for frame interpolation
US10448043B2 (en) Motion estimation method and motion estimator for estimating motion vector of block of current frame
IL274103A (en) Image restoration method
JP4779904B2 (en) Stereo video processing apparatus and stereo video processing method program
Okade et al. A novel motion vector outlier removal technique based on adaptive weighted vector median filtering for global motion estimation
JP7185496B2 (en) Video interpolation device and program
TWI485651B (en) Method for depth estimation and device usnig the same

Legal Events

Code: Description

121: Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14706590; Country of ref document: EP; Kind code of ref document: A1)

NENP: Non-entry into the national phase (Ref country code: DE)

122: Ep: pct application non-entry in european phase (Ref document number: 14706590; Country of ref document: EP; Kind code of ref document: A1)