EP2522140A1 - Estimation de mouvement global et dense - Google Patents
Estimation de mouvement global et dense
- Publication number
- EP2522140A1 (application EP11704249A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- motion
- estimated
- previous
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
- H04N19/51—Motion estimation or motion compensation
- H04N19/527—Global motion vector estimation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/144—Movement detection
Definitions
- the present invention relates to the field of image processing, and more particularly to the field of motion estimation within a captured image sequence.
- When a sensor successively captures a sequence of images, such as a video, it is conventional to perform a global inter-image motion estimation.
- This motion estimation aims at determining the global movement affecting the sequence of images between two successive images. It can correspond to the determination of the movement of the line of sight of the sensor used.
- Such an overall motion estimation makes it possible, in particular, to implement image stabilization, or image denoising, or to set up a super-resolution mechanism.
- this type of mechanism can be substantially disturbed when the captured scene corresponds to one or more large objects that are movable during the captured image sequence or when the captured image sequence has little contrast. Thus, in these cases in particular, it is possible that the overall motion estimate is ultimately wrong.
- Some image processing systems rely on a sequential implementation of a global motion estimation and a dense motion estimation or 'local motion estimation'.
- a dense motion estimation consists of an estimation of motion at each point of the images of the sequence captured between two successive images.
- the latter estimate may be performed on recalibrated images based on the previous overall motion estimate.
- the dense motion estimation makes it possible to calculate a residual motion at any point of an image.
- This sequencing of the global motion estimation and the dense motion estimation can provide a powerful motion estimation when the global motion estimation is itself reliable.
- the invention aims to improve the situation.
- a first aspect of the invention provides a motion estimation method in a series of images captured by an image sensor, said series of images comprising at least one previous image and one subsequent image;
- said estimation method comprising the following steps:
- /a/ obtain a first estimated motion by performing a global motion estimation between the previous image and the next image;
- /b/ obtain a recalibrated image, on the basis of the first estimated motion, from one of the previous and next images;
- /c/ obtain a second estimated motion by performing a dense motion estimation between the recalibrated image and the other one of the previous and next images; /d/ determine a residual value of global motion (for example on the basis of the second estimated motion); and /e/ if the residual value of motion is less than a threshold value, provide the second estimated motion; otherwise, repeat steps /a/ to /e/;
- the first estimated motion is determined by applying an image bit mask;
- if, at step /e/, steps /a/ to /e/ are repeated, they are performed by applying an image bit mask updated according to the second estimated motion.
- steps /a/ to /e/, when executed the first time, can use a null mask (hiding nothing), or even no mask at all.
- Such an absence of masking can improve performance by avoiding a masking step that is superfluous when the mask is null, at the cost of a possibly more complex implementation comprising a particular treatment of the first iteration.
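The iterative scheme of steps /a/ to /e/ can be sketched as a small loop. This is a toy illustration, not the patent's implementation; the helper callables (`estimate_global`, `recalibrate`, `estimate_dense`, `residual`, `update_mask`) are hypothetical stand-ins to be supplied by the caller:

```python
def iterative_motion_estimation(prev_img, next_img, threshold,
                                estimate_global, recalibrate,
                                estimate_dense, residual, update_mask,
                                max_iters=10):
    """Alternate global and dense estimation (steps /a/ to /e/) until the
    residual global motion falls below `threshold`."""
    mask = None   # first iteration: null mask, nothing is hidden
    dense = None
    for _ in range(max_iters):
        H = estimate_global(prev_img, next_img, mask)   # /a/ global motion
        recal = recalibrate(next_img, H)                # /b/ recalibrated image
        dense = estimate_dense(recal, prev_img)         # /c/ dense motion
        r = residual(dense)                             # /d/ residual value
        if r < threshold:                               # /e/ quality test
            break
        mask = update_mask(dense, r)                    # mask for the next pass
    return dense
```

With suitably chosen helpers the loop terminates as soon as the mask hides the moving parts of the scene well enough for the global estimate to become reliable.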
- The terms "previous image" and "next image" are understood to mean, within a series of captured images, two images that follow each other chronologically in the series of images considered. These two images can be consecutive, or spaced apart by one or more intermediate captured images. No limitation is attached to the spacing between the previous image and the next image considered here.
- the term 'global motion estimation' is understood to mean a motion estimation mechanism which makes it possible to represent the movement affecting the series of images captured between a previous image and a subsequent image, in a global manner.
- This global movement may correspond to an estimate of the movement of the line of sight of the sensor used.
- the global movement may also correspond, for example, to the movement, in three rotational dimensions and three translation dimensions, of a camera incorporating an image sensor, with respect to a scene filmed by said camera. No limitation is attached to the type of estimation mechanism used here.
- the Lucas-Kanade method, published in 1984 in the thesis "Generalized Image Matching by the Method of Differences" and originally used for dense motion estimation, can be adapted to global motion. It consists in determining a restricted number of parameters (translation, roll, zoom, etc.) by least-squares resolution of the apparent-motion equation at significant points of the image.
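As a minimal sketch of that least-squares idea, restricted to a pure translation (u, v) (a simplification for illustration; the patent does not prescribe this code), the apparent-motion equation Ix·u + Iy·v + It = 0 can be solved over a set of sample points via the 2×2 normal equations:

```python
def global_translation(points):
    """Least-squares fit of Ix*u + Iy*v = -It over (Ix, Iy, It) samples,
    i.e. the normal equations of the 2-parameter translation model."""
    a = b = c = d = e = 0.0  # accumulators of the 2x2 normal system
    for ix, iy, it in points:
        a += ix * ix
        b += ix * iy
        c += iy * iy
        d -= ix * it
        e -= iy * it
    det = a * c - b * b
    if det == 0.0:
        raise ValueError("degenerate gradient data: cannot solve for (u, v)")
    # Cramer's rule on [[a, b], [b, c]] [u, v]^T = [d, e]^T
    return (d * c - b * e) / det, (a * e - b * d) / det
```

Extending the parameter vector (roll, zoom, etc.) only grows the normal system; the structure of the resolution is unchanged.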
- The term "recalibrated image" designates a captured image that has been recalibrated on the basis of the estimated global motion.
- the recalibrated image can thus correspond, for example, to the estimation of a next image, according to a previous image and to an estimate of a global movement between the previous image and the next image.
- the recalibrated image is not necessarily identical to the next image, because the estimation of the global motion does not necessarily make it possible to deduce exactly the next image from a previous image.
- estimating global motion generally gives an indication of average motion that typically does not account for moving elements possibly present within the filmed scene. The position of these moving elements in the recalibrated image can therefore be erroneous.
- the term 'dense motion estimation' is understood to mean a motion estimation mechanism which makes it possible to represent, locally within the image, the movement affecting the series of images captured between a previous image and a subsequent image.
- such an estimation mechanism makes it possible to provide motion vectors for each part of the image, with more or less precision. It is thus possible to estimate an image movement at each image point, for example.
- a scene can be divided into different parts each corresponding to a particular element of the scene. Each part can correspond to the smallest element of the image (point, also called pixel). But a part can also correspond to a particular form present in the image. For example, objects, such as automobiles, can be recognized on series of images captured by road radar. Each car can then correspond to a part of the image.
- these parts are not necessarily superimposable (even in the absence of global motion) since the respective movements of the different parts are not necessarily the same.
- the relative positions, in a first image and a second image, of two parts of the scene are not necessarily the same (for example, the two parts may be farther apart from each other in the first image than in the second).
- Scene parts may disappear (for example when they leave the image field), others may appear.
- the different parts can be defined using a grid of the image, for example a regular grid in the form of small rectangles. An image portion can then correspond to a subset of rectangles having common movement characteristics (for example, a dense movement above a certain threshold for this part).
- This type of motion estimation provides a second estimated motion that can be used to represent the movement of moving objects that could pass through the images of the series of captured images.
- a first method consists in solving the apparent-motion equation in the least-squares sense over a local window.
- a second method, from the article "Determining Optical Flow" (1981), consists in minimizing a functional whose data-attachment term is the square of the apparent-motion equation, and whose regularization term is the square of the local variation of the field.
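For reference, the functional described above (as formulated by Horn and Schunck in that article; the weight α is theirs, not the patent's) reads:

```latex
E(u, v) = \iint \left( I_x u + I_y v + I_t \right)^2
        + \alpha^2 \left( \lVert \nabla u \rVert^2 + \lVert \nabla v \rVert^2 \right) \, dx \, dy
```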
- the second estimated motion may correspond to a triplet of values for each image portion (or each image point): a value of a translation vector along a first direction U, a value of a translation vector along a second direction V, and an associated reliability value N.
- the estimated second motion corresponds to a set of estimated sub-movements for each image portion in the image sequence.
- this second estimated motion measures a dense movement between the recalibrated image and the real image, which is advantageous because it allows an effective measurement of the residual value of motion, allowing the possible iterations (if any) to converge effectively towards a relevant global motion estimate.
- the term "residual value of motion" is intended to mean a measure relating to the residual global translation, that is to say the global translation that is still present at the end of step /c/.
- This residual value of motion can be determined, for example, by averaging this translational movement over all points of the image for which there are values of U and V whose reliability N exceeds a threshold.
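A minimal sketch of that averaging, assuming the dense field is given as (U, V, N) triplets per point (this representation is an assumption for illustration, not the patent's data layout):

```python
def residual_motion(field, n_threshold):
    """Mean translation (U, V) over points whose reliability N exceeds
    n_threshold; returns the magnitude of that mean translation."""
    us = [u for u, _v, n in field if n > n_threshold]
    vs = [v for _u, v, n in field if n > n_threshold]
    if not us:
        return 0.0  # no reliable point: no measurable residual
    mean_u = sum(us) / len(us)
    mean_v = sum(vs) / len(vs)
    return (mean_u ** 2 + mean_v ** 2) ** 0.5
```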
- Such an estimation method proposes not only to sequentially perform a global motion estimation followed by a dense motion estimation during a series of steps /a/ to /e/, but moreover proposes that this series of steps be iterative.
- this estimation method is an iterative process that guarantees a certain determined level of quality.
- a bit mask may be used to ignore portions of the recalibrated image (and of the image to which the recalibrated image is compared) corresponding to moving elements of the filmed scene.
- the first estimated motion, corresponding to a global motion, is determined by applying an image bit mask.
- if, at step /e/, it is decided to repeat steps /a/ to /e/, then these steps are performed by applying an image bit mask updated according to the second estimated motion.
- this image bit mask can be initialized to 0 for the first iteration of the series of process steps, i.e. no image portion is initially masked. Then, at the end of each iteration of the series of steps, it is possible to update this image bit mask on the basis of the second estimated motion. Thus, at the next iteration, the updated image bit mask can be applied to obtain a reliable global motion estimate. Advantageously, therefore, it is intended to apply to the global motion estimation an image bit mask that is updated on the basis of the dense motion estimation. By proceeding in this way, it is possible to integrate, within the global motion estimation mechanism, information derived from the dense motion estimation mechanism. This combination of iterative mechanisms provides reliable results while remaining simple to apply.
- the bit mask can be obtained in the following way: a residual value of motion is calculated over all the points where the dense estimate is computed; the bit mask then indicates the value 1 at the points where the dense motion deviates from said calculated residual by more than a determined threshold value.
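That update rule can be sketched as follows, assuming the dense field is given as (U, V, N) triplets per point and the residual as a mean translation (both representations are illustrative assumptions):

```python
def update_mask(field, residual_uv, deviation_threshold):
    """Bit mask with value 1 (masked) at points where the dense motion
    deviates from the residual translation by more than the threshold."""
    ru, rv = residual_uv
    mask = []
    for u, v, _n in field:
        deviation = ((u - ru) ** 2 + (v - rv) ** 2) ** 0.5
        mask.append(1 if deviation > deviation_threshold else 0)
    return mask
```

Points flagged with 1 are then excluded from the next global motion estimation, so that moving objects no longer bias the global estimate.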
- when steps /a/ to /e/ are reiterated, they are applied to the same previous and next images. Thus, until the quality threshold level is reached, the series of steps is reiterated on the same images.
- in step /b/, the recalibrated image may advantageously be obtained by recalibrating the next image onto the previous image on the basis of the first estimated motion; and in step /e/, the updated image bit mask may be a mask to be applied to the previous image.
- the registration of one of the two images, next and previous, is done here in a chronological sense, since it is a matter of recalibrating a next image onto an image that precedes it.
- the bit mask image updated in this context can be applied directly to the previous image during the next iteration of the series of steps.
- when steps /a/ to /e/ are reiterated, at each reiteration they are applied by considering the next image of the previous iteration as the previous image, and by considering an image that follows said next image of the previous iteration as the next image.
- successive iterations of the series of steps are thus applied to a different pair of previous and next images. Indeed, it is planned here to apply the following iteration to the former next image, which is then considered as the previous image, and to an image that follows it in the captured image sequence, which is then considered as the next image.
- This embodiment is advantageously adapted for real-time applications that require rapid processing.
- in step /b/, the recalibrated image is obtained by recalibrating the previous image onto the next image on the basis of the inverse of the first estimated motion.
- the updated image bit mask may be a mask to be applied to the next image of the current iteration, which becomes the previous image in the next iteration. Thanks to this anti-chronological mechanism, implementing the method in a fast mode is simplified.
- a second aspect of the present invention provides an image processing device comprising means for implementing a motion estimation method according to the first aspect of the present invention.
- a third aspect of the present invention provides a computer program including instructions for implementing the method according to the first aspect of the present invention, when the program is executed by a processor.
- a fourth aspect of the present invention provides a recording medium on which the computer program according to the third aspect of the present invention is stored.
- Figure 1 illustrates the main steps of a motion estimation method according to an embodiment of the present invention
- Figure 2 illustrates an implementation of an estimation method according to an embodiment of the present invention
- Figure 3 illustrates another implementation of an estimation method according to an embodiment of the present invention
- Figure 4 illustrates a processing device comprising means adapted to implement a method according to an embodiment of the present invention.
- Figure 1 illustrates the main steps of the motion estimation method according to an embodiment of the present invention.
- This series of images comprises at least one previous image I_{n-1} and a next image I_n.
- a first estimated motion H is obtained by performing a global motion estimation from the previous image I_{n-1} to the next image I_n.
- This first estimated movement can be represented in the form of a homographic registration matrix H.
- a recalibrated image, I'_{n-1} or I'_n, is obtained on the basis of the first estimated motion, respectively from the previous image I_{n-1} or from the next image I_n.
- a second estimated motion is obtained by performing a dense motion estimation between the recalibrated image and the other one of the previous and next images.
- a residual value r of global motion is determined on the basis of an average performed over all the calculation points of the dense field (U, V) for which the reliability N is greater than a threshold value.
- at a step 105, if the residual value r is less than a threshold value S, the second estimated motion of the current iteration is provided at a step 106; otherwise, steps 101-105 are repeated.
- Figure 2 illustrates an implementation according to an embodiment of the present invention.
- the present invention implements an image bit mask M which cleverly avoids taking into account information relating to certain parts of the image.
- the bit mask image initially indicates the value zero relative to each part of the image.
- the image bit mask initially applied makes it possible to take into account the image as a whole at initialization.
- the image bit mask is updated here with each new iteration. It is therefore noted M (i).
- H denotes a matrix making it possible to model the global movement of the image.
- This matrix H can in particular represent a first and a second translation representing the global movement along a first and a second respective direction.
- This matrix H_{n-1,n} makes it possible to homographically recalibrate an image I_n onto an image I_{n-1}.
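For illustration, applying such a homographic matrix amounts to mapping each pixel coordinate through a 3×3 transform with a perspective division. This is a generic sketch of homographic warping, not the patent's registration code:

```python
def apply_homography(H, x, y):
    """Map (x, y) through a 3x3 homography H (row-major nested lists),
    with the usual division by the third homogeneous coordinate."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w
```

Recalibrating an image then consists in resampling it at the mapped coordinates; a pure translation corresponds to H with an identity upper-left block.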
- the following steps are iteratively applied to the same two images, captured successively, the first captured image being referenced I_{n-1} and the next captured image being referenced I_n.
- the iteration index i is initialized to the value 0. It is furthermore provided that M (0) is a null binary image mask, that is to say that no point of the image is masked during the first iteration of the process steps.
- at a step 202, the two images I_{n-1} and I_n, which succeed one another in the sequence of captured images, are selected.
- the mask M(i) is applied to the image I_{n-1}.
- an overall motion estimation from I_{n-1} to I_n is performed. This global motion estimation is represented via the homographic registration matrix H(i).
- at a step 205, the image I_n is recalibrated onto I_{n-1} on the basis of the registration matrix H(i).
- a recalibrated image I'_n is obtained.
- a dense motion estimation is made, i.e., for example, a value of the translation vector U and a value of the translation vector V are determined, as well as a reliability value N for these vector values, at each point of the image I_{n-1}.
- an overall residual motion value r is calculated and compared with a threshold value S. If the residual motion r is less than the threshold value S, the steps are not repeated and the last second estimated motion is provided at a step 208. Two other images of the captured image sequence can then be considered and the same method applied to them.
- otherwise, the image bit mask M(i) is updated on the basis of the second estimated motion at a step 209. Then, the iteration index is incremented by 1 at a step 210, and the previously described steps are performed again on the same images I_{n-1} and I_n.
- Figure 3 illustrates another implementation according to an embodiment of the present invention.
- it is here also proposed to implement a global motion estimation followed by a dense motion estimation in an interleaved and iterative manner.
- each iteration is applied to two different successive images. More precisely, at one iteration a processing is applied to a first and a second successive image; then, at the next iteration, this same processing is applied to this second image and to an image that follows this second image in the sequence of captured images.
- This method according to an embodiment of the present invention is applied to a captured image sequence I_i, for i an integer between 1 and N.
- the first iteration of the series of steps of the processing method is here applied to the images I_1 and I_2.
- the second iteration of the series of steps is applied to the images I_2 and I_3.
- the first image bit mask applied is a null mask.
- these two images are considered as I_{n-1} and I_n.
- an image bit mask M(n-1) is applied to the image I_{n-1}.
- the application of this mask makes it possible to take into consideration, within the image I_{n-1}, only the points which are of interest with regard to the motion estimation method.
- an overall motion estimation is made on the two images I_{n-1} and I_n, on the basis of which a homographic matrix H of the movement from the image I_{n-1} to I_n is determined.
- a global-movement homographic matrix to be applied to the image I_{n-1}, in order to recalibrate it onto the image I_n, is then determined.
- This matrix corresponds to 1/H, that is to say the inverse of the homographic matrix H.
- a dense motion estimation is made from the image I_n to the recalibrated image I'_{n-1}. From it are deduced, at each point of the image I_n, a value of the motion vector U_n, a value of the motion vector V_n, and an associated reliability value N_n, with respect to the image I_n. On the basis of these values, an overall residual motion value r_n is determined relative to the image I_n. Thus, in a step 306, this residual motion value r_n is compared with a quality threshold value S. If the residual value r_n is less than the value S, then the series of steps is not repeated and the estimation method outputs the values of the following quantities:
- at a step 308, an image bit mask M(n-1) is determined so as to be applicable to the previous image of the next iteration, this image corresponding to the image I_n of the previous iteration. Then, at a step 309, the index n is incremented by the value 1.
- the image bit mask M obtained before the next iteration of the series of steps is directly applicable to the image I_{n-1}, because it is determined on the basis of information relating to an anti-chronological direction in the sequence of captured images.
- the values of U, V and N provided by the dense motion estimation step are determined by a comparison from the image I_n to the image I_{n-1}, that is to say by considering the two images in an anti-chronological direction with respect to their order in the sequence of captured images.
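The sliding-window behaviour of this variant, where each iteration advances by one image and hands its mask to the next pair, can be sketched as follows (the `process_pair` callable is a hypothetical stand-in for the per-pair steps, not the patent's API):

```python
def streaming_estimation(images, process_pair):
    """Slide over consecutive pairs (I[n-1], I[n]); the mask produced while
    processing one pair is handed to the next pair, whose previous image is
    the current next image (Figure-3-style chaining)."""
    mask = None  # first iteration: null mask
    outputs = []
    for prev_img, next_img in zip(images, images[1:]):
        dense, mask = process_pair(prev_img, next_img, mask)
        outputs.append(dense)
    return outputs
```

Because each image pair is visited only once, this chaining suits real-time processing, at the cost of the mask always being one pair "late".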
- Figure 4 illustrates an image processing apparatus adapted to implement a method according to an embodiment of the present invention.
- such a device may advantageously correspond to an on-board electronic card.
- Such a device comprises:
- a first estimation unit 41 adapted to obtain a first estimated motion by performing a global motion estimation from the previous image I_{n-1} to the next image I_n;
- a registration unit 42 adapted to obtain a recalibrated image I'_{n-1} or I'_n, on the basis of the first estimated motion, from one of the previous and next images I_{n-1}, I_n;
- a second estimation unit 43 adapted to obtain a second estimated movement U, V, N by performing a dense motion estimation between the recalibrated image and the other one of the preceding and following images;
- a determination unit 44 adapted to determine a residual value of global motion r;
- a control unit 45 adapted, on the one hand, to trigger a sequence of steps by sequentially controlling the first estimation unit, the registration unit, the second estimation unit and the determination unit, and, on the other hand, to decide, if the residual value is less than a threshold value S, to provide the last second estimated motion U, V, N, and otherwise to trigger said sequence of steps again.
- the first estimation unit 41 can take into account an image bit mask M; and the determination unit 44 may be adapted to update the image bit mask according to the second estimated motion. If the sequence of steps is triggered again, the steps can be applied to the same previous and next images.
- the recalibrated image may be provided by the registration unit 42 by recalibrating the next image I_n onto the previous image I_{n-1} on the basis of the first estimated motion; and the determination unit 44 can provide the updated image bit mask as a mask to be applied to the previous image.
- when steps /a/ to /e/ are repeated, at each reiteration they are applied by considering the next image of the previous iteration as the previous image, and by considering an image that follows said next image of the previous iteration as the next image.
- the recalibrated image I'_n can be provided by the registration unit 42 by recalibrating the previous image I_{n-1} onto the next image I_n on the basis of the inverse of the first estimated motion; and the determination unit 44 may provide the updated image bit mask as a mask to be applied to the next image of the current iteration, which becomes the previous image of the next iteration.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1050014A FR2955007B1 (fr) | 2010-01-04 | 2010-01-04 | Estimation de mouvement global et dense |
PCT/FR2011/050010 WO2011080495A1 (fr) | 2010-01-04 | 2011-01-04 | Estimation de mouvement global et dense |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2522140A1 true EP2522140A1 (fr) | 2012-11-14 |
Family
ID=42735284
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP11704249A Ceased EP2522140A1 (fr) | 2010-01-04 | 2011-01-04 | Estimation de mouvement global et dense |
Country Status (6)
Country | Link |
---|---|
US (1) | US8873809B2 (fr) |
EP (1) | EP2522140A1 (fr) |
BR (1) | BR112012016403A2 (fr) |
FR (1) | FR2955007B1 (fr) |
RU (1) | RU2565515C2 (fr) |
WO (1) | WO2011080495A1 (fr) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR3018147B1 (fr) | 2014-03-03 | 2016-03-04 | Sagem Defense Securite | Debruitage video optimise pour systeme multicapteur heterogene |
CN110232361B (zh) * | 2019-06-18 | 2021-04-02 | 中国科学院合肥物质科学研究院 | 基于三维残差稠密网络的人体行为意图识别方法与系统 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6095976A (en) * | 1997-06-19 | 2000-08-01 | Medinol Ltd. | Method for enhancing an image derived from reflected ultrasound signals produced by an ultrasound transmitter and detector inserted in a bodily lumen |
WO2001003157A1 (fr) * | 1999-07-01 | 2001-01-11 | General Nanotechnology, Llc | Systeme et procede servant a inspecter et/ou modifier un objet |
RU2159958C1 (ru) * | 1999-09-21 | 2000-11-27 | Садыков Султан Садыкович | Устройство обработки цветных изображений |
US6809758B1 (en) * | 1999-12-29 | 2004-10-26 | Eastman Kodak Company | Automated stabilization method for digital image sequences |
JP4765194B2 (ja) * | 2001-05-10 | 2011-09-07 | ソニー株式会社 | 動画像符号化装置、動画像符号化方法、動画像符号化プログラム格納媒体及び動画像符号化プログラム |
DE10316208A1 (de) * | 2002-04-12 | 2003-11-20 | Samsung Electro Mech | Navigationssystem und Navigationsverfahren |
GB0229096D0 (en) * | 2002-12-13 | 2003-01-15 | Qinetiq Ltd | Image stabilisation system and method |
US7705884B2 (en) * | 2004-07-21 | 2010-04-27 | Zoran Corporation | Processing of video data to compensate for unintended camera motion between acquired image frames |
EP1755342A1 (fr) * | 2005-08-19 | 2007-02-21 | Thomson Licensing | Méthode et dispositif de calcul itératif de paramètres de mouvement pour une séquence d'images à partir de vecteurs de mouvement de blocs |
US8013895B2 (en) * | 2006-08-07 | 2011-09-06 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Optical motion sensing |
JP5040258B2 (ja) * | 2006-10-23 | 2012-10-03 | 株式会社日立製作所 | 映像監視装置、映像監視システムおよび画像処理方法 |
US8040382B2 (en) * | 2008-01-07 | 2011-10-18 | Dp Technologies, Inc. | Method and apparatus for improving photo image quality |
JP2009290827A (ja) * | 2008-06-02 | 2009-12-10 | Sony Corp | 画像処理装置および画像処理方法 |
EP2489180A1 (fr) * | 2009-10-14 | 2012-08-22 | CSR Technology Inc. | Procédé et appareil de stabilisation d'image |
US8488007B2 (en) * | 2010-01-19 | 2013-07-16 | Sony Corporation | Method to estimate segmented motion |
US8570344B2 (en) * | 2010-04-02 | 2013-10-29 | Qualcomm Incorporated | Augmented reality direction orientation mask |
-
2010
- 2010-01-04 FR FR1050014A patent/FR2955007B1/fr active Active
-
2011
- 2011-01-04 RU RU2012133468/08A patent/RU2565515C2/ru active
- 2011-01-04 US US13/519,944 patent/US8873809B2/en active Active
- 2011-01-04 WO PCT/FR2011/050010 patent/WO2011080495A1/fr active Application Filing
- 2011-01-04 BR BR112012016403-1A patent/BR112012016403A2/pt not_active IP Right Cessation
- 2011-01-04 EP EP11704249A patent/EP2522140A1/fr not_active Ceased
Non-Patent Citations (2)
Title |
---|
None * |
See also references of WO2011080495A1 * |
Also Published As
Publication number | Publication date |
---|---|
WO2011080495A1 (fr) | 2011-07-07 |
FR2955007B1 (fr) | 2012-02-17 |
FR2955007A1 (fr) | 2011-07-08 |
US20130142397A1 (en) | 2013-06-06 |
RU2012133468A (ru) | 2014-03-20 |
RU2565515C2 (ru) | 2015-10-20 |
BR112012016403A2 (pt) | 2018-07-31 |
US8873809B2 (en) | 2014-10-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2572319B1 (fr) | Procede et systeme pour fusionner des donnees issues de capteurs d'images et de capteurs de mouvement ou de position | |
FR2903200A1 (fr) | Stabilisation hybride d'images pour camera video | |
EP2149108A2 (fr) | Détection et estimation de déplacement d'un appareil de prise de vue | |
WO2013093378A1 (fr) | Procede d'estimation de flot optique a partir d'un capteur asynchrone de lumiere | |
FR2898757A1 (fr) | Procede et dispositif d'adaptation d'une frequence temporelle d'une sequence d'images video | |
EP1746486B1 (fr) | Procédé de détection de déplacement d'une entité pourvue d'un capteur d'images et dispositif pour le mettre en oeuvre | |
EP3272119A1 (fr) | Procede de reconstruction 3d d'une scene | |
FR2952743A3 (fr) | Procede d'estimation du mouvement d'un instrument d'observation a defilement survolant un corps celeste | |
EP1672585A1 (fr) | Procédé, dispositif et système de traitement d'images par estimation de mouvement | |
WO2019122703A1 (fr) | Procede de determination des bords saillants d'une cible sur une image | |
FR2891686A1 (fr) | Procede et dispositif de detection de transitions dans une sequence video, procede et dispositif de codage, produits programme d'ordinateur et moyens de stockage correspondants. | |
FR3027144A1 (fr) | Procede et dispositif de determination de mouvement entre des images video successives | |
EP2522140A1 (fr) | Estimation de mouvement global et dense | |
EP1998288A1 (fr) | Procédé de détermination du déplacement d'une entité pourvue d'un capteur de séquence d'images, programme d'ordinateur, module et souris optique associés | |
FR2969353A1 (fr) | Procede de realisation d'une image panoramique a partir d'une sequence video et appareil de mise en oeuvre. | |
WO2001043446A1 (fr) | Procede d'estimation de mouvement entre deux images avec gestion des retournements de mailles et procede de codage correspondant | |
WO2009087318A2 (fr) | Procede et dispositif de reconstruction du volume d'un objet a partir d'une sequence d'images de coupes dudit objet | |
EP2943935B1 (fr) | Estimation de mouvement d'une image | |
EP2375724B1 (fr) | Stabilisation d'images captées | |
EP3070505B1 (fr) | Procede et systeme de mise au point automatique d'une camera | |
FR3140181A1 (fr) | Procédé de commande d’un système mécatronique | |
CA3105372C (fr) | Traitement d'un bruit impulsionnel dans une sequence video | |
EP3701492B1 (fr) | Procede de restauration d'images | |
EP1746487A1 (fr) | Procede et dispositif de detection de deplacement d'une entite pourvue d'un capteur d'images | |
FR3143800A1 (fr) | Méthode d’apprentissage non supervisé d’un modèle d’estimation de flux optique |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20120628 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAX | Request for extension of the european patent (deleted) | ||
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: SAGEM DEFENSE SECURITE |
|
17Q | First examination report despatched |
Effective date: 20160511 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: SAFRAN ELECTRONICS & DEFENSE |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R003 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20190919 |