WO2020081018A1 - Video processing-based image stabilization method - Google Patents

Video processing-based image stabilization method

Info

Publication number
WO2020081018A1
Authority
WO
WIPO (PCT)
Prior art keywords
shift
calculating
corr
value
values
Prior art date
Application number
PCT/TR2018/050598
Other languages
French (fr)
Inventor
Cevahir ÇIĞLA
Original Assignee
Aselsan Elektroni̇k Sanayi̇ Ve Ti̇caret Anoni̇m Şi̇rketi̇
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aselsan Elektroni̇k Sanayi̇ Ve Ti̇caret Anoni̇m Şi̇rketi̇ filed Critical Aselsan Elektroni̇k Sanayi̇ Ve Ti̇caret Anoni̇m Şi̇rketi̇
Priority to PCT/TR2018/050598 priority Critical patent/WO2020081018A1/en
Publication of WO2020081018A1 publication Critical patent/WO2020081018A1/en

Classifications

    • G06T5/80
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/223Analysis of motion using block-matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/262Analysis of motion using transform domain methods, e.g. Fourier domain methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • H04N23/6811Motion detection based on the image signal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance


Abstract

The invention is a method which eliminates the horizontal and vertical motion that appears in live footage due to camera vibration; comprising the process steps of detecting sub blocks to be used for shift detection, detecting the shift values through an optimization process based on phase correlation between each pair of consecutive frames for the sub blocks, accumulating the detected shift values on the time-scale axis, and providing stabilization by applying the cumulative shifts to the recent frame in the reverse direction.

Description

Video Processing-Based Image Stabilization Method
Technical Field
The invention relates to a method which eliminates the horizontal and vertical motion that appears in live footage due to camera vibration by utilizing image processing methods.
Prior Art
Today, camera vibrations caused by external factors make live footage inconvenient to watch and hinder the use of the camera.
Some of the methods widely used in the prior art for preventing vibrations detect the vertical and horizontal shifting of the images by examining the correlation between sequential frames. In almost all of these methods, the shifts between consecutive frames are calculated over the entire image or by using all of the blocks. This approach works reasonably well for scenes that contain enough intensity modulation and texture; in other words, most of the image should consist of textured areas. However, considering the typical usage areas of surveillance cameras, this intensity modulation distribution is not always at the desired level. In such cases, shift values calculated over the entire image may not correspond to the real vibration values. Especially for scenes dominated by non-textured areas with similar color distribution, pixel vibration values calculated over the entire image will be misleading.
A similar problem arises with methods that use feature points. For scenes dominated by noise and non-textured areas, the matching of the extracted feature points becomes inconsistent. In particular, corner points that appear because of noise are prone to mismatching.
Projection (profile) based methods, which are widely used for eliminating vibration, are not an optimal choice either, especially when moving targets are large or numerous. Moving targets cause irregularities during the extraction of the profiles, which results in errors in the estimation of the camera vibration. Another type of approach, optical flow-based stabilization, has high computational complexity since it estimates a motion value for each pixel. Moreover, for images that do not contain enough texture variation, consistent optical flow values cannot be calculated and the resulting estimates are unreliable.
As a result, due to the problems described above and the insufficiency of the present solutions on the subject, an innovation is required in the related technical field.
The Purpose of the Invention
The invention has been devised with the present situation in mind and aims to solve the above-mentioned problems.
The main purpose of the invention is to eliminate the vibrations in the image by processing only the images obtained from the camera, without adding additional sensors to the camera assembly. The method is designed to run within the frame rate generated by the camera and not to cause any disruption in live video streaming. In this way, by eliminating camera vibrations arising from any kind of external factor, a more user-friendly video stream is provided that causes no problems during observation or further processing.
Another purpose of the invention is to reduce the false alarms that vibrations cause in automatic video processing algorithms (moving target detection, target tracking), in addition to enhancing the video watching experience. The method therefore makes a significant contribution to reducing false alarms received from cameras installed for security purposes.
In order to fulfill the above-mentioned purposes, the invention is a method which eliminates horizontal and vertical motion on live footage caused by camera vibration; comprising the process steps of detecting sub blocks to be used for shift detection, calculating the shift values through an optimization process based on phase correlation between each pair of consecutive frames for the sub blocks, accumulating the detected shift amounts on the time-scale axis, and providing stabilization by applying the cumulative shift values to the recent frame in the reverse direction. The structural and characteristic features and the advantages of the invention will be understood more clearly from the figures given below and the detailed description written with reference to these figures; the evaluation should therefore be made in consideration of these figures and the detailed description.
Figures Illustrating the Invention
Figure 1 is the view of the flow diagram of the method subject to the invention.
Figure 2 is the view of the approach in which the image is divided into sub blocks.
Figure 3 is the graphical view of the exemplary sharpness calculation on a single dimensional correlation.
Figure 4 is the view of the exemplary auto-correlation maps for the blocks which are not suitable for shift estimation and the ones that are.
Detailed Description of the Invention
In this detailed description, the preferred embodiments of the video processing-based image stabilization method subject to the invention are described in order to enable better understanding of the subject.
The image stabilization method proposed by the invention is basically composed of three stages. In the first stage, the sub blocks to be used for shift detection are determined. Since the intended application area is surveillance cameras, the amount of change in the video is below a certain level under normal conditions. Therefore, the sub block detection process is performed at certain intervals (every M frames) rather than on every frame, and the determined sub blocks are reused for the subsequent frames over a short period of time. In the second stage, the global pixel shifts are detected through an optimization process based on phase correlation between each pair of consecutive frames for the sub blocks. The estimated shifts are accumulated on the time-scale axis, and stabilization is provided by applying the cumulative shift values, calculated for the last frame with respect to the reference image, to the recent frame in the reverse direction. The basic stages of the method are shown in the flow diagram in Figure 1.
Determining Sub Blocks for Shift Detection
The two basic features that differentiate the invention from similar methods are that the phase correlation is performed over a specific number of reliable sub blocks which have passed a suitability test, instead of over the entire frame, and that vertical and horizontal intensity modulation (edge) values are used instead of the raw intensity values of the images. In this way, robust correlation values are obtained and the computational complexity is reduced.
For the detection of sub blocks, the frame is first divided into a specific number of blocks, depending on the image resolution, as shown in Figure 2. At this stage, the blocks may be made to overlap by a certain proportion. Throughout these processes, vertical and horizontal intensity modulation (gradient) values are used instead of the raw camera output intensity values, so that differences between neighbouring pixels are exploited. The suitability of each block for the phase correlation used in the shift calculation is determined by auto-correlation: by taking the phase correlation of each block with itself, a measure of the texture structure it contains is obtained.
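As a rough illustration of this blocking and gradient step, the following Python sketch divides a gradient (intensity modulation) map into overlapping sub blocks; the block size, the overlap ratio, and the helper names are illustrative assumptions rather than values fixed by the description.

```python
import numpy as np

def gradient_map(frame):
    """Combined vertical and horizontal intensity modulation via simple finite differences."""
    f = frame.astype(np.float32)
    gx = np.zeros_like(f)
    gy = np.zeros_like(f)
    gx[:, 1:-1] = f[:, 2:] - f[:, :-2]   # horizontal modulation
    gy[1:-1, :] = f[2:, :] - f[:-2, :]   # vertical modulation
    return np.abs(gx) + np.abs(gy)

def split_into_blocks(img, block=64, overlap=0.5):
    """Return (top, left, view) tuples for blocks overlapping by the given proportion."""
    step = max(1, int(block * (1.0 - overlap)))
    blocks = []
    for top in range(0, img.shape[0] - block + 1, step):
        for left in range(0, img.shape[1] - block + 1, step):
            blocks.append((top, left, img[top:top + block, left:left + block]))
    return blocks
```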
As shown in Equation 5, the phase correlation of a block (Bi) with itself is obtained by applying the Fourier transform (F) to the block and taking the inverse Fourier transform (F⁻¹) of its multiplication with its conjugate (F*). The auto-correlation map (ACi) obtained for each block is analyzed according to two criteria. The maximum value (max_corr) of the auto-correlation map is expected at the center, since no shifting is in question for the block. In this context, the first criterion is chosen as the ratio between the maximum value at the center and the maximum value (max_corr_exclude) outside a specific region surrounding the center. The typical representation of the central region and the region outside it is indicated by the limit formed by the dashed line in the correlation maps of Figure 3. The width of the central region may vary in proportion to the block dimension used; in general, values between 5% and 20% of the block width yield reasonable results.
In Equation 1 given below, the first criterion, denoted as the sharpness value (Pi), expresses how high the global maximum value is in comparison to the maximum value outside the specified central region.
Pi = (max_corr − min_corr) / (max_corr_exclude − min_corr)    (Equation 1)
max_corr = max(ACi)    (Equation 2)
min_corr = min(ACi)    (Equation 3)
max_corr_exclude = max(ACi outside the central region)    (Equation 4)
ACi = F⁻¹( F(Bi) · F*(Bi) )    (Equation 5)
AC_E,i = Σ ACi(x, y) over the central region    (Equation 6)
A high sharpness value (Pi) determined for a sub block shows that the block can be separated from its surroundings irrespective of different shift values, and that it is suitable for shift detection. Low sharpness values (Pi) indicate that the block is unreliable and can lead to false shift detection based on local maximum values. In Figure 3, the sharpness calculation is exemplified on a one-dimensional correlation signal; the same procedure can be extended to two dimensions for calculation over the image. By disregarding a specific width around the determined maximum value (max_corr), a second maximum value (max_corr_exclude) is found and the operations given in Equation 1 are performed. As shown in Figure 4, the phase correlation map of a block suitable for shift detection should be concentrated at the center.
The second criterion for the block is the total energy amount at the center of the phase correlation map (AC_E). This value should be high for a reliable block. The main reason for this is that high correlation indicates that there are enough texture changes within the block.
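A minimal sketch of the two suitability criteria, assuming NumPy; the 10% central width and the small epsilon are illustrative choices, and fftshift is used only so that the zero-shift peak sits at the map center.

```python
import numpy as np

def autocorrelation_map(block):
    """Correlation of a block with itself via FFT (Equation 5), peak shifted to the center."""
    F = np.fft.fft2(block)
    ac = np.fft.ifft2(F * np.conj(F)).real
    return np.fft.fftshift(ac)

def sharpness_and_center_energy(ac, center_frac=0.1):
    """Sharpness Pi (Equation 1) and central energy AC_E (Equation 6) of an auto-correlation map."""
    h, w = ac.shape
    cy, cx = h // 2, w // 2
    r = max(1, int(center_frac * w))                 # central half-width, 5-20% of block width
    max_corr = ac.max()
    min_corr = ac.min()
    masked = ac.copy()
    masked[cy - r:cy + r + 1, cx - r:cx + r + 1] = -np.inf   # exclude the central region
    max_corr_exclude = masked.max()
    sharpness = (max_corr - min_corr) / (max_corr_exclude - min_corr + 1e-12)
    center_energy = ac[cy - r:cy + r + 1, cx - r:cx + r + 1].sum()
    return sharpness, center_energy
```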
At this stage, in order to prevent errors that would result from moving objects in the scene, a moving target detection algorithm based on some form of background modelling should be run. As shown in Figure 1, the output of a moving target detection block is used during sub block detection. The sharpness values (Pi) and center energy values of the blocks containing possible targets reported by the motion detection are reset, and these blocks are disregarded. Accumulating the motion detection outputs at certain intervals makes it possible to cover the areas that are likely to feature motion and secures a more reliable elimination. Thereafter, the blocks are filtered according to the calculated sharpness values (Pi) and central energy values: from the top one-third of the blocks ranked by sharpness value (Pi), the N blocks whose phase correlations have the highest central energy are chosen. At this stage, blocks that have sufficient texture changes and can be separated from their surroundings are determined.
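The selection logic described in the preceding paragraph might look roughly like the sketch below; the per-block motion flags coming from the background-modelling detector, the top one-third rule, and the parameter names are assumptions based on the description.

```python
def select_sub_blocks(block_stats, motion_flags, n_blocks=8):
    """block_stats: list of (index, sharpness, center_energy); motion_flags: per-block booleans.

    Blocks flagged as containing possible moving targets are discarded, the remainder is
    limited to the top one-third by sharpness, and the N blocks with the highest central
    energy among those are kept for shift estimation.
    """
    candidates = [stats for stats, moving in zip(block_stats, motion_flags) if not moving]
    candidates.sort(key=lambda s: s[1], reverse=True)            # rank by sharpness Pi
    top_third = candidates[:max(1, len(candidates) // 3)]
    top_third.sort(key=lambda s: s[2], reverse=True)             # rank by central energy
    return [index for index, _, _ in top_third[:n_blocks]]
```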
Calculating the Shifts
The stage following the detection of the blocks suitable for shift calculation is to determine the correlation maps between two consecutive frames (t, t+1) over those blocks. In this context, the phase correlation (FK) is computed as below from the intensity modulation values of the two frames for each block region (index i in Equation 7). At this point, care must be taken that no movement is present within the sub blocks. Since sub block selection is performed only at certain intervals, a moving object appearing in any region can cause an alteration in the related sub blocks. In this case, in order to provide reliable shift detection, each sub block in which correlation will be performed is checked and only the blocks that do not contain any motion are used.
FKi = F⁻¹( F(Bi,t) · F*(Bi,t+1) ),  i ∈ {1, …, N}    (Equation 7)
The correlation maps calculated for each block are added and the cumulative correlation (Equation 8) is obtained. At this point, different combination approaches may also be used; for example, blocks yielding similar shift amounts can be grouped and the outputs obtained from incorrect sub blocks can be disregarded. Another alternative is to perform a weighted addition with weights (wi) obtained by calculating a correlation reliability coefficient for each block; the weights can be derived from the sub block sharpness ratio used in Figure 1 or by similar approaches. The maximum point of the correlation map (FK*) obtained by weighted or cumulative addition (Equation 8) provides the shift amount in the vertical and horizontal directions ({Δx, Δy}t), as given in Equation 9.
FK* = Σi=1..N wi · FKi    (Equation 8)
{Δx, Δy}t = argmax(x,y) FK*(x, y)    (Equation 9)
The shift values obtained in Equation 9 correspond to the displacement between two consecutive frames. Since image stabilization is performed with respect to a reference frame for global stabilization, these values are accumulated over n frames, and the amount of correction to be applied to the frame at that instant is calculated (Equation 10).
{Ax, Ay} = Σt=1..n {Δx, Δy}t    (Equation 10)
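A compact sketch of Equations 7-10, again assuming NumPy. Normalising the cross-power spectrum, as done here, is the conventional phase-correlation form and is an assumption; dropping the normalisation gives the plain product written in Equation 7.

```python
import numpy as np

def block_correlation(block_t, block_t1, normalize=True):
    """Correlation map FK_i between the same sub block in frames t and t+1 (Equation 7)."""
    cross = np.fft.fft2(block_t) * np.conj(np.fft.fft2(block_t1))
    if normalize:
        cross /= (np.abs(cross) + 1e-12)          # conventional phase-correlation normalisation
    return np.fft.fftshift(np.fft.ifft2(cross).real)

def estimate_frame_shift(block_pairs, weights=None):
    """Weighted or plain cumulative sum of the block maps (Equation 8) and its peak (Equation 9)."""
    maps = [block_correlation(bt, bt1) for bt, bt1 in block_pairs]
    if weights is None:
        weights = [1.0] * len(maps)
    fk = sum(w * m for w, m in zip(weights, maps))
    peak_y, peak_x = np.unravel_index(np.argmax(fk), fk.shape)
    cy, cx = fk.shape[0] // 2, fk.shape[1] // 2
    return peak_x - cx, peak_y - cy               # per-frame shift {dx, dy}

# Equation 10: the per-frame shifts are simply accumulated with respect to the reference frame,
# e.g.  cum_x += dx;  cum_y += dy  once for every new frame.
```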
At the last stage, the vertical and horizontal shift amounts determined for the current instant are applied to the image of that instant and stabilization is provided. In order to ensure that the system also works for moving cameras, the instantaneously measured shift values are filtered and the usual average movement (Āx, Āy) of the camera is detected. The camera vibration amount can then be determined by subtracting these average values from the cumulative values (Equation 11).
camera vibration = {Ax − Āx, Ay − Āy}    (Equation 11)
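A sketch of this last stage under stated assumptions: an exponential moving average stands in for the unspecified filtering of the shift trajectory, and np.roll stands in for whatever warp is actually used to apply the correction in the reverse direction.

```python
import numpy as np

class Stabilizer:
    """Accumulates per-frame shifts (Equation 10), separates the intended camera motion from
    the vibration (Equation 11), and applies the vibration to the frame in reverse."""

    def __init__(self, alpha=0.05):
        self.cum = np.zeros(2)     # {Ax, Ay}: cumulative shift since the reference frame
        self.avg = np.zeros(2)     # filtered (usual average) camera movement
        self.alpha = alpha         # smoothing constant, an illustrative choice

    def update(self, dx, dy):
        self.cum += (dx, dy)
        self.avg = (1.0 - self.alpha) * self.avg + self.alpha * self.cum
        return self.cum - self.avg                       # vibration component (Equation 11)

    def stabilize(self, frame, vibration):
        vx, vy = np.round(vibration).astype(int)
        # apply the vibration in the reverse direction (integer-pixel warp for simplicity)
        return np.roll(np.roll(frame, -vy, axis=0), -vx, axis=1)
```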
At this point, an alternative solution exists apart from accumulating the shift amounts. After the sub blocks are selected in a reference frame, the effective shift values for each new frame can be calculated directly against that reference frame. In this approach, since the shift values are calculated with respect to a specific reference, accumulation over time is not needed.
Two advantages are obtained by using a limited number of sub blocks. The first is speed: as camera resolutions increase with improving surveillance systems, the fast processing required by real-time systems at higher frame rates is provided; in comparison with a phase correlation calculated over the entire image, a speed increase of at least 20 times is obtained. The second advantage is that, by detecting sub blocks whose content is suitable for shift calculation, a reliable shift estimate for the image is obtained independently of the general texture distribution. Shift detection in areas without texture does not operate properly in any way; hence, any shift value calculated over the entire image will most likely be incorrect for scenes dominated by non-textured areas. Detecting suitable sub blocks provides a useful solution to this problem. Moreover, using moving objects as a criterion in sub block detection prevents the use of unsuitable sub blocks that contain moving objects and provides a scene-sensitive video stabilization.

Claims

1. A method which eliminates horizontal and vertical motion on live footage caused by camera vibration, characterized in that it comprises the process steps of;
• detecting the sub blocks (Bi) to be used for shift detection at certain intervals,
• detecting the shift amounts by means of the optimization process via phase correlation between each consecutive frame for the sub blocks (Bi),
• providing stabilization by applying the cumulative shift values, which are calculated on the last frame with respect to the reference image, to the image in the reverse direction, by accumulating the detected shift amounts on the time-scale axis.
2. Method according to Claim 1, characterized in that sub block (Bi) detection comprises the process steps of,
• determining the phase correlation map (ACi) by carrying out the Fourier transform (F(Bi)) of each sub block (Bi), and the inverse Fourier transform (F⁻¹) of its multiplication with its conjugate (F*(Bi)),
• calculating the high value (max_corr) and the low value (min_corr) of the phase correlation map (ACi),
• calculating the ratio of the phase correlation map (ACi) to the phase correlation central value (ACi,center) and calculating the maximum value (max_corr_exclude) excluded from a specific region surrounding the center,
• calculating the sharpness value (Pi) as the ratio of the difference between the high value (max_corr) and the low value (min_corr) to the difference between the maximum value (max_corr_exclude) excluded from a specific region surrounding the center and the low value (min_corr),
• calculating the sharpness value (Pi) suitable for the shift detection process.
3. Method according to Claim 1, characterized in that the shift amount detection comprises the process steps of,
• calculating the phase correlation (FKi) by the inverse Fourier transform (F⁻¹) of the multiplication of the Fourier transform (F(Bi,t)) of the sub block (Bi) at frame t with the conjugate (F*(Bi,t+1)) of the Fourier transform of the sub block (Bi) at frame t+1, where t and t+1 are two consecutive frames,
• calculating the cumulative correlation by adding the correlation maps (FKi) calculated for each sub block (Bi),
• determining the maximum point of the correlation map (FK*) in order to detect the shift amount ({Δx, Δy}t) in the vertical and horizontal directions of the frame,
• providing the cumulative sum {Ax, Ay} of the shift amounts ({Δx, Δy}t) for n number of frames,
• calculating the usual average values (Āx, Āy) of the camera by filtering the instantaneously measured shift values for moving cameras,
• calculating the camera vibration amount by subtracting the average values (Āx, Āy) from the cumulative values {Ax, Ay}.
PCT/TR2018/050598 2018-10-17 2018-10-17 Video processing-based image stabilization method WO2020081018A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/TR2018/050598 WO2020081018A1 (en) 2018-10-17 2018-10-17 Video processing-based image stabilization method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/TR2018/050598 WO2020081018A1 (en) 2018-10-17 2018-10-17 Video processing-based image stabilization method

Publications (1)

Publication Number Publication Date
WO2020081018A1 true WO2020081018A1 (en) 2020-04-23

Family

ID=64308789

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2018/050598 WO2020081018A1 (en) 2018-10-17 2018-10-17 Video processing-based image stabilization method

Country Status (1)

Country Link
WO (1) WO2020081018A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0659022A2 (en) * 1993-12-18 1995-06-21 Kodak Limited Detection of global translations between images
EP2447913A1 (en) * 2010-10-26 2012-05-02 Sony Corporation Phase correlation motion estimation
WO2017031270A1 (en) * 2015-08-19 2017-02-23 Optimum Semiconductor Technologies, Inc. Video image alignment for video stabilization
WO2017200395A1 (en) * 2016-05-18 2017-11-23 Auckland Uniservices Limited Image registration method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ERTUERK S: "DIGITAL IMAGE STABILIZATION WITH SUB-IMAGE PHASE CORRELATION BASED GLOBAL MOTION ESTIMATION", IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, IEEE SERVICE CENTER, NEW YORK, NY, US, vol. 49, no. 4, 1 November 2003 (2003-11-01), pages 1320 - 1325, XP001201282, ISSN: 0098-3063, DOI: 10.1109/TCE.2003.1261235 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18803486

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18803486

Country of ref document: EP

Kind code of ref document: A1