WO2010050632A1 - Image stabilization method and device for performing the method - Google Patents

Image stabilization method and device for performing the method

Info

Publication number
WO2010050632A1
WO2010050632A1 (PCT/KR2008/006368; KR2008006368W)
Authority
WO
WIPO (PCT)
Prior art keywords
target frame
region
image stabilization
sads
stabilization method
Prior art date
Application number
PCT/KR2008/006368
Other languages
French (fr)
Inventor
Etienne Eccles
Geoff Thiel
Ben White
Original Assignee
Udp Co., Ltd.
Vca Technology Ltd
Priority date
Filing date
Publication date
Application filed by Udp Co., Ltd., Vca Technology Ltd filed Critical Udp Co., Ltd.
Priority to PCT/KR2008/006368 priority Critical patent/WO2010050632A1/en
Priority to KR1020080106931A priority patent/KR100895385B1/en
Publication of WO2010050632A1 publication Critical patent/WO2010050632A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681 Motion detection
    • H04N23/6811 Motion detection based on the image signal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682 Vibration or motion blur correction
    • H04N23/683 Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/14 Picture signal circuitry for video frequency region
    • H04N5/144 Movement detection
    • H04N5/145 Movement estimation

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

An image stabilization method is provided. The method includes: providing a target frame; estimating the displacement between a reference template and the target frame by using a gradient-based search algorithm to find the location of the template in the target frame that minimizes the Sum of Absolute Differences (SAD) between the template and the target frame; refining this estimate by fitting a quadratic curve to the minimum SAD value and its adjacent values; rejecting image distractions and ambiguities by combining the SAD images of numerous templates spaced throughout the image frame; and correcting interlace tearing in interlaced video caused by camera shake by stabilizing each interlace field consecutively.

Description

Description
IMAGE STABILIZATION METHOD AND DEVICE FOR PERFORMING THE METHOD
Technical Field
[1] The present invention relates to a technology that corrects undesired motion occurring in a video sequence due to camera shake and the like.
Background Art
[2] Motion may occur in photographed video frames due to camera shake. Such motion must be measured precisely and corrected, and cost-effective technologies are needed to estimate it.
[3] A predetermined template (a sub-portion of a prior reference frame) is used to estimate motion of a video frame. That is, motion of a video frame may be estimated by finding the location of the predetermined template in the video frame. In this instance, the Sum of Absolute Differences (SAD) between the predetermined template and the video frame may be used as the comparison metric.
[4] Specifically, an image corresponding to SADs between a predetermined template and different locations in a video frame may be retrieved. Motion of the video frame may be estimated based on the location in the video frame where the SAD is minimized.
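To make the comparison metric concrete, the following minimal Python sketch computes the SAD between a template and one candidate location in a frame. The function name, the NumPy array layout, and the absence of bounds checking are illustrative assumptions, not part of the patent text.

    import numpy as np

    def sad(frame: np.ndarray, template: np.ndarray, top: int, left: int) -> int:
        # SAD between the template and the equally sized patch of the frame
        # whose top-left corner is at (top, left); int32 avoids uint8 wrap-around.
        h, w = template.shape
        patch = frame[top:top + h, left:left + w].astype(np.int32)
        return int(np.abs(patch - template.astype(np.int32)).sum())
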
[5] However, it is inefficient to calculate SADs between a predetermined template and the video frame at every possible location. That is, a computation cost of order N x N is required to calculate SADs between a predetermined template and a video frame at every location. Here, N denotes a maximum search distance.
[6] Accordingly, a technology to estimate motion of a video frame more efficiently is required.
Disclosure of Invention
Technical Problem
[7] The present invention provides an image stabilization method and apparatus which calculates Sum of Absolute Differences (SADs) between a predetermined template and a portion of the video frame, determined according to a particular rule, without calculating SADs between the predetermined template and every possible location in the target frame, and thereby may estimate motion of the target frame at a low cost.
[8] The present invention also provides an image stabilization method and apparatus which finds the location in the target frame where the SAD is minimized using a gradient-based search algorithm with a computation cost of order N, where N is the maximum search distance, and thereby may estimate motion of the target frame more efficiently.
[9] The present invention also provides an image stabilization method and apparatus which refines an estimated displacement of a target frame using a quadratic curve, and thereby may estimate motion of the target frame more precisely.
[10] The present invention also provides an image stabilization method and apparatus which accumulatively updates a template using a current frame, and thereby may reduce the effect of camera noise in the template and adverse effects on a frame.
Technical Solution
[11] According to an aspect of the present invention, there is provided an image stabilization method, including: providing a target frame; and finding the location in the target frame where the SAD between a predetermined template and the target frame is minimized, using a gradient-based search algorithm.
[12] According to another aspect of the present invention, there is provided an image stabilization apparatus, including: a frame providing unit to provide a target frame; a calculation unit to calculate SADs between a predetermined template and locations in the target frame according to a gradient-based search algorithm; and an estimation unit to estimate a displacement of the target frame based on the location where the SAD between the template and the target frame is minimized.
Brief Description of the Drawings
[13] FIG. 1 illustrates an original frame and a frame with motion due to a camera shake;
[14] FIG. 2 is a flowchart illustrating an image stabilization method according to an embodiment of the present invention;
[15] FIG. 3 illustrates images of a target frame, a template, and Sum of Absolute
Difference (SAD) images corresponding to SADs;
[16] FIG. 4 illustrates a SAD image while a gradient based search algorithm is executed according to an embodiment of the present invention;
[17] FIG. 5 is a flowchart illustrating an example of the execution of operation S220 of
FIG. 2;
[18] FIG. 6 illustrates an example of a blank area generated due to a shifted frame; and
[19] FIG. 7 illustrates an example of a blank area processed using a fading effect according to an embodiment of the present invention.
Mode for the Invention
[20] Hereinafter, embodiments of the present invention are described in detail by referring to the figures.
[21] FIG. 1 illustrates an original frame 110 and a frame with motion due to a camera shake.
[22] Referring to FIG. 1, an object 111 is located in the center of the original frame 110. When the camera shakes, the corresponding object 121 appears displaced from the center 122 of the shaken frame 120. In this instance, the motion of the shaken frame 120 is represented as 'd'.
[23] 'd' must be accurately estimated to correct the shaken frame 120. A variety of methods for estimating 'd' exist.
[24] Calculating Sum of Absolute Differences (SADs) between a predetermined template and the shaken frame 120 at every location may be the simplest method. In this instance, the location in the shaken frame 120 where the SAD between the template and that location is minimized may be used to estimate 'd'. However, SADs with respect to all possible values of 'd', in both vertical and horizontal displacement (up to a predetermined maximum displacement), must be calculated, which is disadvantageous.
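A minimal sketch of this exhaustive search, reusing the sad helper from the earlier sketch, might look as follows; the coordinate convention (template anchored by its top-left corner, displacement relative to a nominal position) and the omitted bounds handling are assumptions.

    def full_search(frame, template, top0, left0, max_dist):
        # Evaluate every displacement 'd' up to max_dist in both directions:
        # (2 * max_dist + 1)^2 SAD evaluations, i.e. a cost of order N x N.
        best_d, best_sad = (0, 0), float("inf")
        for dy in range(-max_dist, max_dist + 1):
            for dx in range(-max_dist, max_dist + 1):
                s = sad(frame, template, top0 + dy, left0 + dx)
                if s < best_sad:
                    best_sad, best_d = s, (dy, dx)
        return best_d, best_sad
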
[25] The gradient-based search algorithm according to an embodiment of the present invention may estimate 'd' without calculating the SADs between the template and the shaken frame 120 at every possible displacement, as will be described in detail below.
[26] FIG. 2 is a flowchart illustrating an image stabilization method according to an embodiment of the present invention.
[27] In operation S210, a target frame is provided to an image stabilization apparatus according to an embodiment of the present invention.
[28] In operation S220, the image stabilization method executes a search algorithm to estimate motion of the target frame due to a camera shake. The search algorithm is described in detail with reference to FIG. 3 and FIG. 4.
[29] FIG. 3 illustrates a target frame, a template, and a SAD image corresponding to
SADs calculated at different displacements.
[30] FIG. 3 shows a target frame 310 and a template 330.
[31] SADs between the target frame 310 and the template 330 are calculated at locations
311, 312, and 313. The calculated SADs are mapped to elements of a SAD image 320. For example, the SAD between the target frame 310 and the template 330 located at position 311 is mapped to an element A of the SAD image 320, the SAD with the template 330 located at position 312 is mapped to an element B, and the SAD with the template 330 located at position 313 is mapped to an element C.
[32] Calculating SADs between the template 330 and the target frame 310 at every position requires a large amount of computation to estimate a displacement of the target frame 310. According to an embodiment of the present invention, the search algorithm calculates SADs between the template 330 and the target frame 310 at a much smaller subset of the possible displacements, and thereby may estimate the displacement of the target frame 310, which is described in detail with reference to FIG. 4.
[33] FIG. 4 illustrates a SAD image while a search algorithm is executed according to an embodiment of the present invention.
[34] Referring to FIG. 4, SADs between a template and a target frame 410 at different positions in the target frame 410 are mapped to a SAD image 420.
[35] It is assumed that the SAD is calculated at positions having pixel a, pixel b, pixel c, pixel d, pixel e, pixel f, pixel g, pixel h, and pixel i as the central position of the template. In this instance, pixel e is the start position of the algorithm and is the position where the template was estimated to be in the previous frame, and the region comprising pixel a through pixel i forms a first search region 411 with a size of 3 x 3. According to an embodiment of the present invention, the search algorithm calculates the SADs between the template and the target frame at these positions, where each of pixel a through pixel i is the location of the central pixel of the template. The SAD between the template and the target frame at the position having pixel a in the center of the template is mapped to an element A of the SAD image 420. Similarly, the SADs at the positions having pixel b, pixel c, pixel d, pixel e, pixel f, pixel g, pixel h, and pixel i in the center of the template are mapped to elements B, C, D, E, F, G, H, and I of the SAD image 420, respectively.
[36] In this instance, the search algorithm detects a minimum value from among the nine calculated SADs mapped to a region 421. When the SAD mapped to coordinates (2, 2) is the minimum value from among the nine calculated SADs mapped to the region 421, the search algorithm is terminated. In this instance, the coordinates (2, 2) correspond to the center of the region 421. Also, a displacement of the target frame 410 is estimated based on the location of the template corresponding to the SAD mapped to (2, 2) in the SAD image 420, that is, the location of the pixel e at the center of the region 421.
[37] In FIG. 4, however, it is assumed that the SAD mapped to (3, 3) of the SAD image 420 is the minimum value of the nine calculated SADs mapped to the region 421. In this instance, the search algorithm sets a second region adjacent to the first region 411 of the target frame 410. The size of the second region is 3 x 3, and the pixel i is located in the center of the second region.
[38] Also, the search algorithm calculates SADs between the template and the target frame at positions having the pixels of the second region in the center of the template. The second region includes the pixel e, pixel f, pixel h, and pixel i. The SADs at positions having the pixel e, pixel f, pixel h, and pixel i in the center of the template have already been calculated. Accordingly, the search algorithm needs to calculate SADs only at positions having the remaining pixels of the second region, excluding the pixel e, pixel f, pixel h, and pixel i, in the center of the template.
[39] Accordingly, the search algorithm further calculates SADs between the template and the target frame at the locations of the remaining five pixels of the second region, and thereby maps the nine SADs of the second region to a region 422 of the SAD image 420. Also, the search algorithm detects a minimum value of the nine SADs mapped to the region 422. Here, it is assumed that the SAD mapped to (4, 4) of the region 422 is the minimum value.
[40] Similarly, the search algorithm calculates nine SADs mapped to a region 423 from images of the target frame 410. When a SAD mapped to (5, 5) from among the nine SADs mapped to the region 423 is a minimum value, the search algorithm calculates nine SADs mapped to a region 424. Similarly, the search algorithm sequentially calculates SADs mapped to each of a region 425 and region 426.
[41] In this instance, when a SAD mapped to (7, 6) is a minimum value from among the calculated SADs mapped to the region 426, the search algorithm is terminated. In this instance, (7, 6) may correspond to a center of the region 426. Also, a displacement of the target frame 410 is estimated based on a location of the template corresponding to the SAD mapped to (7, 6).
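A minimal sketch of the gradient-based search walked through above, again reusing the sad helper and caching SADs so that positions shared between overlapping 3 x 3 regions are not recomputed, might look as follows; the names, the step limit, and the omitted bounds handling are assumptions.

    def gradient_search(frame, template, start_top, start_left, max_steps=100):
        cache = {}

        def sad_at(top, left):
            if (top, left) not in cache:
                cache[(top, left)] = sad(frame, template, top, left)
            return cache[(top, left)]

        center = (start_top, start_left)  # estimated template position from the previous frame
        for _ in range(max_steps):
            region = [(center[0] + dy, center[1] + dx)
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            best = min(region, key=lambda p: sad_at(*p))
            if best == center:        # minimum sits in the center of the region: terminate
                return center, cache[center]
            center = best             # otherwise re-center the 3 x 3 region on the minimum
        return center, cache[center]
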
[42] Also, the image stabilization method may refine the estimated displacement of the target frame 410 for high-quality stabilization. That is, once the location of the template corresponding to the minimum SAD value is ascertained, the image stabilization method may estimate the displacement of the target frame 410 to within a fraction of a pixel.
[43] It is assumed that the SAD between the template and the target frame is a minimum at a particular pixel location. In this instance, a quadratic curve may be defined based on the minimum value corresponding to the particular pixel and the SADs corresponding to at least two pixels of the target frame 410 adjacent to the particular pixel. That is, at least two elements adjacent to the element where the minimum value is mapped exist in the SAD image 420. Accordingly, the quadratic curve may be defined based on the minimum value and the SADs corresponding to the at least two elements. Since adjacent elements exist above and below the element with the minimum value as well as to its left and right, two quadratic curves may be defined, one for each axis.
[44] The image stabilization method according to an embodiment of the present invention may refine the displacement of the target frame 410 using a minimum value of each of the two quadratic curves.
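One common way to realize such a quadratic refinement is to fit a parabola through the minimum SAD and its two neighbours along each axis and take the vertex offset; the sketch below is an assumed implementation of that idea, not text from the patent.

    def parabolic_offset(s_minus, s_center, s_plus):
        # Vertex of the parabola through samples at offsets -1, 0 and +1; when
        # s_center is a strict minimum the result lies in (-0.5, 0.5).
        denom = s_minus - 2.0 * s_center + s_plus
        if denom == 0.0:
            return 0.0
        return 0.5 * (s_minus - s_plus) / denom

    def refine(sads, row, col):
        # Sub-pixel refinement of the integer minimum at (row, col) of a SAD image,
        # using one parabola per axis as described above.
        dy = parabolic_offset(sads[row - 1][col], sads[row][col], sads[row + 1][col])
        dx = parabolic_offset(sads[row][col - 1], sads[row][col], sads[row][col + 1])
        return row + dy, col + dx
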
[45] Referring again to FIG. 2, in operation S230, the image stabilization method updates the template using a section of the target frame. The section of the target frame is matched with a template retrieved through the search algorithm.
[46] The template should be updated to reflect changes in lighting conditions or other long-term changes. In this instance, the image stabilization method may update the template using an exponential decay. For example, the template may be given by,
[47] [Equation 1]
[48] Template(t) = (1 - α) * Template(t - 1) + α * Frame(t)
[49] Here, Template(t) denotes the pixel values of the template at time t, and Template(t - 1) denotes the pixel values of the template at time t - 1. Frame(t) denotes the pixel values of the section of the frame retrieved at time t through the search algorithm, and α is an adjustable value. For example, α may be approximately 5% at 25 frames per second (fps).
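A one-line NumPy sketch of Equation 1 follows; α = 0.05 mirrors the approximately 5% figure at 25 fps, and the float conversion is an assumption to avoid integer rounding.

    import numpy as np

    def update_template(template, matched_section, alpha=0.05):
        # Exponential-decay update: Template(t) = (1 - α) * Template(t - 1) + α * Frame(t)
        return (1.0 - alpha) * template.astype(np.float32) + alpha * matched_section.astype(np.float32)
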
[50] In operation S240, the image stabilization method estimates a motion vector of the target frame based on the refined displacement of the target frame. In operation S250, the image stabilization method corrects (or shifts) the target frame using the estimated motion vector.
[51] Although it is not illustrated in FIG. 2, according to an embodiment of the present invention, the image stabilization method may overcome motion of a target frame due to camera shake in an interlaced video. In interlaced video, a single frame consists of two fields whose scan lines alternate.
[52] When two fields are combined into a single frame in the interlaced video, a difference between the two fields due to motion may cause a tearing effect. According to an embodiment of the present invention, each of the fields is treated as a separate frame of half the vertical resolution of the original video, with a half-pixel vertical shift between top and bottom fields. These two fields are then processed by the algorithm as consecutive frames, taking the half-pixel vertical offset into account. The fields are then put back together to produce the original frame with the interlace tearing corrected.
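A minimal sketch of this field-wise handling is given below; stabilize_field stands in for the per-frame algorithm described above and is therefore an assumed callable, as is the vertical_offset parameter.

    import numpy as np

    def stabilize_interlaced(frame, stabilize_field):
        top = frame[0::2, :]        # even scan lines (top field)
        bottom = frame[1::2, :]     # odd scan lines (bottom field), half a pixel lower
        # Process the two fields as consecutive frames, passing the half-pixel
        # vertical offset so the estimator can account for it.
        top_out = stabilize_field(top, vertical_offset=0.0)
        bottom_out = stabilize_field(bottom, vertical_offset=0.5)
        out = np.empty_like(frame)
        out[0::2, :] = top_out
        out[1::2, :] = bottom_out
        return out
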
[53] FIG. 5 is a flowchart illustrating an example of the execution of operation S220 of
FIG. 2.
[54] Referring to FIG. 5, an image stabilization method according to an embodiment of the present invention sets the first region to start the search algorithm.
[55] Also, according to an embodiment of the present invention, SADs between the target frame and a predetermined template positioned on the pixels of the first region are calculated. In this instance, a minimum value is detected from the calculated SADs. When the minimum SAD is located in the center of the first region, the displacement of the target frame may be precisely estimated within the pixel using a quadratic curve. Conversely, when the minimum SAD is not located in the center of the first region, a second region adjacent to the first region is set, and the search algorithm is executed again.
[56] FIG. 6 illustrates an example of a blank area generated due to a shifted frame.
[57] Referring to FIG. 6, a shifted frame 610 is shifted from an original location to compensate for motion, and thus a blank area 620 may be generated in a few portions of an edge of the shifted frame 610.
[58] When a displacement of a frame is estimated and refined, the frame is shifted by the refined displacement to correct the motion. In this instance, a blank area is generated. The blank area is not filled with any information due to the motion of the frame.
[59] It is necessary to determine how to fill the blank area. According to an embodiment of the present invention, the blank area may be filled with information from old frames. In particular, a fading effect that fills the blank area based on an age of the old frames may be used according to an embodiment of the present invention, which is described in detail with reference to FIG. 7.
[60] FIG. 7 illustrates an example of a blank area processed using a fading effect according to an embodiment of the present invention.
[61] Referring to FIG. 7, the blank area may be filled with information from old frames using the fading effect.
[62] As the old frames age, they become less similar to the current frame. Accordingly, the blank area may be filled with the information from the old frames according to an exponential decay function based on the age of the old frames to obtain the fading effect.
[63] In FIG. 7, a blank area at the edge of a shifted frame 710 may be filled with information from old frames 720, 730, 740, and 750. In this instance, since the old frame 750 is the oldest, information from the old frame 750 is faded to high-density black. Conversely, since the old frame 720 is the youngest, information from the old frame 720 is faded only to low-density black.
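A minimal sketch of such a fading fill keeps an exponentially decaying background image and uses it wherever the shifted frame left blank pixels; the decay constant, the boolean mask convention, and the function name are assumptions.

    import numpy as np

    def fill_blank(shifted, blank_mask, background, decay=0.9):
        # Older background content is attenuated toward black each frame; blank
        # pixels are filled from it, non-blank pixels come from the shifted frame.
        faded = background.astype(np.float32) * decay
        filled = np.where(blank_mask, faded, shifted.astype(np.float32))
        return filled.astype(shifted.dtype), filled  # 'filled' also serves as the next background
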
[64] In an image stabilization system, separating a constantly changing region, such as rippling water or moving objects, from the static background is significant. Here, a viewer perceives the static background as the reference frame. In this instance, the displacement of the frame due to camera shake and the like is to be estimated for the static background whilst ignoring displacements due to the moving objects or other distractions.
[65] According to an embodiment of the present invention, a plurality of templates evenly spaced in a target frame may be used. Also, inferior templates, namely templates that are not well matched to any location in the target frame and templates with displacements due to the moving objects, may be excluded from the plurality of templates. In this instance, the image stabilization system according to an embodiment of the present invention may ascertain templates to be excluded based on the minimum SAD value found by the search algorithm. A template which does not match well to any location will exhibit a minimum SAD value which is not sufficiently small.
[66] A template that does not include detailed features that can be found in the target frame, for example, a white wall or blue sky, will match well to many locations in the target frame and is therefore not a reliable measure of the true displacement; such a template is also to be excluded. According to an embodiment of the present invention, the image stabilization system may ascertain the uniqueness of a match using the distribution of SADs corresponding to the elements of a SAD image. Specifically, the image stabilization system may ascertain the uniqueness of a template match using an average gradient of the SADs corresponding to the elements of the SAD image.
[67] Rather than excluding templates by evaluating every SAD separately, generating a global SAD image may be more efficient. SAD images corresponding to all the templates are combined to generate the global SAD image. Specifically, according to an embodiment of the present invention, all SAD images are multiplied together to generate the global SAD image, and a displacement of a frame may be estimated based on the global SAD image. This process automatically excludes templates which are poor matches or are not unique matches. The aforementioned methods of determining the goodness and uniqueness of a match are then applied to the global SAD image to determine whether the estimated displacement is valid.
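A minimal sketch of the global SAD image combination, assuming the per-template SAD images share a common displacement grid (the names and the element-wise product are illustrative assumptions):

    import numpy as np

    def global_displacement(sad_images):
        # Multiply the per-template SAD images element-wise; templates that match
        # poorly or match everywhere contribute little to the location of the
        # global minimum, from which the frame displacement is read.
        combined = np.ones_like(sad_images[0], dtype=np.float64)
        for img in sad_images:
            combined = combined * img
        dy, dx = np.unravel_index(np.argmin(combined), combined.shape)
        return (int(dy), int(dx)), combined
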
[68] The above-described embodiment of the present invention may be recorded in computer-readable media including program instructions to implement various operations embodied by a computer. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. The media and program instructions may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well-known and available to those having skill in the computer software arts. Examples of computer- readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks and DVD; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described embodiments of the present invention.
[69] Although it is not illustrated in FIG. 1 through FIG. 7, an image stabilization apparatus according to an embodiment of the present invention may include a frame providing unit, a calculation unit, a determination unit, and an estimation unit. The frame providing unit provides a target frame, and the calculation unit calculates SADs between a predetermined template and the target frame at positions in a first region. In this instance, the first region has a predetermined size. The determination unit determines whether the minimum SAD is located in the center of the first region. The estimation unit estimates a displacement of the target frame based on the location of the minimum SAD value. Also, the calculation unit sets a second region adjacent to the first region when the minimum value of the calculated SADs is not located in the center of the first region.
[70] The image stabilization apparatus may further include a correction unit which corrects the target frame based on the refined displacement of the target frame.
[71] According to an embodiment of the present invention, the image stabilization method and apparatus calculate Sum of Absolute Differences (SADs) between a predetermined template and locations in the target frame determined according to a particular rule, without calculating SADs between the predetermined template and the target frame at every possible location within the maximum search area, and thereby may estimate motion of the target frame at a low cost.
[72] Also, according to an embodiment of the present invention, the image stabilization method and apparatus determines whether to continue to execute a search algorithm depending on whether the minimum value of SADs is located in a center of the first region, and thereby may estimate motion of the target frame more efficiently.
[73] Also, according to an embodiment of the present invention, the image stabilization method and apparatus refines an estimated displacement of a target frame using a quadratic curve, and thereby may estimate motion of the target frame more precisely.
[74] Also, according to an embodiment of the present invention, the image stabilization method and apparatus accumulatively updates a template using a current frame, and thereby may reduce an effect of camera noise in the template and an adverse effect on a frame.
[75] Although a few embodiments of the present invention have been shown and described, the present invention is not limited to the described embodiments. Instead, it would be appreciated by those skilled in the art that changes may be made to these embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims

Claims
[1] An image stabilization method, comprising: providing a target frame; calculating Sum of Absolute Differences (SADs) between a predetermined template and the target frame at predetermined positions in a first region; and determining whether a minimum value of the calculated SADs is located in the center of the first region.
[2] The image stabilization method of claim 1, further comprising: estimating a displacement of the target frame based on a location of the minimum value of the calculated SADs in the first region.
[3] The image stabilization method of claim 2, wherein the estimating comprises: defining a quadratic curve based on the minimum value of the calculated SADs and two adjacent values of the minimum value of the calculated SADs; and refining the estimated displacement of the target frame using the defined quadratic curve.
[4] The image stabilization method of claim 3, further comprising: correcting the target frame based on the refined displacement of the target frame.
[5] The image stabilization method of claim 4, further comprising: updating the predetermined template using the target frame.
[6] The image stabilization method of claim 5, wherein the correcting calculates a motion vector of the target frame based on the refined displacement of the target frame, and corrects the target frame using the calculated motion vector.
[7] The image stabilization method of claim 1, further comprising: setting a second region adjacent to the first region, when the minimum value of the SADs is not located in the center of the first region.
[8] The image stabilization method of claim 7, wherein the setting of the second region sets the second region to be centered on the minimum value of the calculated SADs of the first region.
[9] The image stabilization method of claim 7, further comprising: calculating SADs between the predetermined template and the target frame at positions in the second region.
[10] The image stabilization method of claim 9, further comprising: determining whether a minimum value of the SADs in the second region is located in a center of the second region.
[11] The image stabilization method of claim 9, wherein the calculating SADs between the predetermined template and the target frame at positions in the second region calculates the SADs between the predetermined template and the target frame at the positions in the second region, excluding the positions in the first region.
[12] The image stabilization method of claim 5, wherein the updating updates the predetermined template using a section of the target frame, the section matching the predetermined template.
[13] The image stabilization method of claim 4, further comprising: filling a blank area generated in the corrected target frame using information from old frames, the information from the old frames being provided with a fading effect based on an age of the old frames.
[14] The image stabilization method of claim 2, further comprising: calculating SADs with respect to the predetermined template, the predetermined template being a plurality of templates; generating a global SAD image by combining the calculated SADs with respect to the predetermined template; and verifying the validity of the estimated displacement of the target frame using the global SAD image.
[15] The image stabilization method of claim 2, further comprising: when the target frame is an interlaced video frame including at least two fields, processing the at least two fields as consecutive frames to estimate a displacement of the interlaced video frame.
PCT/KR2008/006368 2008-10-29 2008-10-29 Image stabilization method and device for performing the method WO2010050632A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/KR2008/006368 WO2010050632A1 (en) 2008-10-29 2008-10-29 Image stabilization method and device for performing the method
KR1020080106931A KR100895385B1 (en) 2008-10-29 2008-10-30 Image stabilization method and device for performing the method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2008/006368 WO2010050632A1 (en) 2008-10-29 2008-10-29 Image stabilization method and device for performing the method

Publications (1)

Publication Number Publication Date
WO2010050632A1 true WO2010050632A1 (en) 2010-05-06

Family

ID=42128983

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2008/006368 WO2010050632A1 (en) 2008-10-29 2008-10-29 Image stabilization method and device for performing the method

Country Status (1)

Country Link
WO (1) WO2010050632A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11250549B2 (en) * 2019-04-25 2022-02-15 Megvii (Beijing) Technology Co., Ltd. Method, apparatus and electric device for image fusion

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001091054A1 (en) * 2000-05-19 2001-11-29 Koninklijke Philips Electronics N.V. Determination of a block matching candidate vector
JP2007181168A (en) * 2005-12-01 2007-07-12 Sony Corp Image processor and image processing method
JP2007221631A (en) * 2006-02-20 2007-08-30 Sony Corp Pickup image distortion correction method, pickup image distortion correction device, imaging method and imaging device
JP2007241352A (en) * 2006-03-06 2007-09-20 Sony Corp Image processor and image processing method, recording medium, and program

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001091054A1 (en) * 2000-05-19 2001-11-29 Koninklijke Philips Electronics N.V. Determination of a block matching candidate vector
JP2007181168A (en) * 2005-12-01 2007-07-12 Sony Corp Image processor and image processing method
JP2007221631A (en) * 2006-02-20 2007-08-30 Sony Corp Pickup image distortion correction method, pickup image distortion correction device, imaging method and imaging device
JP2007241352A (en) * 2006-03-06 2007-09-20 Sony Corp Image processor and image processing method, recording medium, and program

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11250549B2 (en) * 2019-04-25 2022-02-15 Megvii (Beijing) Technology Co., Ltd. Method, apparatus and electric device for image fusion

Similar Documents

Publication Publication Date Title
US9509971B2 (en) Image processing device, image processing method, and program
KR100814424B1 (en) Device for detecting occlusion area and method thereof
US20070230830A1 (en) Apparatus for creating interpolation frame
EP3023939B1 (en) Method and apparatus for tracking the motion of image content in a video frames sequence using sub-pixel resolution motion estimation
EP1549047B1 (en) Robust camera pan vector estimation using iterative center of mass
WO2014183787A1 (en) Method and apparatus for computing a synthesized picture
US20120154675A1 (en) Frame interpolation apparatus and method
US9525873B2 (en) Image processing circuit and image processing method for generating interpolated image
EP1815441B1 (en) Rendering images based on image segmentation
JP5669599B2 (en) Image processing apparatus and control method thereof
US20130083993A1 (en) Image processing device, image processing method, and program
US20060036383A1 (en) Method and device for obtaining a stereoscopic signal
JP2004356747A (en) Method and apparatus for matching image
US20130100260A1 (en) Video display apparatus, video processing device and video processing method
WO2010050632A1 (en) Image stabilization method and device for performing the method
JP5059855B2 (en) Global motion estimation method
US9894367B2 (en) Multimedia device and motion estimation method thereof
JP4886479B2 (en) Motion vector correction apparatus, motion vector correction program, interpolation frame generation apparatus, and video correction apparatus
KR100895385B1 (en) Image stabilization method and device for performing the method
US20130286289A1 (en) Image processing apparatus, image display apparatus, and image processing method
US8391365B2 (en) Motion estimator and a motion estimation method
KR101548269B1 (en) Apparatus and method for estimating motion by block segmentation and combination
KR102326163B1 (en) Display apparatus and controlling method thereof
JP2003304507A (en) Motion vector detecting apparatus and method
US10057596B2 (en) Motion estimation method and apparatus for periodic pattern

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08877791

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08877791

Country of ref document: EP

Kind code of ref document: A1