US20070172150A1 - Hand jitter reduction compensating for rotational motion - Google Patents
- Publication number: US20070172150A1 (application US11/534,935)
- Authority: United States
- Prior art keywords: frame, projections, sums, sector, pce
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/30 — Image analysis: determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/70 — Image analysis: determining position or orientation of objects or cameras
- G06V10/30 — Image preprocessing: noise filtering
- G06V10/507 — Extraction of image or video features: summing image-intensity values; histogram projection analysis
Definitions
- This disclosure relates to digital image processing and, more particularly, hand jitter reduction compensating for rotational motion.
- Blurriness may be caused by hand jitter.
- Hand jitter is caused by the movement of the user's hand when taking a digital picture with a camera. Even if the user is unaware of the movement, the hand may be continually moving. The movements are relatively small, but if the movements are large relative to the exposure time, the digital picture may be blurry. An object or person in the picture may appear to be moving.
- Blurriness may also be caused by an object/person moving when a picture is being taken. Blurriness may also be caused by limitations of the optical system used to capture the pictures.
- Under low lighting conditions, a digital camera, for example, one found in a mobile unit, takes a longer time to register a picture.
- The longer exposure time increases the probability that the slight movements produced by the hand may lead to blurriness.
- Similarly, the longer exposure time increases the chance that the movement by the object/person may be large relative to the exposure time.
- Multiple frame registration may be implemented by capturing multiple frames and checking the parity of the frames to determine how to register them. Registration takes place between a base frame and a movement frame. As part of the registration, a region of interest may be identified. A region of interest locator may segment a circle into a set of K sectors. A projection generator may generate a horizontal (or vertical) projection for any Lth row (or column) of a sector. A projection is the summing of pixels in a column or row in a sector. The projections in each sector may be formed and summed. Each kth sum of projections is represented by Sθ(k).
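Restating the preceding paragraph in symbols (the pixel-intensity notation x(i, j) is introduced here for illustration and is not in the original text):

```latex
% Hypothetical notation: x(i,j) is the intensity of the pixel at row i, column j.
P_k(l)      = \sum_{(i,j)\,\in\,\text{row } l \text{ of sector } k} x(i,j)   % l-th projection of sector k
S_\theta(k) = \sum_{l} P_k(l), \qquad k = 0, 1, \dots, K-1                   % k-th sum of projections
```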
- The set or subset (if a coarser calculation is used) of sums of projections may be represented by a vector, denoted Sθ.
- Vector Sθ from a base frame and vector S′θ from a movement frame may be input into a projection correlator.
- The minimum projection correlation may be used to select the rotation angle estimate between the base and movement frames.
- Frame registration may be an iterative process, which may be terminated early if the rotation angle estimate is within a certain tolerance; this early exit condition terminates the frame registration before all N frames are processed. Frame registration also may be governed by the parity of the N processed frames.
- One of the advantages of multiple frame registration is reduced noise and blurriness due to rotational movements caused by hand jitter in a digital picture.
- Various embodiments are illustrated by way of example, and not by way of limitation, in the accompanying drawings.
- FIG. 1 is a block diagram illustrating a digital imaging process.
- FIG. 2 is a block diagram illustrating the functionality of a pre-processing module in a digital image processing system.
- FIG. 3 is a hardware block diagram of one architectural configuration illustrating a frame registration module for estimating rotational motion.
- FIG. 4 is a hardware block diagram of another architectural configuration illustrating a frame registration module for estimating rotational motion.
- FIGS. 5A-5F illustrate frame flow-trees, which may be used in the selection of which frame is a base frame and which frame is a movement frame.
- A region of interest (ROI) in a base frame 334a and a region of interest (ROI) in a movement frame 334b are illustrated in FIG. 6A and FIG. 6B, respectively.
- A frame may have M columns and I rows, as illustrated in FIG. 7A.
- FIG. 7B illustrates multiple rows and columns of a sector.
- FIG. 8 illustrates a projection generator that may generate horizontal projections.
- FIG. 9 illustrates each projection as a horizontal line, with an arrow tip, spanning the whole pixels in each row over which the projection is generated.
- FIG. 10 illustrates a radial integrator summing a set of projections of any sector amongst the K sectors in a circle.
- FIG. 11 illustrates a possible configuration of a rotational motion vector estimator.
- FIG. 12 illustrates more details of the rotational motion vector estimator shown in FIG. 11 .
- FIG. 13 shows one architectural configuration of a frame registrator.
- FIG. 14 illustrates a possible configuration of an early terminator.
- FIG. 15 is a flow chart illustrating a possible method of frame registration of images.
- FIG. 16A is a graph of the radially integrated outputs.
- FIG. 16B is a graph of the projection correlation between two input vectors.
- FIG. 17A shows the Fourier transform of a base frame.
- FIG. 17B shows the Fourier transform of a movement frame.
- FIG. 17C is a graph of the radial integration of both frames, illustrating the relative rotation angle estimate between the two frames.
- The word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment, configuration or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs.
- In general, described herein is a novel method and apparatus to reduce blurriness and/or noise in digital pictures by generating a rotation angle estimate and using the estimate for frame registration.
- In conventional camera devices, when a user takes a snapshot (currently done by pressing a button), mostly only one frame is used to generate a picture. Methods which employ more than one frame to generate a picture often are not successful because they yield poor results. With conventional camera devices, the picture may be blurry due to movements produced by the user's own hand; these hand movements are known as hand jitter. Conventional camera devices are also challenged by the amount of time required to expose a picture. Under low lighting conditions, the exposure time is typically increased. Increasing the exposure time increases the amount of noise that a user may see due to low lighting conditions, as well as the probability that hand jitter will produce a blurry picture.
- Currently, camera devices may contain small gyroscopes to compensate for the hand jitter produced by the user. However, there are many challenges faced when placing gyroscopes on mobile units. Even when these challenges are overcome, the digital hand jitter reduction techniques may be used in combination with devices that have gyroscopes.
- Current camera devices may also scale the gain under low lighting conditions. Unfortunately, simply increasing the gain amplifies the noise present as a result of the low light level. The result is often a picture of poor quality. Similarly, digital compensation for hand jitter does not always provide adequate results. However, with the techniques disclosed throughout this disclosure, it has been possible to reduce hand jitter, as well as reduce noise, under lower light conditions.
- FIG. 1 is a block diagram illustrating a digital imaging process suitable for a camera device integrated into a mobile unit.
- The mobile unit may be a wireless phone, personal digital assistant (PDA), laptop computer, or any other mobile wireless device.
- A lens (not shown) may be used to focus an image onto an image sensor 102 in an image sensor module 104.
- In one configuration, image sensor module 104 may have a memory for storing gain and exposure parameters.
- Image sensor module 104 may also have a control driver for modifying gain and auto-exposure parameters.
- In another configuration, image sensor module 104 may be coupled to an integrated circuit, such as a Mobile Station Modem (MSM™), or other module which has a memory and/or control driver for storing and modifying gain and auto-exposure parameters.
- The image sensor 102 may be a charge-coupled device (CCD), a complementary metal oxide semiconductor (CMOS) image sensor, or any other suitable image sensor.
- In at least one configuration of the image sensor 102, an array of semiconductors may be used to capture light at different pixels of the image.
- A color filter array (CFA) (not shown) positioned in front of the image sensor 102 may be used to pass a single color (i.e., red, green, or blue) to each semiconductor.
- The most common CFAs are RGB and CMYG patterns.
- The image sensor module 104 may drive or control image sensor 102 to modify the gain and/or exposure time.
- Before a user presses the button to take a snapshot and produce a digital picture, a preview mode may capture a series of frames produced by the image sensor 102.
- The whole frame or a sub-part of the frame is referred to as an image or, interchangeably, a picture. For illustrative purposes, it is convenient to discuss the images being processed as a series of frames, although it should be recognized that not the entire frame need be processed when using a front-end image processing module 106.
- The sequence of frames is also known as a stream.
- The stream may be provided to a front-end image processing module 106, where the frames are de-mosaiced in order to obtain full RGB resolution as an input to the still image and video compressor 108.
- As the stream passes through the front-end image processing module 106, in the preview mode, statistics may be collected on frames that aid with the production of the digital picture. These statistics may be, but are not limited to, exposure metrics, white balance metrics, and focus metrics.
- The front-end image processing module 106 may feed various signals, which help control the image sensor 102, back into the image sensor module 104.
- The still image and video compressor 108 may use JPEG compression, or any other suitable compression algorithm.
- An auto-exposure control module 110 may receive a value proportional to the light level being processed by the front-end image processing module 106 and compare it to a stored light target, in order to aid in at least one of the functions of the front-end image processing module 106. Images that are processed through the modules in front-end image processing module 106 are part of digital frames.
- The stream may also be sent to a view finder, which may be located in display module 112. In the preview mode, a preview decision from the display module 112 may be used in the control of the auto-exposure.
- The preview mode in a mobile unit having a digital camera may be used in either a normal mode or a hand jitter reduction (hjr) mode.
- The user may select the hjr mode (shown as hjr select in FIG. 1) through a user-interface, either through a menu or manually.
- Auto-exposure parameters, such as gain, auto-exposure time, frame rate, and number of frames to process, may be determined within moments after the user presses the button to take a snapshot and produce a digital picture.
- The collected statistics may be used to determine auto-exposure parameters used during the snapshot in both the normal mode and the hjr mode.
- Hence, after the user presses the button, the image processing may be different between hjr mode and normal mode. Before the user presses the button, the preview mode processes images as it would in normal mode, even if the hjr mode has been selected.
- FIG. 2 is a block diagram illustrating the functionality of one configuration of a front-end image processing module 106a in a digital image processing system.
- The front-end image processing module 106a may be used to compensate for differences between the responses of the human visual system and sensor signals generated by the image sensor 102. These differences may be corrected using various processing techniques including, by way of example, black correction and lens rolloff 202, de-mosaic module 204, white balance and color correction 206, gamma adjustment 208, and color conversion 210.
- These processes are represented in FIG. 2 as separate processing modules, but alternatively may be performed using a shared hardware or software platform.
- Moreover, these modules may include multiple image processing modules that perform the same function, thereby allowing the function to be performed in parallel on different images.
- After the color conversion module processes a frame, three color image-components (Y, Cb, and Cr) may be sent to hand jitter control module 212.
- The various parameters from the auto-exposure control module may be fed into hand jitter control module 212.
- Hand jitter control module 212 may serve multiple purposes. Hand jitter control module 212 may determine the image processing that takes place after the snapshot. Hand jitter control module 212 may detect the value of hjr select and determine if hand jitter reduction (hjr) needs to be performed. Even though the user has selected hjr mode, hand jitter control module 212 may determine that image processing as is done in normal mode may take place.
- Hand jitter control module 212 may determine that image processing in hjr mode takes place. Generating a digital picture with image processing in hjr mode may include capturing a single frame or multiple frames. If hand jitter control module 212 determines that multiple frames will be captured, after passing through the hjr control module, the frames may be sent to noise reduction/frame registration module 214, along with a parameter which indicates how many frames may be processed by noise reduction/frame registration module 214. If a single frame is to be processed, noise reduction may take place on the single frame through the use of a noise reduction module 215. Noise reduction module 215 may be a Bayer filter, or other similar filter.
- If multiple frames are to be processed, noise reduction/frame registration module 214 may buffer the number of frames, numf, specified by hand jitter control module 212, and perform frame registration on them. Depending on how many frames and the light level, the multiple frame registration may serve the purpose of noise reduction and/or blur reduction. Multiple frame registration may be done by a frame registration module 216.
- If hand jitter control module 212 determines that image processing takes place as in normal mode, noise reduction/frame registration module 214 may not be used, and the output from color correction module 210, for example, may be used, even though the user selected hjr mode.
- Depending on which image processing (the one in normal mode or the one in hjr mode) is determined by hand jitter control module 212, a signal (sel) may be used to select which multiplexer 217 output to send to post-process module 218.
- The output of post-process module 218 may be sent to still image and video compressor 108 and/or display module 112.
- In addition to outputting a select signal (sel) and the number of frames to use for noise reduction and/or frame registration, hand jitter control module 212 may also output other parameters: new auto-exposure frame rate (ae fr_new), new auto-exposure gain (ae gain_new), new auto-exposure time (ae time_new), and the number of frames to be processed (numf). These parameters may be sent to image sensor module 104 to control image sensor 102.
- A digital gain may also be output by hand jitter control module 212 and may be applied at any module after the image sensor module 104. As an example, the digital gain may be applied in the white-balance/color correction module 206.
- Those ordinarily skilled in the art will recognize that while pixels are normally described, sub-pixels or multiple pixels may also be used as inputs into front-end image processing module 106a.
- Furthermore, a sub-set of these image-components, or other forms such as RGB and spatial-frequency transformed pixels, may also be sent to a hand jitter control module, such as hand jitter control module 212.
- As mentioned previously, the frame registration module 216 may be used to reduce the amount of blurriness or reduce noise in a digital picture with efficient processing resources suitable for mobile applications.
- Currently, a normal exposure time for a picture may be around 150-300 milliseconds (ms).
- Instead of capturing one picture (frame) in 150-300 ms, N frames may be captured and processed at reduced exposure times prior to frame registration.
- In order to reduce the amount of blurriness in a picture, frame registration module 216 may compensate for the amount of rotational movement between any two frames amongst the N frames being processed at the reduced exposure times.
- Typically, in a frame registration module 216, N frames are processed by iteratively selecting a pair of frames at a time: a base frame and a movement frame. Compensation of rotational movement between the base frame and the movement frame is accomplished by estimating a rotation angle during every iterative frame-pair selection and "registering" the movement frame to the base frame. After computing an estimate of the rotational movement of the movement frame relative to the base frame, the movement frame is registered to the base frame by adding the estimated rotation angle to the base frame. The registered frame represents the compensation of the base frame for the estimated rotation angle between the base frame and the movement frame. The registered frame may be used as a new base frame or may be used as a new movement frame. How any frame, registered or not registered, is selected depends on the parity of the number of frames being processed and may be configurable. The frame selection process is discussed in more detail in FIGS. 5A-5F.
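The iterative pairing just described can be sketched in a few lines. This is a simplified software model, not the patent's hardware: the two callables stand in for the rotation estimator and frame registrator detailed below, and the simple adjacent pairing is only one of the parity-dependent schedules shown in FIGS. 5A-5F.

```python
def register_n_frames(frames, estimate_rotation, register):
    """Iteratively register (base, movement) pairs until one merged frame
    remains. `estimate_rotation(base, movement)` returns a rotation angle
    estimate; `register(base, movement, angle)` returns a registered frame."""
    row = list(frames)                       # row 1 of a frame flow-tree
    while len(row) > 1:
        next_row = []
        # pair adjacent frames: (base, movement), (base, movement), ...
        for base, movement in zip(row[0::2], row[1::2]):
            angle = estimate_rotation(base, movement)
            next_row.append(register(base, movement, angle))
        if len(row) % 2:                     # odd parity: carry the leftover frame down
            next_row.append(row[-1])
        row = next_row
    return row[0]
```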
- The frame selection process may be implemented by a frame selector control 300, seen in FIG. 3. FIG. 3 is a hardware block diagram of one architectural configuration illustrating a frame registration module 216a for estimating rotational motion.
- As mentioned above, the number of frames being processed by frame registration module 216 may be predetermined prior to frame registration.
- Frame selector control 300 may use a configurable look-up table (see the discussion of FIGS. 5A-5F) to select amongst frames fa, fb, ga, or gb.
- The two unregistered frames being processed are designated as fa and fb.
- The two registered frames being processed are designated as ga and gb.
- After the color conversion process, three color image-components (Y, Cb, and Cr) may be input into a frame registration module 216a.
- However, a sub-set of these image-components, or other forms (R, G, and B, or spatial-frequency transformed pixels of these image-components), may also be used.
- Furthermore, while pixels are normally used, sub-pixels or multiple pixels may also be used as inputs.
- Image-component Y may be routed to input Y frame buffer 302.
- Image-component Cb may be routed to input/merged Cb frame buffer 304.
- Image-component Cr may be routed to input/merged Cr frame buffer 306.
- Frame registration may be carried out on all three image-components (Y, Cb, and Cr). Estimation of the rotation angle need only be performed on one of the image-components, although it may be performed on more than one. Illustrated as an example, the Y image-component may be used to estimate the rotation angle. As such, a registered frame may be routed to merged Y frame buffer 308. In addition, the frame registration process may be carried out on only part of a frame if desired.
- Frame selector control 300 may have up to five outputs: mux_sel, fsel_Cb, fsel_Cr, fsel_fY, and fsel_gY. From frames fa, fb, ga, or gb, mux_sel selects from mux 310 which pair of frames may be used to estimate the rotation angle between a base frame and a movement frame. The base frame is designated sfy_a, and the movement frame is designated sfy_b. Selection of frame fa and frame fb may be through signal fsel_fY, while selection of frame ga and frame gb may be through signal fsel_gY. Similarly, fsel_Cb and fsel_Cr may be used to select which Cb (sfCb) and Cr (sfCr) frames may be used for frame registration of the Cb and Cr image-components.
- Frame sfy_a may be routed to a region of interest (ROI) locator 312.
- An ROI, in the art, usually identifies areas in a frame that may be considered visually important; examples include objects, a part of an object, or a face.
- The ROI locator 312 described herein may work in conjunction with known or future ROI algorithms that identify visually important areas.
- The ROI locator 312 may form a set of K sectors that forms a circle. The formation of the K sectors may aid in the estimate of the rotation angle. As can be seen in FIG. 3, K sectors are an input to ROI locator 312.
- The number of sectors, K, may be configurable and is determined by the resolution desired for the rotation angle estimate; with K = 1440, as in the example of FIG. 16A, each sector spans 0.25 degrees.
- Projection generator 314 may generate a horizontal (or vertical) projection for any Lth row (or column) of a sector. Interpolation or decimation of rows in a sector may also be implemented, i.e., the number of horizontal projections generated in a sector may be more or less than L.
- Radial integrator 316 sums a set or a subset of the horizontal (or vertical) projections (1 up to L) for any of the K sectors. Each kth sum of projections is represented by Sθ(k).
- Sum of projections (SOP) buffer 318 may store any Sθ(k) selected. Selection is accomplished through optional control signal sector_sel.
- In some cases, a subset of the K sectors may be stored for a coarser calculation of the rotation angle estimate.
- The set or subset (if a coarser calculation is used) of sums of projections, i.e., {Sθ(0), Sθ(1), . . . , Sθ(K−1)}, may be represented by a vector, denoted Sθ.
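A minimal numpy sketch of forming Sθ for one frame, assuming a circular ROI centered in the frame. Because every whole pixel of a sector lies in exactly one of its rows, summing the per-row projections of a sector equals summing the sector's pixels directly, which is the shortcut taken here; the patent's hardware forms the row projections explicitly (and only over whole pixels).

```python
import numpy as np

def sum_of_projections(frame, K=1440):
    """Return the length-K vector S_theta of per-sector sums over a circular
    ROI: equivalent to generating row projections per sector and radially
    integrating them."""
    rows, cols = frame.shape
    cy, cx = (rows - 1) / 2.0, (cols - 1) / 2.0
    y, x = np.mgrid[0:rows, 0:cols]
    r = np.hypot(x - cx, y - cy)
    ang = np.mod(np.arctan2(y - cy, x - cx), 2 * np.pi)
    inside = r <= min(rows, cols) / 2.0          # circular region of interest
    sector = (ang[inside] * K / (2 * np.pi)).astype(int) % K
    return np.bincount(sector, weights=frame[inside].astype(float), minlength=K)
```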
- Similarly, frame sfy_b may be routed to a region of interest (ROI) locator 320, which has a configurable input to form K sectors.
- ROI locator 320 also segments the region of interest into a circle of K sectors, and projection generator 322 may generate a horizontal (or vertical) projection for each Lth row (or column) of a sector.
- Interpolation of projections in a sector may also be implemented, i.e., the number of projections generated in a sector may be more than L.
- Decimation of projections in a sector may be implemented, i.e., the number of projections generated in a sector may be less than L.
- Radial integrator 324 sums a set or a subset of the horizontal (or vertical) projections (1 up to L) for any of the K sectors. Each kth sum of projections is represented by S′θ(k).
- Sum of projections (SOP) buffer 326 may store any S′θ(k) selected. Selection is accomplished through optional control signal sector_sel. In some cases, a subset of the K sectors may be stored for a coarser calculation of the rotation angle estimate.
- The set or subset (if a coarser calculation is used) of sums of projections, i.e., {S′θ(0), S′θ(1), . . . , S′θ(K−1)}, may be represented by a vector, denoted S′θ.
- Rotational motion vector estimator 328 receives two sets of data, namely input vectors Sθ and S′θ, and generates rotation angle estimate θ̃ 329.
- Rotation angle estimate θ̃ 329 may be added to sfy_a, sfCb, and sfCr in frame registrator 330.
- The resulting registered frames may be stored in buffer memories merged Y frame buffer 308, input/merged Cb frame buffer 304, and input/merged Cr frame buffer 306, respectively.
- Frame ga may then be available for the second iteration of frame registration(s).
- Early terminator 332 may determine if the rotation angle estimate is within a certain tolerance and cause an early exit condition, terminating the frame registration before all N frames are processed.
- FIG. 4 is a hardware block diagram of another architectural configuration illustrating a frame registration module 216 b for estimating rotational motion.
- The architectural configuration illustrated in FIG. 4 aims to reduce the number of components used to generate any sum of projections Sθ(k). As such, there is only one ROI locator 312, one projection generator 314, and one radial integrator 316, instead of two of each of these components as shown in FIG. 3.
- Projections P1y through PLy that may be generated by projection generator 314 may be filtered through an optional low pass filter 315. Filtered projections P1′y through PL′y may be passed into radial integrator 316 to generate a sum of each set of projections (1 up to L) for any sector.
- In this configuration, an sfy_a frame is either a base frame or a movement frame; that is, the sequence of sfy_a frames has the form [base frame, movement frame, base frame, movement frame, . . . , base frame, movement frame].
- Toggle pass (TP) 311, in the upper left corner of FIG. 4, allows sfy_a to pass into frame registrator 330.
- Filtered sector sum of projections (SOP) buffer 317 may store any sum from each set of projections for any sector amongst the set of sectors from a base frame and may store each sum from each set of projections for any sector amongst the set of sectors from a movement frame.
- While the set of sectors from a base frame or a movement frame is typically K sectors, fewer than K sectors may be selected through optional control signal sector_sel.
- Interpolation of projections in a sector may also be implemented, i.e., the number of projections generated in a sector may be more than L.
- Decimation of projections in a sector may be implemented, i.e., the number of projections generated in a sector may be less than L.
- Radial integrator 316 may sum a set or a subset of the horizontal (or vertical) projections (either filtered or unfiltered) for any of the K sectors. Each kth sum of projections is represented by Sθ(k).
- Sum of projections (SOP) buffer 317 may store any Sθ(k). In some cases, a subset of the K sectors may be summed for a coarser calculation of the rotation angle estimate.
- The set or subset (if a coarser calculation is used) of sums of projections may be represented by a vector, denoted Sθ for a base frame and S′θ for a movement frame.
- The inputs (Sθ and S′θ) and the output (rotation angle estimate θ̃ 329) of rotational motion vector estimator 328, as well as the components and calculations that follow the estimate of rotation angle θ̃ 329, are as disclosed for FIG. 3.
- As FIG. 3 and FIG. 4 illustrate, for any base frame and movement frame pair, an iteration of frame registration takes place on at least one image-component (e.g., Y). For every iteration, a rotation angle estimate θ̃ 329 may be generated.
- Selection of which frame is a base frame and which frame is a movement frame may be designated by frame flow-trees, such as those illustrated in FIGS. 5A-5F, which may be implemented in a block such as frame selector control 300.
- Frame flow-trees may be implemented by using a configurable look-up-table (LUT) designating which frames to register in each row of a frame flow-tree, depending on the parity of the number of frames in the row.
- The frame flow-tree 332a illustrated in FIG. 5A has four rows. Row 1 shows six initial unregistered frames: f1 (base), f2 (movement), f3 (base), f4 (movement), f5 (base), and f6 (movement).
- Each of the six unregistered frames may represent an image-component, for example, the Y image-component.
- Frame registration of frame f2 to f1 generates registered frame g1a in row 2,
- frame registration of frame f4 to f3 generates registered frame g2a in row 2, and
- frame registration of frame f6 to f5 generates registered frame g3a in row 2.
- When the number of frames in a row is even, the number of frame registrations yielding the subsequent row may be the even number divided by two.
- When the number of frames in a row is odd, the mid-frame in the row may be used as a base frame or a movement frame.
- In row 2 of frame flow-tree 332a, for example, g2a may be used as a movement frame,
- or g2a may be used as a base frame.
- Row 4 contains registered frame g1c, generated by registering frame g3a to registered frame g1b.
- Thus, frame registration may be performed on a previously registered frame or an unregistered frame.
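The row-by-row pairing of flow-tree 332a can be written out as a configurable look-up table, as the text suggests. The tuple encoding below is illustrative (the patent does not specify a LUT format), and it reflects the reading in which mid-frame g2a serves as the movement frame:

```python
# Flow-tree 332a as a LUT: each entry maps (base, movement) -> registered frame.
# f1..f6 are unregistered input frames; g* are registered frames.
FLOW_TREE_332A = [
    [("f1", "f2", "g1a"), ("f3", "f4", "g2a"), ("f5", "f6", "g3a")],  # row 1 -> row 2
    [("g1a", "g2a", "g1b")],          # row 2 -> row 3 (g2a as movement; g3a carried down)
    [("g1b", "g3a", "g1c")],          # row 3 -> row 4
]
```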
- Frame flow-tree 332b illustrated in FIG. 5B also shows six initial unregistered frames in row 1.
- In frame flow-tree 332b, registered frame g2a is used only as a movement frame.
- The process of using the mid-frame (g2a) in a three-frame row as only a movement frame eliminates one frame registration iteration, although it may not necessarily yield results as accurate.
- Frame flow-tree 332c illustrated in FIG. 5C shows five initial unregistered frames in row 1. When the number of frames is odd and greater than three, the mid-frame may initially not be used in the frame registration, to save on the number of frame registration iterations.
- Frame pairs (f1, f2) and (f4, f5) are used to generate registered frames g1a and g2a, respectively.
- Frame registration from row 2 through row 4 is as described for frame flow-tree 332a.
- Frame flow-tree 332d illustrated in FIG. 5D shows seven initial unregistered frames in row 1. Since the number of frames is odd and greater than three, the mid-frame may initially not be used in the frame registration, to save on the number of frame registration iterations. In addition, because there is a triplet of frames on each side of the mid-frame (f4) in row 1, the triplets may be processed as discussed for rows 2-4 of frame flow-tree 332a. This yields, in row 2 of frame flow-tree 332d, a frame flow-tree like frame flow-tree 332c, which may be processed accordingly.
- Frame flow-trees 332e and 332f illustrated in FIG. 5E and FIG. 5F show nine initial unregistered frames in row 1.
- The triplets may be processed as discussed for rows 2-4 of frame flow-tree 332a. Since the number of frames is odd and greater than three, the mid-frame (f5) in row 1 of frame flow-tree 332f may initially not be used in the frame registration.
- Although the process illustrated in frame flow-tree 332f saves on the number of frame registration iterations, it may not necessarily yield results any less desirable than those of the process illustrated in frame flow-tree 332e.
- Frame flow-tree 332f may be implemented in a device that takes digital pictures using hand jitter reduction, since the result is likely to be sufficient to the human eye most of the time.
- Other applications that may require higher resolutions, and are not as concerned with computation time, may wish to implement a process with a higher number of total frame registrations, such as in frame flow-tree 332e.
- A region of interest (ROI) in a base frame 334a and a region of interest (ROI) in a movement frame 334b are illustrated in FIG. 6A and FIG. 6B, respectively.
- An ROI locator, such as ROI locator 312, forms a circular region as shown in both figures.
- The ROI in base frame 334a has one axis at zero degrees and the other axis at ninety degrees.
- The ROI in movement frame 334b shows the two axes rotated by angle θ (theta) 336.
- A frame 334, such as frame sfy_a of FIG. 4, may have M columns and I rows, as illustrated in FIG. 7A.
- K sectors, such as a sector 338, may be formed around a circle 337, as defined by an ROI locator 312. Any sector 338 that is part of circle 337 may have multiple rows and columns.
- In FIG. 7B, rows 1 through 8 are illustrated by figure numbers 340 through 347, respectively.
- Projections may be on columns or rows. For illustrative purposes, discussion is limited to horizontal projections on rows, although vertical projections on columns may also be used.
- FIG. 8 illustrates a projection generator 314 that may generate a horizontal projection of row 4 (figure number 343) in any sector 338.
- The pixels (sub-pixels, multiple pixels, or transformed pixels may also be used) are input into summer 350 and summed to generate a projection 352, P3y. P3y represents the third horizontal projection generated in any sector 338 by projection generator 314.
- As illustrated in FIG. 7B, row 1 of any sector 338 does not contain any whole pixels, which is why the projection over row 4 is the third projection, P3y.
- FIG. 10 illustrates a radial integrator, such as radial integrator 316, summing the set (a subset may also be used) of projections 360 of any sector 338 amongst the K sectors in circle 337.
- Each kth sum is represented by Sθ(k).
- Sector sum of projections (SOP) buffer 317 may store any Sθ(k) selected.
- The set or subset (if a coarser calculation is used) of the sums of projections, i.e., {Sθ(0), Sθ(1), . . . , Sθ(K−1)}, may be represented by a vector, denoted Sθ for a base frame and S′θ for a movement frame.
- Also shown in FIG. 10 are the arc start of a sector 370 and the arc stop of a sector 371. These start and stop points are helpful later (in FIG. 12) in explaining where the location of the rotation angle estimate is measured. Illustrated in FIG. 11 is one possible configuration of a rotational motion vector estimator 328, with two inputs, vector Sθ 376a and vector S′θ 376a′, and one output, rotation angle estimate θ̃ 329.
- Projection correlator 380 computes the difference between the two input vectors (Sθ and S′θ) and generates a projection correlation error (pce) vector at each shift position between the two input vectors.
- Each pce value may be stored in memory 386.
- Minimum pce value index selector 388 selects the minimum pce value amongst the set of pce values stored in memory 386. It outputs the shift position that corresponds to the minimum pce value, i.e., the index of the minimum pce value is the shift position, and is called the θ (theta) factor.
- A look-up-table (LUT) 389 may be used to map the θ factor to an angle, which is the rotation angle estimate θ̃ 329.
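A compact sketch of the estimator of FIG. 11, under two assumptions not stated verbatim in the text: the shift between the two sum-of-projection vectors is circular (natural, since the K sectors wrap around a circle), and the pce value is the L1 norm shown in FIG. 12.

```python
import numpy as np

def theta_factor(S_base, S_move):
    """Projection correlator + minimum-pce selector: return the circular
    shift position whose projection correlation error (L1 norm of the
    difference vector) is smallest."""
    pce = [np.abs(S_base - np.roll(S_move, k)).sum() for k in range(len(S_base))]
    return int(np.argmin(pce))

def to_angle(theta_factor_value, K):
    """LUT-style mapping of the theta factor to degrees: one sector shift
    corresponds to 360/K degrees (K = 1440 gives 0.25-degree steps)."""
    return theta_factor_value * 360.0 / K
```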
- FIG. 12 illustrates more details of the rotational motion vector estimator 328 shown in FIG. 11 .
- Input vectors Sθ 376a and S′θ 376a′ are input into projection correlator 380.
- Either input vector may be connected to a shifter 381, which shifts by Δk positions.
- The shifter 381 is used for shift-aligning the vector Sθ 376a with the different vector S′θ 376a′.
- Subtractor 382 computes the difference between the two input vectors and generates a projection correlation error (pce) vector at each shift position between the two input vectors. Computing the norm of each pce vector at each shift position generates a pce value.
- Each pce value is a norm of a pce vector. Illustrated in FIG. 12 is an L1 norm; however, another norm, such as an L2 norm or a variant of the L2 norm, may be used.
- Each pce value may be stored in memory 386 .
- Minimum pce value index selector 388 selects the minimum pce value amongst the set of pce values stored in memory 386 .
- Memory elements 386a, 386b, and 386c represent pce values in the set of pce values that may be stored from the output of projection correlator 380. As noted above, the index of the minimum pce value selected is called the θ factor.
- A look-up-table (LUT) 389 may be used to map the θ factor to an angle, which is the rotation angle estimate θ̃ 329.
- Alternatively, equations such as Equation 1 or Equation 2 may be used to map the θ factor to the rotation angle estimate θ̃ 329.
- The shift position Δk is a function of k. As disclosed above, k indexes the sums of projections for a sector; there may be up to K sums of projections (i.e., one for each of the K sectors).
- Equation 1 may be used for estimating rotation angles where the location of the angle is measured from arc start 370 or arc stop 371 (see FIG. 10).
- Equation 2 may be used for estimating rotation angles where the location of the angle is measured at the midpoint between arc start 370 and arc stop 371.
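Equations 1 and 2 themselves did not survive extraction. The forms below are hedged reconstructions consistent with the surrounding text (K sectors spanning 360 degrees, and a half-sector offset when the angle is measured at the sector midpoint), not the patent's verbatim formulas:

```latex
% Hedged reconstructions of the missing mapping equations:
\tilde{\theta} = \theta_{\mathrm{factor}} \cdot \frac{360^{\circ}}{K}                                  % Equation 1
\tilde{\theta} = \left( \theta_{\mathrm{factor}} + \tfrac{1}{2} \right) \cdot \frac{360^{\circ}}{K}   % Equation 2
```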
- Equation 3 yields a rotational angular movement (after mapping) as a positive quantity, i.e., rotation angle estimate θ̃ 329 is between 0 and 360 degrees (and may include 0).
- Equation 4 below may also be used to capture the set of pce values. However, this may generate a rotation angle estimate θ̃ 329 (after mapping) that is between −180 and 180 degrees.
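Equations 3 and 4 are likewise missing from the extracted text. A plausible reconstruction, writing S′θ⟨k⟩ for S′θ circularly shifted by k positions and using the L1 norm of FIG. 12, differs only in the range of shift positions, which produces the 0-to-360 versus −180-to-180 degree ranges noted above:

```latex
% S'_\theta\langle k \rangle: S'_\theta circularly shifted by k positions.
\mathrm{pce}(k) = \lVert S_\theta - S'_\theta\langle k \rangle \rVert_{1}, \qquad k = 0, 1, \dots, K-1                      % Equation 3
\mathrm{pce}(k) = \lVert S_\theta - S'_\theta\langle k \rangle \rVert_{1}, \qquad k = -\tfrac{K}{2}, \dots, \tfrac{K}{2}-1  % Equation 4
```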
- Frame registrator 330 may have up to three adders for adding rotation angle estimate θ̃ 329 to any of the image-components.
- Rotation angle estimate θ̃ 329 may be routed to a first adder 390 and added to a base frame sfy_a to generate a registered frame sfy_a+θ̃.
- Rotation angle estimate θ̃ 329 may be routed to a second adder 392 and added to a base frame sfCb to generate a registered frame sfCb+θ̃.
- Rotation angle estimate θ̃ 329 may be routed to a third adder 394 and added to a base frame sfCr to generate a registered frame sfCr+θ̃.
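In a software analogue of the frame registrator, "adding" the rotation angle estimate to a component amounts to rotating that component by the estimated angle before merging. A hedged sketch using scipy.ndimage.rotate; the interpolation order and boundary mode are assumptions, as the patent does not specify them:

```python
from scipy import ndimage

def register_components(sfy, sfcb, sfcr, theta_deg):
    """Rotate the Y, Cb, and Cr components by the rotation angle estimate
    (in degrees), keeping each frame's original shape."""
    rotate = lambda img: ndimage.rotate(img, theta_deg, reshape=False,
                                        order=1, mode='nearest')
    return rotate(sfy), rotate(sfcb), rotate(sfcr)
```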
- FIG. 14 illustrates a possible configuration of early terminator 332 .
- An angle threshold 400, θ̃th, may be provided to comparator 402.
- Comparator 402 may take the difference, θ̃ − θ̃th, and the sign-bit may be checked by sign-bit detector 404.
- When θ̃th is greater than θ̃, the difference is negative and the sign bit of the difference may be set.
- The setting of the sign-bit may trigger an early exit signal and, if desired, stop the processing of the N unregistered frames.
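The early-exit test reduces to a sign check on the difference, as in this sketch (the threshold value itself is application-specific and not given in the text):

```python
def early_exit(theta_est, theta_threshold):
    """Early terminator: form theta_est - theta_threshold and inspect the
    sign bit; a set sign bit (negative difference, i.e., the estimate is
    already within tolerance) raises the early-exit signal."""
    difference = theta_est - theta_threshold
    return difference < 0        # sign bit set -> trigger early exit
```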
- FIG. 15 is a flow chart illustrating a possible method of frame registration of images.
- N frames are input 440, and control of frame selection 442 may be implemented as illustrated in the frame flow-trees disclosed in FIGS. 5A-5F.
- Storing and fetching of image-component Y 444, storing and fetching of image-component Cb 446, and storing and fetching of image-component Cr 448 may take place.
- Signals fsel_Y, fsel_Cb, and fsel_Cr may select which frames sfy_a (and optionally sfy_b, if a configuration such as the one shown in FIG. 3 is used), sfCb, and sfCr are used, so that a base frame and a movement frame for at least one image-component may be selected.
- A region of interest (ROI) locator may identify a region of interest and may segment the ROI into K sectors forming a circle 450.
- Projections may be generated 452 as disclosed above.
- Low pass filtering 454 may take place as disclosed previously.
- Two vectors may be formed (as disclosed above) from the sets of sums of projections and used for projection correlation 458.
- The projection correlation 458 may generate a set of pce values.
- The shift position (i.e., index) resulting from the selection of the minimum pce value 460, amongst the set of pce values, may be used to estimate the rotation angle between a base frame and a movement frame.
- The index of the minimum pce value is called the θ factor and may be mapped to an angle 462 as disclosed previously.
- The rotation angle estimate θ̃ 329 may then be used for frame registration 464. There may be an early exit signal (although not explicitly illustrated) to terminate the method before N frames are processed.
- The total number of sectors formed and selected in this example is K = 1440.
- A graph 488 of the projection correlation between input vectors (Sθ and S′θ) is shown versus a mapped shift angle in FIG. 16B.
- The estimated rotation is 35 degrees and was obtained by using Equation 1 above.
- Transformed pixels may also be used in a frame registration method or a frame registrator.
- Transformed pixels may be produced by taking a transform that maps the pixels in the spatial domain to a spatial-frequency domain, such as a discrete cosine transform (DCT) or, as shown in FIG. 17A and FIG. 17B, a Fourier transform.
- The Fourier transform of a base frame 490 and the Fourier transform of a movement frame 492 are shown to illustrate that the rotation angle between a Fourier transform in a base frame and a Fourier transform in a movement frame may be estimated with the techniques and processes disclosed herein.
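Rotating an image rotates the magnitude of its 2-D Fourier transform by the same angle, so the sector-projection machinery can be applied to transformed frames as described. A brief numpy sketch of producing such a transformed frame; the log scaling is a common dynamic-range choice assumed here, not taken from the patent:

```python
import numpy as np

def fourier_magnitude(frame):
    """Centered 2-D Fourier-transform magnitude of a frame, suitable as
    input to the same ROI / projection / radial-integration steps."""
    spectrum = np.fft.fftshift(np.fft.fft2(frame))
    return np.log1p(np.abs(spectrum))
```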
- FIG. 17C displays a graph of the radial integration of both frames and illustrates the relative rotation angle estimate between the two frames.
- The techniques may improve the removal of blurriness from images with longer exposure times.
- The techniques and configurations may also aid in the reduction of hand jitter for practically any digital device that takes pictures.
- The techniques and configurations may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the techniques and configurations may be directed to a computer-readable medium comprising computer-readable program code (which may also be called computer code) that, when executed in a device that takes pictures, performs one or more of the methods mentioned above.
- The computer-readable program code may be stored in memory in the form of computer-readable instructions.
- A processor, such as a digital signal processor (DSP), may execute instructions stored in memory in order to carry out one or more of the techniques described herein.
- In some cases, the techniques may be executed by a DSP that invokes various hardware components, such as a projection correlator, to accelerate the frame registration process.
- The frame registration techniques and configurations disclosed may be implemented in one or more microprocessors, one or more application specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), or some other hardware-software combination.
Description
- This application claims the benefit of provisional U.S. Application Ser. No. 60/760,768, entitled "HAND JITTER REDUCTION SYSTEM DESIGN," filed Jan. 19, 2006. This disclosure is related to co-pending patent applications "A hand jitter reduction system for cameras" (Attorney Docket No. 060270), co-filed with this application, and "Hand jitter reduction compensating for linear movement" (Attorney Docket No. 060193), co-filed with this application.
- The demand for multimedia applications in mobile communications has been growing at an astounding rate. Today, a user can send and receive still images, as well as download images and video from the Internet, for viewing on a mobile unit or handset. The integration of the digital camera into the mobile unit has further contributed to the growing trend in mobile communications for multimedia functionality.
- Given the limited amount of resources, like battery capacity, processing power, and transmission speed, associated with a mobile unit, effective digital image processing techniques are needed to support multimedia functions. This requires the development of more sophisticated hardware and software that reduces computational complexity for multimedia applications while maintaining the image quality. The development of such hardware and software leads to lower power consumption and longer standby time for the mobile unit.
- Current techniques for compensating for camera movements involve the use of small gyroscopes or other mechanical devices. None of the techniques seem to have an acceptable way to digitally compensate for the camera movements, especially under low lighting conditions. It would be desirable to reduce the amount of blurriness in a digital picture with efficient processing resources suitable for mobile applications under all conditions.
- The details of one or more configurations are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description, drawings and claims.
- Multiple frame registration may be implemented by capturing multiple frames and checking the parity of the frames to determine how to register them. Registration takes place between a base frame and a movement frame. As part of the registration, a region of interest may be identified. A region of interest locator may segment a circle into a set of K sectors. A projection generator may generate a horizontal (or vertical) projection for any Lth row (or column) of a sector. A projection is the summing of pixels in a column or row in a sector. The projections in each sector may be formed and summed. Each kth sum of projections is represented by Sθ(k). The set or subset (if a coarser calculation is used) of a sum of projections, i.e., {Sθ(0), Sθ(1), . . . Sθ(K−1)}, may be represented by a vector, and is denoted as Sθ. Vectors Sθ from a base frame and vector S′θ from a movement frame may be input into a projection correlator. The minimum projection correlation may be used to select the rotation angle estimate between the base and movement frames. Frame registration may be an iterative process which may be terminated if the rotation angle estimate is within a certain tolerance and cause an early exit condition and terminate the frame registration earlier than the processing of N processed frames. Frame registration also may be governed by the parity of the N processed frames.
- One of the advantages of multiple frame registration is to reduce noise and blurriness due to rotational movements as a result of hand jitter in a digital picture.
- Various embodiments are illustrated by way of example, and not by way of limitation, in the accompanying drawings.
-
FIG. 1 is a block diagram illustrating a digital imaging process. -
FIG. 2 is a block diagram illustrating the functionality of a pre-processing module in a digital image processing system. -
FIG. 3 is a hardware block diagram of one architectural configuration illustrating a frame registration module for estimating rotational motion. -
FIG. 4 is a hardware block diagram of another architectural configuration illustrating a frame registration module for estimating rotational motion. -
FIG. 5A-5F illustrate frame flow-trees, which may be used in the selection of which frame is a base frame and which frame is a movement frame. - A region of interest (ROI) in a base frame, and a region of interest (ROI) in a
movement frame 334 b are illustrated inFIG. 6A andFIG. 6B , respectively. - A frame may have M columns and I rows, as illustrated in
FIG. 7A . -
FIG. 7B , illustrates multiple rows and columns of a sector. -
FIG. 8 illustrates a projection generator that may generate horizontal projections. -
FIG. 9 illustrates each projection by a horizontal line, with an arrow tip, spanning the whole pixels in each row, where the projection is generated over. -
FIG. 10 illustrates a radial integrator summing a set projections of any sector amongst the K sectors in a circle. - Illustrated in
FIG. 11 is a possible configuration of a rotational motion vector estimator. -
FIG. 12 illustrates more details of the rotational motion vector estimator shown inFIG. 11 . - One architectural configuration of a frame registrator is shown in
FIG. 13 . -
FIG. 14 illustrates a possible configuration of an early terminator. -
FIG. 15 is a flow chart illustrating a possible method of frame registration of images. - A graph of the radial integrated outputs is illustrated in
FIG. 16A . - A graph of the projection correlation between two input vectors is shown in
FIG. 16B . - The fourier transform of a base frame is shown in in
FIG. 17A . - The fourier transform of a movement frame is shown in
FIG. 17B . -
FIG. 17C , displays a graph of the radial integration of both frames, and illustrates what the relative rotation angle estimate difference between the two frames is. - The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment, configuration or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments or designs. In general, described herein, is a novel method and apparatus to reduce blurriness and/or noise in digital pictures by generating a rotation angle estimate and using the estimate for frame registration.
- In conventional camera devices, when a user takes a snapshot (currently done by pressing a button), mostly only one frame is used to generate a picture. Methods which employ using more than one frame to generate a picture often are not successful because they yield poor results. With conventional camera devices, the picture may be blurry due to movements produced by the user's own hand movements, these hand movements are known as hand jitter. Conventional camera devices also are challenged by the amount of time required to expose a picture. Under low lighting conditions, the exposure time is typically increased. Increasing the exposure time increases the amount of noise that a user may see due to low lighting conditions as well as increases the probability that hand jitter will produce a blurry picture. Currently, camera devices may contain small gyroscopes to compensate for the hand jitter produced by the user. However, there are many challenges faced when placing gyroscopes on mobile units. Even when these challenges are overcome, the digital hand jitter reduction techniques may be used in combination with devices that have gyroscopes. Current camera devices may also scale the gain under low lighting conditions. Unfortunately, simply increasing the gain amplifies the noise present as a result of the low light level. The result is often a picture of poor quality. Similarly, digital compensation for hand jitter does not always provide adequate results. However, with the techniques disclosed throughout this disclosure, it has been possible to reduce hand jitter, as well as reduce noise under lower light conditions.
-
FIG. 1 is a block diagram illustrating a digital imaging process suitable for a camera device integrated into a mobile unit. The mobile unit may be a wireless phone, personal digital assistant (PDA), laptop computer, or any other mobile wireless device. A lens (not shown) may be used to focus an image onto animage sensor 102, in animage sensor module 104. In one configuration,image sensor module 104 may have a memory for storing gain and exposure parameters.Image sensor module 104 may also have a control driver for modifying gain and auto-exposure parameters. In another configuration,image sensor module 104 may be coupled to an integrated circuit, such as a Mobile Station Modem (MSM™), or other module which has a memory and/or control driver for storing and modifying gain and auto-exposure parameters. Theimage sensor 102 may be a charge-coupled device (CCD), a complimentary metal oxide semiconductor (CMOS) image sensor, or any other suitable image sensor. In at least one configuration of theimage sensor 102, an array of semiconductors may be used to capture light at different pixels of the image. A color filter array (CFA) (not shown) positioned in front of theimage sensor 102 may be used to pass a single color (i.e., red, green or blue) to each semiconductor. The most common CFAs are RGB and CMYG patterns. Theimage sensor module 104 may drive orcontrol image sensor 102 to modify the gain, and or exposure time. - Before a user presses the button to take a snapshot and produce a digital picture, a preview mode, may capture a series of frames produced by the
image sensor 102. The whole frame or a sub-part of the frame is referred to as an image or interchangeably a picture. For illustrative purposes, it is convenient to discuss the images being processed as a series of frames. Although it should be recognized that not the entire frame need be processed when using a front-endimage processing module 106. In addition, the sequence of frames is also known as a stream. The stream may be provided to a front-endimage processing module 106 where they are de-mosaiced in order to obtain full RGB resolution as an input to the still image andvideo compressor 108. As the stream passes through the front-endimage processing module 106, in the preview mode, statistics may be collected on frames that aid with the production of the digital picture. These statistics may be, but are not limited to, exposure metrics, white balance metrics, and focus metrics. - The front-end
image processing module 106 may feed various signals, which help control theimage sensor 102, back into theimage sensor module 104. The still image andvideo compressor 108 may use JPEG compression, or any other suitable compression algorithm. An auto-exposure control module 110 may receive a value proportional to the light level being processed by the front-endimage processing module 106, and compare it to a stored light target, in order to aid in at least one of the functions of the front-endimage processing module 106. Images that are processed through the modules in front-endimage processing module 106 are part of digital frames. The stream may also be sent to a view finder which may be located indisplay module 112. In the preview mode, a preview decision from thedisplay module 112 may be used in the control of the auto-exposure. - The preview mode in a mobile unit having a digital camera may be used in either a normal mode or a hand jitter reduction (hjr) mode. The user may select the hjr mode (shown as hjr select in
FIG. 1 ) through a user-interface either through a menu or manually. Auto-exposure parameters such as gain, auto-exposure time, frame rate and number of frames to process, may be determined within moments after the user presses the button to take a snapshot and produce a digital picture. The collected statistics may be used to determine auto-exposure parameters used during the snapshot in both the normal mode and the hjr mode. Hence, after the user presses the button, the image processing may be different between hjr mode and normal mode. Before the user presses the button the preview mode is processing images as it would in normal mode, even if the hjr mode has been selected. -
FIG. 2 is a block diagram illustrating the functionality of one configuration of one front endimage processing module 106 a in a digital image processing system. The front-endimage processing module 106 a may be used to compensate for differences between the responses of human visual system and sensor signals generated by theimage sensor 102. These differences may be corrected using various processing techniques including, by way of example, black correction andlens rolloff 202,de-mosaic module 204, white balance andcolor correction 206,gamma adjustment 208, andcolor conversion 210. These processes are represented inFIG. 2 as separate processing modules, but alternatively may be performed using a shared hardware or software platform. Moreover, these modules may include multiple image processing modules that perform the same function, thereby allowing the function to be performed in parallel on different images. - After the color conversion module processes a frame, three color image-components (Y, Cb, and Cr) may be may be sent to hand
jitter control module 212. The various parameters from the auto-exposure control module may be fed into handjitter control module 212. Handjitter control module 212 may serve multiple purposes. Handjitter control module 212, may determine the image processing that takes place after the snapshot. Handjitter control module 212 may detect the value of hjr select, and determine if hand jitter reduction (hjr) needs to be performed. Even though the user has selected hjr mode, handjitter control module 212 may determine that image processing as is done in normal mode may take place. Handjitter control module 212 may determine that image processing in hjr mode take place. Generating a digital picture image processing in hjr mode may include capturing a single frame or multiple frames. If handjitter control module 212 determines that multiple frames will be captured, after passing through hjr control module, the frames may be sent to noise reduction/frame registration module 214, along with a parameter which indicates how many frames may be processed by noise reduction/frame registration module 214. If a single frame is to be processed, noise reduction may take place on the single frame through the use of anoise reduction module 215. Noise reduction module may be a bayer filter, or other similar filter. If multiple frames are to be processed, noise reduction/frame registration module 214 may buffer the number of frames, numf, specified by handjitter control module 212, and perform frame registration on them. Depending on how many frames and the light level, the purpose of the multiple frame registration may serve the purpose of noise reduction and/or blur reduction. Multiple frame registration may be done by aframe registration module 216. - If hand
- If hand jitter control module 212 determines that image processing takes place as in normal mode, noise reduction/frame registration module 214 may not be used, and the output from color correction module 210, for example, may be used, even though the user selected hjr mode. Depending on which image processing (the one in normal mode or the one in hjr mode) is determined by hand jitter control module 212, a signal (sel) may be used to select which multiplexer 217 output to send to post-process module 218. The output of post-process module 218 may be sent to still image and video compressor 108 and/or display module 112. - In addition to outputting a select signal (sel) and the number of frames to use for noise reduction and/or frame registration, hand
jitter control module 212 may also output other parameters: new auto-exposure frame rate (ae fr_new), new auto-exposure gain (ae gain_new), new auto-exposure time (ae time_new), and the number of frames to be processed (numf). These parameters may be sent to image sensor module 104 to control image sensor 102. A digital gain may also be output by hand jitter control module 212 and may be applied at any module after the image sensor module 104. As an example, the digital gain may be applied by the white-balance/color correction module 206. - Those ordinarily skilled in the art will recognize that while pixels are normally described, sub-pixels or multiple pixels may also be used as inputs into front-end
image processing module 106 a. Furthermore, a sub-set of these image-components, or other forms such as RGB or spatial-frequency transformed pixels, may also be sent to a hand jitter control module, such as hand jitter control module 212. - As mentioned previously, the
frame registration module 216 may be used to reduce the amount of blurriness or reduce noise in a digital picture with efficient processing resources suitable for mobile applications. Currently, a normal exposure time for a picture may be around 150-300 milliseconds (ms). Instead of capturing one picture (frame) in 150-300 ms, N frames may be captured and processed at reduced exposure times prior to frame registration. In order to reduce the amount of blurriness in a picture, frame registration module 216 may compensate for the amount of rotational movement between any two frames amongst the N frames being processed at the reduced exposure times. - Typically in a
frame registration module 216, N frames are processed by iteratively selecting a pair of frames at a time: a base frame and a movement frame. Compensation of rotational movement, between the base frame and the movement frame, is accomplished by estimating the rotation angle during every iterative frame pair selection and "registering" the movement frame to the base frame. After computing an estimate of the rotational movement of the movement frame relative to the base frame, the movement frame is registered to the base frame by adding the estimated rotation angle to the base frame. The registered frame represents the compensation of the base frame due to the estimated rotation angle between the base frame and the movement frame. The registered frame may be used as a new base frame or may be used as a new movement frame. How any frame, registered or unregistered, is selected depends on the parity of the number of frames being processed and may be configurable. The frame selection process is discussed in more detail in FIGS. 5A-5F . - The frame selection process may be implemented by a
frame selector control 300, seen in FIG. 3 . FIG. 3 is a hardware block diagram of one architectural configuration illustrating a frame registration module 216 a for estimating rotational motion. As mentioned above, the number of frames being processed by frame registration module 216 may be predetermined prior to frame registration. Frame selector control 300 may use a configurable look-up table (see discussion for FIGS. 5A-5F ) to select amongst frames fa, fb, ga, or gb. The two unregistered frames being processed are designated fa and fb. The two registered frames being processed are designated ga and gb. - After the color conversion process, three color image-components (Y, Cb, and Cr) may be input into a frame registration module 216 a. However, those ordinarily skilled in the art will recognize that a sub-set of these image-components, or other forms of these image-components, such as R, G, and B, or spatial-frequency transformed pixels, may also be used. Furthermore, while pixels are normally used, sub-pixels or multiple pixels may also be used as inputs. Image-component Y may be routed to input
Y frame buffer 302, image-component Cb may be routed to input/merged Cb frame buffer 304, and image-component Cr may be routed to input/merged Cr frame buffer 306. Frame registration may be carried out on all three image-components (Y, Cb, and Cr). Estimation of a rotation angle need only be performed on one of the image-components, although it may be performed on more than one component. Illustrated as an example, the Y image-component may be used to estimate the rotation angle. As such, a registered frame may be routed to merged Y frame buffer 308. In addition, the frame registration process may be carried out on only part of a frame if desired. -
Frame selector 300 may have up to five outputs: mux_sel, fsel_Cb, fsel_Cr, fsel_fy, and fsel_gY. From frames fa, fb, ga, or gb, mux_sel selects from mux 310 which pair of frames may be used to estimate the rotation angle between a base frame and a movement frame. A base frame is designated sfy_a, and a movement frame is designated sfy_b. Selection of frame fa and frame fb may be through signal fsel_fy, while selection of frame ga and frame gb may be through signal fsel_gY. Similarly, fsel_Cb and fsel_Cr may be used to select which Cb (sfCb) and Cr (sfCr) frames may be used for frame registration of the Cb and Cr image-components. - Frame sfy_a may be routed to a region of interest (ROI)
locator 312. An ROI in the art usually identifies areas in a frame that may be considered to be visually important; examples include objects, a part of an object, or a face. The ROI locator 312 described herein may work in conjunction with known or future ROI algorithms that identify visually important areas. Once a region of interest is identified, the ROI locator 312 may form a set of K sectors that forms a circle. The formation of the K sectors may aid in the estimate of the rotation angle. As can be seen in FIG. 3 , K sectors are input to ROI locator 312. The number of K sectors may be configurable and is determined by what resolution is desired for the rotation angle estimate. After ROI locator 312 segments the region of interest into K sectors, forming a circle, projection generator 314 may generate a horizontal (or vertical) projection for any Lth row (or column) of a sector. Interpolation or decimation of rows in a sector may also be implemented, i.e., the number of horizontal projections generated in a sector may be more or less than L. Radial integrator 316 sums a set or a subset of the horizontal (or vertical) projections (1 up to L) for any of the K sectors. Each kth sum of projections is represented by Sθ(k). Sum of projections (SOP) buffer 318 may store any Sθ(k) selected. Selection is accomplished through optional control signal sector_sel. In some cases, a subset of the K sectors may be stored for a coarser calculation of the rotation angle estimate. The set or subset (if a coarser calculation is used) of a sum of projections, i.e., {Sθ(0), Sθ(1), . . . Sθ(K−1)}, may be represented by a vector, and is denoted as Sθ.
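A minimal sketch of this chain, from ROI segmentation through radial integration, is given below. The NumPy implementation, the assignment of each pixel to a sector by its polar angle, and the treatment of boundary pixels are assumptions made for illustration, not the patent's method.

```python
import numpy as np

def sum_of_projections(y_frame, center, radius, K):
    """Sketch: segment a circular ROI into K sectors and return the vector
    Stheta of per-sector sums of projections.

    Assumptions: each pixel belongs to the sector its polar angle falls in,
    and summing a sector's horizontal projections (its row sums) equals
    summing its pixels, which is what the loop below computes."""
    rows, cols = y_frame.shape
    yy, xx = np.mgrid[0:rows, 0:cols]
    dy, dx = yy - center[0], xx - center[1]
    inside = dy * dy + dx * dx <= radius * radius       # circular ROI mask
    angle = np.mod(np.arctan2(dy, dx), 2.0 * np.pi)     # polar angle per pixel
    sector = np.minimum((angle * K / (2.0 * np.pi)).astype(int), K - 1)

    S = np.zeros(K)
    for k in range(K):
        # Radial integration: sum the projections of sector k into Stheta(k).
        S[k] = y_frame[inside & (sector == k)].sum()
    return S
```

Run once on a base frame and once on a movement frame, this would produce the vectors Sθ and S′θ described below.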
- Similarly, frame sfY_b may be routed to a region of interest (ROI) locator 320, which has a configurable input to form K sectors. ROI locator 320 also segments the region of interest into a circle of K sectors, and projection generator 322 may generate a horizontal (or vertical) projection for each Lth row (or column) of a sector. Interpolation of projections in a sector may also be implemented, i.e., the number of projections generated in a sector may be more than L. Decimation of projections in a sector may be implemented, i.e., the number of projections generated in a sector may be less than L. Radial integrator 324 sums a set or a subset of the horizontal (or vertical) projections (1 up to L) for any of the K sectors. Each kth sum of projections is represented by S′θ(k). Sum of projections (SOP) buffer 326 may store any S′θ(k) selected. Selection is accomplished through optional control signal sector_sel. In some cases, a subset of the K sectors may be stored for a coarser calculation of the rotation angle estimate. The set or subset (if a coarser calculation is used) of a sum of projections, i.e., {S′θ(0), S′θ(1), . . . S′θ(K−1)}, may be represented by a vector, and is denoted as S′θ. - Rotational
motion vector estimator 328 receives two sets of data, namely input vectors Sθ and S′θ, and generates rotation angle estimate {tilde over (θ)} 329. Rotation angle estimate {tilde over (θ)} 329 may be added to sfy_a, sfCb, and sfCr in frame registrator 330. The resulting registered frames, namely, sfy_a+{tilde over (θ)}, sfCb+{tilde over (θ)}, and sfCr+{tilde over (θ)}, may be stored in buffer memories merged Y frame buffer, input/merged Cb frame buffer, and input/merged Cr frame buffer, respectively. After the first iteration of frame registration(s), frame ga may be available for the second iteration of frame registration(s). Early terminator 332 may determine whether the rotation angle estimate is within a certain tolerance, cause an early exit condition, and terminate frame registration earlier than the processing of all N frames. -
FIG. 4 is a hardware block diagram of another architectural configuration illustrating a frame registration module 216 b for estimating rotational motion. The architectural configuration illustrated in FIG. 4 aims to reduce the number of components used to generate any sum of projections, Sθ(k). As such, there is only one ROI locator 312, one projections generator 314, and one radial integrator 316, instead of two of each of these components, as shown in FIG. 3 . In addition, projections P1y through PLy that may be generated by projections generator 314 may be filtered through an optional low pass filter 315. Filtered projections P1′y through PL′y may be passed into radial integrator 316 to generate a sum of each set of projections (1 up to L) for any sector.
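The text does not specify low pass filter 315 beyond its name; a short moving average is one plausible choice, sketched here purely as an assumption.

```python
import numpy as np

def lowpass_projections(projections, taps=3):
    """Sketch of optional low pass filter 315: smooth the projections
    P1y..PLy with a short moving average before radial integration.
    The moving-average choice and the filter length are assumptions."""
    kernel = np.ones(taps) / taps
    return np.convolve(np.asarray(projections, dtype=float), kernel, mode="same")
```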
- Although frames may be processed simultaneously, through interleaving of rows between a base frame and a movement frame, the output of radial integrator 316 in FIG. 4 illustrates the processing of any one of K sectors from one frame (either a base frame or a movement frame) to generate a sum of projections for a sector, Sθ(k). In the architectural configuration illustrated in FIG. 4 , an sfy_a frame is either a base frame or a movement frame; that is, sfy_a is of the following form: [base frame, movement frame, base frame, movement frame, . . . base frame, movement frame]. When sfy_a is a base frame, toggle pass (TP) 311, in the upper left corner of FIG. 3 , allows sfy_a to pass into frame registrator 330. - Filtered sector sum of projections (SOP)
buffer 317 may store any sum from each set of projections for any sector amongst the set of sectors from a base frame, and may store each sum from each set of projections for any sector amongst the set of sectors from a movement frame. Although the set of sectors from a base frame or a movement frame is typically K sectors, fewer than K sectors may be selected through optional control signal sector_sel. - Interpolation of projections in a sector may also be implemented, i.e., the number of projections generated in a sector may be more than L. Decimation of projections in a sector may be implemented, i.e., the number of projections generated in a sector may be less than
L. Radial integrator 316 may sum a set or a subset of the horizontal (or vertical) projections (either filtered or unfiltered) for any of the K sectors. Each kth sum of projections is represented by Sθ(k). Sum of projections (SOP) buffer 317 may store any Sθ(k). In some cases, a subset of the K sectors may be summed for a coarser calculation of the rotation angle estimate. The set or subset (if a coarser calculation is used) of a sum of projections, i.e., {Sθ(0), Sθ(1), . . . Sθ(K−1)}, may be represented by a vector, and is denoted as vector Sθ for a base frame, and as a different vector S′θ for a movement frame. - The inputs (Sθ and S′θ) and the output (rotation angle estimate {tilde over (θ)} 329) to
motion vector estimator 328, as well as the components and calculations that follow the estimate of rotation angle {tilde over (θ)} 329, are as disclosed in FIG. 3 . In both FIG. 3 and FIG. 4 , what is disclosed illustrates that for any base frame and movement frame pair, on at least one image-component (e.g., Y), an iteration of frame registration takes place. For every iteration, a rotation angle estimate {tilde over (θ)} 329 may be generated. - Selection of which frame is a base frame and which frame is a movement frame may be designated by frame flow-trees such as those illustrated in
FIGS. 5A-5F , which may be implemented in a block such as frame selector control 300. Frame flow-trees may be implemented by using a configurable look-up-table (LUT) designating what frames to register in each row of a frame flow-tree, depending on the parity of the number of frames in the row. The frame flow-tree 332 a illustrated in FIG. 5A has four rows. Row 1 shows six initial unregistered frames: f1 (base), f2 (movement), f3 (base), f4 (movement), f5 (base) and f6 (movement). Each of the six unregistered frames may represent an image-component, for example, the Y image-component. Frame registration of frame f2 to f1 generates registered frame g1 a in row 2, frame registration of frame f4 to f3 generates registered frame g2 a in row 2, and frame registration of frame f6 to f5 generates registered frame g3 a in row 2. When there is an even number of frames in a row, the number of frame registrations yielding the subsequent row may be the even number divided by two. When there are three frames in a row, the mid-frame in the row may be a base frame or a movement frame. For example, to generate frame g1 b in row 3, g2 a is used as a movement frame, and to generate frame g2 b in row 3, g2 a is used as a base frame. Row 4 contains registered frame g1 c generated by registering frame g3 a to registered frame g1 b. As can be seen, frame registration may be on a previously registered frame or an unregistered frame. - Frame flow-tree 332 b illustrated in
FIG. 5B also shows six initial unregistered frames in row 1. However, registered frame g2 a is used only as a movement frame. The process of using the mid-frame (g2 a) in a three-frame row as only a movement frame eliminates one frame registration iteration, although it may not necessarily yield results as accurate. Frame flow-tree 332 c illustrated in FIG. 5C shows five initial unregistered frames in row 1. When the number of frames is odd and greater than three, the mid-frame may initially not be used in the frame registration, to save on the number of frame registration iterations. That is, frame pairs f1 and f2, as well as frame pairs f4 and f5, are used to generate registered frames g1 a and g2 a. Frame registration from row 2 through row 4 is as described in frame flow-tree 332 a. - Frame flow-tree 332 d illustrated in
FIG. 5D shows seven initial unregistered frames in row 1. Since the number of frames is odd and greater than three, the mid-frame may initially not be used in the frame registration, to save on the number of frame registration iterations. In addition, because there is a triplet of frames on each side of the mid-frame (f4) in row 1, the triplets may be processed as discussed for rows 2-4 of frame flow-tree 332 a. This yields, in row 2 of frame flow-tree 332 d, a frame flow-tree like frame flow-tree 332 c, which may be processed accordingly. - Frame flow-trees 332 e and 332 f illustrated in
FIG. 5E and FIG. 5F , respectively, show nine initial unregistered frames in row 1. There are three sets of triplets of frames in row 1 of frame flow-tree 332 e. The triplets may be processed as discussed for rows 2-4 of frame flow-tree 332 a. Since the number of frames is odd and greater than three, the mid-frame (f5) in row 1 of frame flow-tree 332 f may initially not be used in the frame registration. Although using the process illustrated in frame flow-tree 332 f saves on the number of frame registration iterations, it may not necessarily yield results any less desirable than the process illustrated in frame flow-tree 332 e. - As the number of frames increases, the exposure time per frame decreases, and the probability that there is a smaller rotational angular displacement increases. That is, the estimated rotation angle between frames is likely to be smaller when the exposure time is smaller; thus, the accuracy of the estimate used to compensate for the rotation angle is better. Hence, the process illustrated in frame flow-tree 332 f may be implemented in a device that takes digital pictures using hand jitter reduction, since the process is likely to be sufficient most of the time to the human eye. Other applications that may require higher resolutions, and are not as concerned with computation time, may wish to implement a process where there is a higher number of total frame registrations, such as in frame flow-tree 332 e.
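One row-to-next-row step of such a frame flow-tree can be sketched as follows. The pairing rules follow the parity discussion above (flow-trees 332 a, 332 c, and 332 f); the (base, movement) ordering and the simplified handling of odd rows larger than three are assumptions.

```python
def next_row_pairs(frames):
    """Sketch: given one row of a frame flow-tree, return the (base, movement)
    pairs whose registrations produce the next row."""
    n = len(frames)
    if n % 2 == 0:
        # Even parity: pair the frames off; n/2 registrations per row.
        return [(frames[i], frames[i + 1]) for i in range(0, n, 2)]
    if n == 3:
        # Three frames: the mid-frame serves as movement for the first pair
        # and as base for the second, as in rows 2-3 of flow-tree 332a.
        return [(frames[0], frames[1]), (frames[1], frames[2])]
    # Odd and greater than three: initially skip the mid-frame to save
    # registration iterations (flow-trees 332c and 332f), then pair off.
    mid = n // 2
    rest = frames[:mid] + frames[mid + 1:]
    return [(rest[i], rest[i + 1]) for i in range(0, len(rest), 2)]
```

Applied repeatedly until one frame remains, this reproduces a schedule similar to flow-tree 332 a for six frames and flow-tree 332 c for five.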
- A region of interest (ROI) in a
base frame 334 a, and a region of interest (ROI) in a movement frame 334 b, are illustrated in FIG. 6A and FIG. 6B , respectively. An ROI locator, such as ROI locator 312, forms a circular region as shown in both figures. The ROI in base frame 334 a has one axis at zero degrees and the other axis at ninety degrees. The ROI in movement frame 334 b shows the two axes rotated by angle θ (theta) 336. By frame registration of a movement frame 334 b to a base frame 334 a, an estimate of the rotation angle between the two frames is made, i.e., the estimation of angle theta 336. -
Frame 334, such as frame sfy_a of FIG. 4 , may have M columns and I rows, as illustrated in FIG. 7A . K sectors, such as a sector 338, may be formed around a circle 337, as defined by an ROI locator 312. Any sector 338 that is part of circle 337 may have multiple rows and columns. In FIG. 7B , rows 1 through 8 are illustrated by figure numbers 340 through 347, respectively: row 1 is labeled as figure number 340, row 2 as figure number 341, and so on through row 8, labeled as figure number 347. - Projections may be on columns or rows. For illustrative purposes, discussion is limited to horizontal projections on rows, although vertical projections on columns may also be used.
FIG. 8 illustrates a projection generator 314 that may generate a horizontal projection of row 4 (figure number 343) in any sector 338. The pixels (sub-pixels, multiple pixels, or transformed pixels may also be used) are input into summer 350 and summed to generate a projection 352, P3y. P3y represents the third horizontal projection generated in any sector 338 by projection generator 314. Although there are 8 rows illustrated in FIG. 7B , row 1 illustrated in any sector 338 does not contain any whole pixels. Generally, projections are computed on rows (or columns) which have more than one whole pixel to sum over. If the sector boundary cuts off any pixels in a row (or column), as it does for row 1, the cut-off pixels are not used in the sum used to generate a projection. Hence, what is illustrated are horizontal projections generated over rows 2-8. Projection P1y is generated over row 2, projection P2y over row 3, et cetera. The set of projections {P1y, P2y, . . . , PLy} 360, where L=7, in any sector 338 is shown in FIG. 9 . FIG. 9 illustrates each projection by a horizontal line, with an arrow tip, spanning the whole pixels in each row where the projection is generated. -
FIG. 10 illustrates a radial integrator, such as radial integrator 316, summing the set (a subset may also be used) of projections 360 of any sector 338 amongst the K sectors in circle 337. Each kth sum is represented by Sθ(k). Sector sum of projections (SOP) buffer 317 may store any Sθ(k) selected. The set or subset (if a coarser calculation is used) of the sum of projections, i.e., {Sθ(0), Sθ(1), . . . Sθ(K−1)}, may be represented by a vector, and is denoted as vector Sθ for a base frame and as a different vector S′θ for a movement frame. Also shown in FIG. 10 are the arc start of a sector 370 and the arc stop of a sector 371. These start and stop points are helpful in explaining later (in FIG. 12 ) where the location of the rotation angle estimate is measured. Illustrated in FIG. 11 is one possible configuration of a rotational motion vector estimator 328, with two inputs, vector Sθ 376 a and vector S′θ 376 a′, and one output, rotation angle estimate {tilde over (θ)} 329. Projection correlator 380 computes the difference between the two input vectors (Sθ and S′θ) and generates a projection correlation error (pce) vector at each shift position between the two input vectors. Computing the difference of the input vectors for the set of shift positions between the input vectors generates a set of pce vectors. Computing the norm of each pce vector at each shift generates a pce value. Each pce value may be stored in memory 386. Minimum pce value index selector 388 selects the minimum pce value amongst the set of pce values stored in memory 386. It outputs the shift position that corresponds to the minimum pce value, i.e., the index of the minimum pce value is the shift position, and is called the θ (theta) factor. A look-up-table (LUT) 389 may be used to map the θ factor to an angle, which is the rotation angle estimate {tilde over (θ)} 329.
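A compact sketch of this estimator, from projection correlation through the angle mapping, follows. The circular (modulo-K) shift and the linear 360/K-degrees-per-position mapping are assumptions consistent with the surrounding description; the L1 norm is the one illustrated in FIG. 12.

```python
import numpy as np

def estimate_rotation(S_base, S_move):
    """Sketch of rotational motion vector estimator 328: L1-norm projection
    correlation at each shift position, minimum pce value selection, and a
    theta-factor-to-angle mapping (the wraparound shift and the 360/K
    mapping are assumptions)."""
    K = len(S_base)
    pce_values = np.empty(K)
    for dk in range(K):
        # pce vector at shift position dk, then its L1 norm -> pce value.
        pce_values[dk] = np.abs(S_base - np.roll(S_move, -dk)).sum()
    theta_factor = int(np.argmin(pce_values))   # index of the minimum pce value
    return theta_factor * 360.0 / K             # mapped rotation angle estimate
```

With the K = 1440 of the example in FIG. 16A, each shift position spans 0.25 degrees, so a minimum at shift position 140 would map to the 35-degree estimate shown in FIG. 16B.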
- FIG. 12 illustrates more details of the rotational motion vector estimator 328 shown in FIG. 11 . Input vectors, Sθ 376 a and S′θ 376 a′, are input into projection correlator 380. Either input vector may be connected to a shifter 381, which shifts by Δk positions. The shifter 381 is used for shift-aligning the vector Sθ 376 a with the different vector S′θ 376 a′. Subtractor 382 computes the difference between the two input vectors and generates a projection correlation error (pce) vector at each shift position between the two input vectors. Computing the norm of each pce vector at each shift position generates a pce value. To compute the norm, abs block 383 computes the absolute value of the pce vector, and summer 384 sums all the elements of the pce vector. Thus, each pce value is a norm of a pce vector. Illustrated in FIG. 12 is an L1 norm. However, another norm, such as an L2 norm or a variant of the L2 norm, may be used. Each pce value may be stored in memory 386. Minimum pce value index selector 388 selects the minimum pce value amongst the set of pce values stored in memory 386. Memory 386 and minimum pce value index selector 388 may be part of projection correlator 380. As noted above, the index of the minimum pce value selected is called the θ factor. - A look-up-table (LUT) 389 may be used to map the θ factor to an angle which is the rotation angle estimate {tilde over (θ)} 329. Instead of a LUT, equations such as
equation 1 or equation 2 may be used to map the θ factor to the rotation angle estimate {tilde over (θ)} 329. It should be noted that in both equations, the shift position Δk is a function of k. As disclosed above, k tracks the number of sums of projections for a sector; there may be up to K sums of projections (i.e., one for each of the K sectors). -
{tilde over (θ)} = Δk · (360/K) degrees    (Equation 1)
- where Δk=k+1, and k=0 . . . K−1.
Equation 1 may be used for estimating rotation angles where the location of the angle is measured from arc start 370 or arc stop 371 (see FIG. 10 ). Equation 2 may be used for estimating rotation angles where the location of the angle is measured at the midpoint between arc start 370 and arc stop 371. -
{tilde over (θ)} = (Δk + 1/2) · (360/K) degrees    (Equation 2)
- where Δk=k, and k=0 . . . K−1.
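As a worked example under the reconstructed mapping above (the shift value of 140 is inferred from the K = 1440, 35-degree example discussed with FIG. 16B; it is not stated in the text):

```latex
% Assumed values: K = 1440 sectors, so each shift position spans
% 360/1440 = 0.25 degrees; a theta factor of \Delta k = 140 then
% maps to the 35-degree estimate of FIG. 16B.
\tilde{\theta} = \Delta k \cdot \frac{360^{\circ}}{K}
               = 140 \cdot \frac{360^{\circ}}{1440}
               = 35^{\circ}
```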
- Mathematically, the set (for all values of Δk) of pce values to estimate a rotational angular movement between frames is captured by
equation 3 below:
pce(Δk) = Σ_{k=0}^{K−1} | Sθ(k) − S′θ((k+Δk) mod K) |, for Δk = 0, 1, . . . K−1    (Equation 3)
- The form of
equation 3 yields a rotational angular movement (after mapping) as a positive quantity, i.e., rotation angle estimate {tilde over (θ)} 329 is between 0 (inclusive) and 360 degrees. Equation 4 below may also be used to capture the set of pce values. However, this may generate a rotation angle estimate {tilde over (θ)} 329 (after mapping) that is between −180 and 180 degrees.
pce(Δk) = Σ_{k=0}^{K−1} | Sθ(k) − S′θ((k+Δk) mod K) |, for Δk = −K/2, . . . K/2−1    (Equation 4)
- Subsequent to all N frames being processed on one image-component (e.g., Y), it may be possible to add all the estimated rotation angles to the appropriate frames on the other image-components (e.g., Cb and Cr). This may happen because projections need only be generated for one image-component and the frame registration sequence is known beforehand via a frame flow-tree. This possible architecture configuration of
frame registrator 330 is not shown. One configuration of frame registrator 330, which is shown in FIG. 13 , may have up to three adders for adding rotation angle estimate {tilde over (θ)} 329 to any of the image-components. Rotation angle estimate {tilde over (θ)} 329 may be routed to a first adder 390, and added to a base frame sfy_a to generate a registered frame sfy_a+{tilde over (θ)}. Rotation angle estimate {tilde over (θ)} 329 may be routed to a second adder 392, and added to a base frame sfCb to generate a registered frame sfCb+{tilde over (θ)}. Rotation angle estimate {tilde over (θ)} 329 may be routed to a third adder 394, and added to a base frame sfCr to generate a registered frame sfCr+{tilde over (θ)}. -
FIG. 14 illustrates a possible configuration of early terminator 332. An angle threshold 400, {tilde over (θ)}th, may be compared by a comparator 402 to rotation angle estimate {tilde over (θ)} 329. Comparator 402 may take the difference, {tilde over (θ)}−{tilde over (θ)}th, and the sign-bit may be checked by sign-bit detector 404. When {tilde over (θ)}th is greater than {tilde over (θ)}, the difference is negative and the sign bit of the difference may be set. The setting of the sign-bit may trigger an early exit signal and, if desired, stop the processing of the N unregistered frames.
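The sign-bit test described here reduces to a signed comparison; a one-line sketch (the threshold value is left to the caller):

```python
def early_exit(theta_estimate, theta_threshold):
    """Sketch of early terminator 332: the difference theta - theta_th is
    negative (its sign bit set) exactly when the estimate falls below the
    threshold, which raises the early exit signal."""
    return (theta_estimate - theta_threshold) < 0
```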
- FIG. 15 is a flow chart illustrating a possible method of frame registration of images. N frames are input 440, and control of frame selection 442 may be implemented as illustrated in the frame flow-trees disclosed in FIGS. 5A-5F . Storing and fetching of image-component Y 444, storing and fetching of image-component Cb 446, and storing and fetching of image-component Cr 448 may take place. Signals fsel_Y, fsel_Cb, and fsel_Cr may select which frames sfy_a (and optionally sfy_b, if a configuration such as shown in FIG. 3 is used), sfCb, and sfCr are used, in accordance with a frame flow-tree such as disclosed in FIGS. 5A-5F . A base frame and movement frame for at least one image-component may be selected. A region of interest (ROI) locator may identify a region of interest, and may segment the ROI into K sectors forming a circle 450. For any sector in a base frame and movement frame, projections may be generated 452 as disclosed above. Potentially, low pass filtering 454 may take place as disclosed previously. By summing the filtered or unfiltered projections of a sector, as was disclosed above, any of the K sectors may be radially integrated 456. Two vectors may be formed (as disclosed above) from the set of sums of projections and used for projection correlation 458. The projection correlation 458 may generate a set of pce values. The shift position (i.e., index) resulting from the selection of the minimum pce value 460, amongst the set of pce values, may be used to estimate the rotation angle between a base frame and a movement frame. The index of the minimum pce value is called the θ factor, and may be mapped to an angle 462 as disclosed previously. As discussed above, rotation angle estimate {tilde over (θ)} 329 may then be used in frame registration 464. There may be an early exit signal (although not explicitly illustrated) to terminate the method prior to N frames being processed.
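Putting the pieces together, the flow of FIG. 15 might be sketched as below, reusing the hypothetical helpers sketched earlier (next_row_pairs, sum_of_projections, estimate_rotation, early_exit). Using the base frame as a stand-in for the registered frame, and fixing the ROI parameters, are simplifications, not the patent's behavior.

```python
def register_frames(y_frames, center, radius, K=1440, theta_threshold=0.25):
    """Sketch of the FIG. 15 flow on the Y image-component: walk the frame
    flow-tree, estimate a rotation angle per (base, movement) pair, and
    stop early when an estimate falls below the threshold."""
    angles = []
    row = list(y_frames)
    while len(row) > 1:
        next_row = []
        for base, move in next_row_pairs(row):
            S = sum_of_projections(base, center, radius, K)        # vector S
            S_prime = sum_of_projections(move, center, radius, K)  # vector S'
            theta = estimate_rotation(S, S_prime)
            angles.append(theta)
            if early_exit(theta, theta_threshold):
                return angles              # early exit signal raised
            next_row.append(base)          # stand-in for the registered frame
        row = next_row
    return angles
```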
- For exemplary purposes, a graph 486 of the radial integrated outputs (for both a base frame and a movement frame) versus k, an index tracking the number of sectors selected, is illustrated in FIG. 16A . The total number of sectors formed and selected, in this example, is K=1440. Similarly, for exemplary purposes, a graph 488 of the projection correlation between input vectors (Sθ and S′θ) is shown versus a mapped shift angle in FIG. 16B . The estimated rotation is 35 degrees and was obtained by using equation 1 above. - As mentioned previously, transformed pixels may also be used in a frame registration method or a frame registrator. Transformed pixels may be transformed by taking a transform that maps the pixels in the spatial domain to a spatial-frequency domain, such as a discrete cosine transform (DCT) or, as shown in
FIG. 17A and FIG. 17B , a Fourier transform. The Fourier transform of a base frame 490 and the Fourier transform of a movement frame 492 are shown to illustrate that the rotation angle between a Fourier transform in a base frame and a Fourier transform in a movement frame may be estimated with the techniques and processes disclosed herein. One should note the rotated spectral pattern that can be seen between FIG. 17A and FIG. 17B .
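The property relied on here is that rotating an image rotates the magnitude of its Fourier transform by the same angle, so the sector-and-projection technique can be applied to the spectra instead of the pixels. A minimal sketch, assuming NumPy:

```python
import numpy as np

def fourier_magnitude(frame):
    """Sketch: centered magnitude spectrum of a frame. A rotation of the
    frame rotates this spectrum by the same angle, and the magnitude is
    insensitive to translation, so the sum-of-projections technique can
    be applied to the spectrum to estimate the rotation angle."""
    return np.abs(np.fft.fftshift(np.fft.fft2(np.asarray(frame, dtype=float))))
```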
- FIG. 17C displays a graph of the radial integration of both frames, and illustrates the relative rotation angle estimate difference between the two frames. - A number of different configurations and techniques have been described. The techniques may improve the removal of blurriness from images with longer exposure times. The techniques and configurations may also aid in the reduction of hand jitter for practically any digital device that takes pictures. The techniques and configurations may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the techniques and configurations may be directed to a computer-readable medium comprising computer-readable program code (which may also be called computer code) that, when executed in a device that takes pictures, performs one or more of the methods mentioned above.
- The computer-readable program code may be stored in memory in the form of computer-readable instructions. In that case, a processor, such as a DSP, may execute instructions stored in memory in order to carry out one or more of the techniques described herein. In some cases, the techniques may be executed by a DSP that invokes various hardware components, such as a projection correlator, to accelerate the frame registration process. The frame registration techniques and configurations disclosed may be implemented in one or more microprocessors, one or more application-specific integrated circuits (ASICs), one or more field-programmable gate arrays (FPGAs), or some other hardware-software combination. These techniques and configurations are within the scope of the following claims.
Claims (39)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/534,935 US7970239B2 (en) | 2006-01-19 | 2006-09-25 | Hand jitter reduction compensating for rotational motion |
PCT/US2007/060808 WO2007085007A2 (en) | 2006-01-19 | 2007-01-19 | Hand jitter reduction compensating for rotational motion |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US76076806P | 2006-01-19 | 2006-01-19 | |
US11/534,935 US7970239B2 (en) | 2006-01-19 | 2006-09-25 | Hand jitter reduction compensating for rotational motion |
Publications (2)
Publication Number | Publication Date |
---|---|
US20070172150A1 true US20070172150A1 (en) | 2007-07-26 |
US7970239B2 US7970239B2 (en) | 2011-06-28 |
Family
ID=38285642
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/534,935 Active 2030-03-06 US7970239B2 (en) | 2006-01-19 | 2006-09-25 | Hand jitter reduction compensating for rotational motion |
Country Status (2)
Country | Link |
---|---|
US (1) | US7970239B2 (en) |
WO (1) | WO2007085007A2 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8189961B2 (en) * | 2010-06-09 | 2012-05-29 | Microsoft Corporation | Techniques in optical character recognition |
US20150116523A1 (en) * | 2013-10-25 | 2015-04-30 | Nvidia Corporation | Image signal processor and method for generating image statistics |
US9547884B1 (en) * | 2015-12-01 | 2017-01-17 | Information Systems Laboratories, Inc. | Image registration using a modified log polar transformation |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0229886A (en) | 1988-07-20 | 1990-01-31 | Ricoh Co Ltd | Method for extracting feature variable |
US6160900A (en) | 1994-02-04 | 2000-12-12 | Canon Kabushiki Kaisha | Method and apparatus for reducing the processing time required in motion vector detection |
US5745808A (en) | 1995-08-21 | 1998-04-28 | Eastman Kodak Company | Camera exposure control system using variable-length exposure tables |
US6005981A (en) | 1996-04-11 | 1999-12-21 | National Semiconductor Corporation | Quadtree-structured coding of color images and intra-coded images |
MY119560A (en) | 1996-05-27 | 2005-06-30 | Nippon Telegraph & Telephone | Scheme for detecting captions in coded video data without decoding coded video data |
JP3695119B2 (en) | 1998-03-05 | 2005-09-14 | 株式会社日立製作所 | Image synthesizing apparatus and recording medium storing program for realizing image synthesizing method |
US6285711B1 (en) | 1998-05-20 | 2001-09-04 | Sharp Laboratories Of America, Inc. | Block matching-based method for estimating motion fields and global affine motion parameters in digital video sequences |
JP2000047297A (en) | 1998-07-28 | 2000-02-18 | Minolta Co Ltd | Digital still camera |
JP2000224470A (en) | 1999-02-02 | 2000-08-11 | Minolta Co Ltd | Camera system |
JP4076109B2 (en) | 1999-06-08 | 2008-04-16 | 富士フイルム株式会社 | Control method for solid-state imaging device |
US7286724B2 (en) | 1999-12-06 | 2007-10-23 | Hyundai Curitel, Inc. | Method and apparatus for searching, browsing and summarizing moving image data using fidelity for tree-structure moving image hierarchy |
KR100371513B1 (en) | 1999-12-06 | 2003-02-07 | 주식회사 팬택앤큐리텔 | Method and apparatus of summerizing and browsing video sequences using fidelity values of key frame stored in its edge of key frame hierarchy |
JP4015944B2 (en) | 2000-07-21 | 2007-11-28 | ザ トラスティース オブ コロンビア ユニバーシティ イン ザ シティ オブ ニューヨーク | Method and apparatus for image mosaicking |
US6681060B2 (en) | 2001-03-23 | 2004-01-20 | Intel Corporation | Image retrieval using distance measure |
US7331523B2 (en) | 2001-07-13 | 2008-02-19 | Hand Held Products, Inc. | Adaptive optical image reader |
US7245320B2 (en) | 2002-06-04 | 2007-07-17 | Micron Technology, Inc. | Method and apparatus for automatic gain and exposure control for maintaining target image brightness in video imager systems |
JP4272863B2 (en) | 2002-09-20 | 2009-06-03 | キヤノン株式会社 | Camera and camera system |
JP3804617B2 (en) | 2003-02-14 | 2006-08-02 | コニカミノルタフォトイメージング株式会社 | Image processing apparatus and method |
US7403568B2 (en) | 2003-08-13 | 2008-07-22 | Apple Inc. | Pre-processing method and system for data reduction of video sequences and bit rate reduction of compressed video sequences using temporal filtering |
US20050195221A1 (en) | 2004-03-04 | 2005-09-08 | Adam Berger | System and method for facilitating the presentation of content via device displays |
US7755667B2 (en) | 2005-05-17 | 2010-07-13 | Eastman Kodak Company | Image sequence stabilization method and camera having dual path image sequence stabilization |
US7970239B2 (en) * | 2006-01-19 | 2011-06-28 | Qualcomm Incorporated | Hand jitter reduction compensating for rotational motion |
US20070248330A1 (en) | 2006-04-06 | 2007-10-25 | Pillman Bruce H | Varying camera self-determination based on subject motion |
US20080165280A1 (en) | 2007-01-05 | 2008-07-10 | Deever Aaron T | Digital video stabilization with manual control |
US8600189B2 (en) | 2007-11-12 | 2013-12-03 | Qualcomm Incorporated | Block-based image stabilization |
Patent Citations (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2832110A (en) * | 1951-11-01 | 1958-04-29 | Blaw Knox Co | Ladle stopper control apparatus |
US4446521A (en) * | 1980-03-28 | 1984-05-01 | Tokyo Shibaura Denki Kabushiki Kaisha | Image reconstruction apparatus and process |
US4922543A (en) * | 1984-12-14 | 1990-05-01 | Sten Hugo Nils Ahlbom | Image processing device |
US4845766A (en) * | 1987-05-11 | 1989-07-04 | Nippon Sheet Glass Co., Ltd. | Simultaneous projection feature analysis apparatus |
US5262856A (en) * | 1992-06-04 | 1993-11-16 | Massachusetts Institute Of Technology | Video image compositing techniques |
US5832101A (en) * | 1995-09-19 | 1998-11-03 | Samsung Electronics Co., Ltd. | Device and method for detecting a motion vector of an image |
US6166370A (en) * | 1996-05-14 | 2000-12-26 | Michel Sayag | Method and apparatus for generating a control signal |
US5832110A (en) * | 1996-05-28 | 1998-11-03 | Ricoh Company, Ltd. | Image registration using projection histogram matching |
US5943450A (en) * | 1996-11-27 | 1999-08-24 | Samsung Electronics Co., Ltd. | Apparatus and method for compensating for camera vibration during video photography |
US6381279B1 (en) * | 1998-01-22 | 2002-04-30 | Hewlett-Packard Company | Method for providing motion-compensated multi-field enhancement of still images from video |
US6128047A (en) * | 1998-05-20 | 2000-10-03 | Sony Corporation | Motion estimation process and system using sparse search block-matching and integral projection |
US6996176B1 (en) * | 1998-05-20 | 2006-02-07 | Sony Corporation | Motion estimation process and system using sparse search block-matching and integral protection |
US6310985B1 (en) * | 1998-07-29 | 2001-10-30 | Electroglas, Inc. | Measuring angular rotation of an object |
US20020097904A1 (en) * | 1998-07-29 | 2002-07-25 | Electroglas, Inc. | Method and apparatus for measuring angular rotation of an object |
US7065261B1 (en) * | 1999-03-23 | 2006-06-20 | Minolta Co., Ltd. | Image processing device and image processing method for correction of image distortion |
US6418168B1 (en) * | 1999-03-24 | 2002-07-09 | Sony Corporation | Motion vector detection apparatus, method of the same, and image processing apparatus |
US6522712B1 (en) * | 1999-11-19 | 2003-02-18 | General Electric Company | Reconstruction of computed tomographic images using interpolation between projection views |
US20040170246A1 (en) * | 2001-06-15 | 2004-09-02 | Anne Koenig | Method for reconstruction of an image of a moving object |
US6996254B2 (en) * | 2001-06-18 | 2006-02-07 | Microsoft Corporation | Incremental motion estimation through local bundle adjustment |
US20030044048A1 (en) * | 2001-06-18 | 2003-03-06 | Zhengyou Zhang | Incremental motion estimation through local bundle adjustment |
US20050232494A1 (en) * | 2002-01-07 | 2005-10-20 | Xerox Corporation | Image type classification using color discreteness features |
US6879656B2 (en) * | 2002-07-23 | 2005-04-12 | General Electric Company | Method and apparatus for deriving motion information from projection data |
US20040125908A1 (en) * | 2002-07-23 | 2004-07-01 | Erdogan Cesmeli | Method and apparatus for deriving motion information from projection data |
US20040114831A1 (en) * | 2002-12-17 | 2004-06-17 | Xerox Corporation | Non-iterative method of calculating image skew |
US20040145673A1 (en) * | 2003-01-15 | 2004-07-29 | Koichi Washisu | Camera and program |
US20040239775A1 (en) * | 2003-05-30 | 2004-12-02 | Canon Kabushiki Kaisha | Photographing device and method for obtaining photographic image having image vibration correction |
US7672503B2 (en) * | 2003-08-29 | 2010-03-02 | Sony Corporation | Direction-recognizing apparatus, direction-recognizing method, direction-recognizing system, and robot apparatus |
US20050166054A1 (en) * | 2003-12-17 | 2005-07-28 | Yuji Fujimoto | Data processing apparatus and method and encoding device of same |
US20070236579A1 (en) * | 2006-01-19 | 2007-10-11 | Jingqiang Li | Hand jitter reduction for compensating for linear displacement |
US20070166020A1 (en) * | 2006-01-19 | 2007-07-19 | Shuxue Quan | Hand jitter reduction system for cameras |
US20070171981A1 (en) * | 2006-01-25 | 2007-07-26 | Yingyong Qi | Projection based techniques and apparatus that generate motion vectors used for video stabilization and encoding |
US20070237514A1 (en) * | 2006-04-06 | 2007-10-11 | Eastman Kodak Company | Varying camera self-determination based on subject motion |
US20080292171A1 (en) * | 2007-05-25 | 2008-11-27 | Herbert Bruder | Method and X-ray CT system for generating computed tomography displays |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070166020A1 (en) * | 2006-01-19 | 2007-07-19 | Shuxue Quan | Hand jitter reduction system for cameras |
US20070236579A1 (en) * | 2006-01-19 | 2007-10-11 | Jingqiang Li | Hand jitter reduction for compensating for linear displacement |
US7970239B2 (en) * | 2006-01-19 | 2011-06-28 | Qualcomm Incorporated | Hand jitter reduction compensating for rotational motion |
US8019179B2 (en) * | 2006-01-19 | 2011-09-13 | Qualcomm Incorporated | Hand jitter reduction for compensating for linear displacement |
US8120658B2 (en) | 2006-01-19 | 2012-02-21 | Qualcomm Incorporated | Hand jitter reduction system for cameras |
US20110216157A1 (en) * | 2010-03-05 | 2011-09-08 | Tessera Technologies Ireland Limited | Object Detection and Rendering for Wide Field of View (WFOV) Image Acquisition Systems |
US20110249739A1 (en) * | 2010-04-12 | 2011-10-13 | Sony Corporation | Context adaptive directional intra prediction |
US8743957B2 (en) * | 2010-04-12 | 2014-06-03 | Sony Corporation | Context adaptive directional intra prediction |
WO2012093393A1 (en) * | 2011-01-07 | 2012-07-12 | Seal Mobile Id Ltd | Method and system for unobtrusive mobile device user recognition |
US8723959B2 (en) | 2011-03-31 | 2014-05-13 | DigitalOptics Corporation Europe Limited | Face and other object tracking in off-center peripheral regions for nonlinear lens geometries |
US8493459B2 (en) | 2011-09-15 | 2013-07-23 | DigitalOptics Corporation Europe Limited | Registration of distorted images |
US8493460B2 (en) * | 2011-09-15 | 2013-07-23 | DigitalOptics Corporation Europe Limited | Registration of differently scaled images |
US20150112135A1 (en) * | 2012-06-28 | 2015-04-23 | Olympus Corporation | Image processing apparatus, image processing method, and computer-readable recording medium |
US9743825B2 (en) * | 2012-06-28 | 2017-08-29 | Olympus Corporation | Image processing apparatus, image processing method, and computer-readable recording device |
US9262807B2 (en) | 2012-07-03 | 2016-02-16 | Fotonation Limited | Method and system for correcting a distorted input image |
US8928730B2 (en) | 2012-07-03 | 2015-01-06 | DigitalOptics Corporation Europe Limited | Method and system for correcting a distorted input image |
US9165215B2 (en) * | 2013-01-25 | 2015-10-20 | Delta Electronics, Inc. | Method of fast image matching |
US20140212052A1 (en) * | 2013-01-25 | 2014-07-31 | Delta Electronics, Inc. | Method of fast image matching |
WO2019013603A1 (en) * | 2017-07-14 | 2019-01-17 | 주식회사 엘지화학 | Method for analyzing polymer layer |
KR20190008025A (en) * | 2017-07-14 | 2019-01-23 | 주식회사 엘지화학 | Method for analyzing polymer layer |
CN110785665A (en) * | 2017-07-14 | 2020-02-11 | 株式会社Lg化学 | Method for analyzing polymer film |
KR102176230B1 (en) | 2017-07-14 | 2020-11-09 | 주식회사 엘지화학 | Method for analyzing polymer layer |
US11145049B2 (en) | 2017-07-14 | 2021-10-12 | Lg Chem, Ltd. | Method for analyzing polymer membrane |
Also Published As
Publication number | Publication date |
---|---|
WO2007085007A2 (en) | 2007-07-26 |
US7970239B2 (en) | 2011-06-28 |
WO2007085007A3 (en) | 2008-03-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7970239B2 (en) | Hand jitter reduction compensating for rotational motion | |
US8581992B2 (en) | Image capturing apparatus and camera shake correction method, and computer-readable medium | |
US9118840B2 (en) | Image processing apparatus which calculates motion vectors between images shot under different exposure conditions, image processing method, and computer readable medium | |
US9883125B2 (en) | Imaging systems and methods for generating motion-compensated high-dynamic-range images | |
US9558543B2 (en) | Image fusion method and image processing apparatus | |
JP3770271B2 (en) | Image processing device | |
US9639913B2 (en) | Image processing device, image processing method, image processing program, and storage medium | |
JP5017419B2 (en) | Image generating apparatus, image generating method, and program | |
US20080143840A1 (en) | Image Stabilization System and Method for a Digital Camera | |
US8019179B2 (en) | Hand jitter reduction for compensating for linear displacement | |
JP5612017B2 (en) | Camera shake reduction system | |
KR101109532B1 (en) | Image capturing device, image capturing method, and a storage medium recording thereon a image capturing program | |
US11816858B2 (en) | Noise reduction circuit for dual-mode image fusion architecture | |
JP7060634B2 (en) | Image pickup device and image processing method | |
US8836800B2 (en) | Image processing method and device interpolating G pixels | |
US8731327B2 (en) | Image processing system and image processing method | |
US20080107358A1 (en) | Image Processing Apparatus, Image Processing Method, and Computer Program | |
US20050134713A1 (en) | Method of processing a digital image | |
WO2016154873A1 (en) | Terminal device and photographing method | |
US11803949B2 (en) | Image fusion architecture with multimode operations | |
US20220044371A1 (en) | Image Fusion Architecture |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: QUALCOMM INCORPORATED, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:QUAN, SHUXUE;JIANG, XIAOYUN;LI, JINGQIANG;REEL/FRAME:018719/0063 Effective date: 20061218 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 12 |