US10491832B2 - Image capture device with stabilized exposure or white balance - Google Patents

Image capture device with stabilized exposure or white balance

Info

Publication number
US10491832B2
Authority
US
United States
Prior art keywords
roi
image
image capture
jitter
capture device
Prior art date
Legal status
Active
Application number
US15/678,610
Other versions
US20190058821A1 (en)
Inventor
Jisoo Lee
Current Assignee
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date
Filing date
Publication date
Application filed by Qualcomm Inc
Priority to US15/678,610
Assigned to QUALCOMM INCORPORATED. Assignors: LEE, JISOO
Priority to CN201880052452.0A
Priority to PCT/US2018/039142 (WO2019036112A1)
Publication of US20190058821A1
Application granted
Publication of US10491832B2


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/72Combination of two or more compensation controls
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/671Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • H04N5/2352
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635Region indicators; Field of view indicators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/672Focus control based on electronic image sensor signals based on the phase difference signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/673Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • H04N23/6812Motion detection based on additional sensors, e.g. acceleration sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • H04N23/6815Motion detection by distinguishing pan or tilt from motion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682Vibration or motion blur correction
    • H04N23/683Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682Vibration or motion blur correction
    • H04N23/684Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by controlling the integration time
    • H04N23/6842Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by controlling the integration time by controlling the scanning position, e.g. windowing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • H04N5/23258
    • H04N5/23267

Definitions

  • This disclosure relates to imaging devices generally, and more specifically to automated image capture control.
  • Digital image capture devices (e.g., cameras in cell phones, tablets, and laptop computers) use sophisticated image capture and signal processing techniques to render high-quality video and still images in spite of their small sensor size.
  • Properly capturing the image data is fundamental to video and still photography.
  • Digital image capture devices automatically focus the lens for sharpness, set the exposure time for accurate light levels, and adjust the white balance so colors can be reproduced accurately regardless of the color temperature of the light source.
  • Hand-held images may be blurred or have artifacts due to camera motion, particularly in low-light settings.
  • A method for image capture control may include computing an electronic image stabilization (EIS) compensation based on image data from an image capture device, the image data having a first region of interest (ROI).
  • A jitter or panning shift is computed based on the EIS compensation.
  • The first ROI is adjusted, based on the jitter or panning shift, for use in at least one of an automatic white balance process and an automatic exposure process of the image capture device.
  • An image capture control apparatus may include an electronic image stabilizer for outputting a first signal based on image data from an image capture device.
  • The image data have a first region of interest (ROI).
  • A jitter estimator is configured to compute a jitter or panning shift based on the first signal.
  • An ROI adjuster is configured to compute an adjusted ROI based on the jitter or panning shift, for use in at least one of a white balance process or an automatic exposure process of the image capture device.
  • An image capture control apparatus may include means for outputting a first signal based on image data from an image capture device, the image data having a first region of interest (ROI).
  • A means for estimating a jitter or panning shift based on the first signal is provided.
  • A means for computing an adjusted ROI based on the jitter or panning shift is also provided.
  • The adjusted ROI is for use in at least one of a white balance process and an automatic exposure process of the image capture device.
  • A non-transitory, machine-readable storage medium can be encoded with program code for image capture control.
  • The program code comprises: code for computing an electronic image stabilization (EIS) compensation based on image data from an image capture device, the image data having a first region of interest (ROI); code for computing a jitter or panning shift based on the EIS compensation; and code for adjusting the first ROI, based on the jitter or panning shift, for use in at least one of an automatic white balance process and an automatic exposure process of the image capture device.
  • FIG. 1 is a block diagram of an exemplary image capture device having stabilized automatic exposure or stabilized automatic white balance.
  • FIGS. 2A-2D are diagrams showing how jitter can affect automatic exposure or automatic white balance, and how the image capture device of FIG. 1 can compensate for the jitter.
  • FIG. 3A is a high level flow chart of a method for stabilizing automatic exposure or stabilizing automatic white balance using the image capture device of FIG. 1 .
  • FIG. 3B is a detailed schematic diagram showing a method for stabilizing automatic exposure or stabilizing automatic white balance using the image capture device of FIG. 1 .
  • FIG. 3C is a diagram showing an example of different regions of interest for automatic white balance, autofocus, and automatic exposure.
  • FIG. 4 is a high level flow chart of the jitter estimation method of FIGS. 3A and 3B .
  • FIG. 5 shows an example of the jitter estimation method of FIG. 4 applied to a jitter profile indicating the image capture device is translating without rotating.
  • FIG. 6 is a diagram of the jitter stabilized automatic white balance, automatic exposure, or automatic focus statistics engine of FIG. 3B .
  • FIG. 7 is a detailed flow chart of an exemplary jitter estimation computation.
  • This disclosure provides image capture devices with jitter stabilized automatic exposure (AE) and/or jitter stabilized automatic white balance (AWB).
  • In complementary metal-oxide-semiconductor (CMOS) image capture devices, a series of images of the same scene or subject captured in a short period of time can show uneven light levels and color temperature.
  • Color temperature is the temperature of an ideal black-body that radiates light having a color distribution comparable to the color distribution of the source of light reflected off the subject during image capture. Light level and color temperature may exhibit fluctuations, even though the time period was too short for the ambient lighting conditions to change.
  • Jitter is camera movement due to camera shake (e.g., due to hand shake or tripod vibration). Jitter can be distinguished from panning motion: jitter has a smaller amplitude than panning motion. Jitter may include translations in three directions and rotations in three directions. An example of jitter motion includes from one to three translations on the order of 1 cm or less, and from one to three rotations on the order of 5-10°. Jitter can change the scene at the edges of the field of view (FOV) of the image capture device. Jitter can also affect focus at slow shutter speeds, for a given effective focal length of the image capture device. The direction of jitter motion can oscillate irregularly at a variable frequency (e.g., from 25 to 150 Hz).
  • Panning changes the scene within the FOV of the image capture device, and may change the entire scene or a portion of the scene.
  • Panning primarily includes a rotation about the yaw axis.
  • The yaw axis is parallel to the short sides of the image capture device in landscape mode, and is parallel to the long sides of the image capture device in portrait mode.
  • A panning motion may include any yaw rotation from zero to 360 degrees, such as 90° or 180°, for example.
  • Panning often has a larger amplitude than jitter and usually is not a repeating motion.
  • In some cases, panning includes rotation about both the pitch and yaw axes (e.g., for hand-held panning, or panning while the image capture device is mounted on a tripod having a ball-type head).
  • The pitch axis is parallel to the long sides of the image capture device in landscape mode, and is parallel to the short sides of the image capture device in portrait mode.
  • Panning can also include a rotation about the roll axis (e.g., switching between landscape and portrait mode) or translations in up to three directions. Jitter can also be present during a panning motion. If the image capture device is hand-held, up to six jitter components (three translations, three rotations) may be present.
  • Jitter can cause objects to enter or exit the edges of the image frame during a series of frames. If the object entering or exiting the image frame has a different light level or light source from other subjects in the frame, the entry or exit of the object at the edge changes the overall light level and distribution of red, green and blue light in the image (“the histogram”).
  • The AE and AWB systems respond to the changes in the histogram by changing the exposure and white balance settings between images, even though lighting conditions have not changed.
  • The change in the exposure and white balance settings between image frames can be sufficiently large to be detected by a user's unaided eyes.
  • Jitter encompasses image capture device motion resulting from hand shake, floor vibrations, wind, loud sounds, vehicle motion, accidental contact with the image capture device or with a tripod or selfie stick on which the image capture device is mounted, or other causes.
  • The AE/AWB stabilization techniques described below can be applied in mobile devices and dedicated cameras. Although the examples below include both AE stabilization and AWB stabilization, the motion compensation can be applied to either AE or AWB alone, or to AE and/or AWB in combination with autofocus (AF).
  • The AE stabilization and AWB stabilization can be applied to video or still images.
  • FIG. 1 is a schematic block diagram of an exemplary image capture device 100 including at least one processor 160 linked to video or still camera optics and sensor 115 for capturing images.
  • The processor 160 is also in communication with a working memory 105, instruction memory 130, storage device 110, and an optional electronic display 125.
  • The image capture device 100 and the AE/AWB stabilization techniques described herein can provide advantages in many different types of portable and stationary computing devices.
  • The image capture device 100 can also be implemented in a special-purpose video or still camera or a multi-purpose device capable of performing imaging and non-imaging applications.
  • For example, image capture device 100 can be a portable personal computing device such as a mobile phone, digital camera, tablet computer, laptop computer, personal digital assistant, or the like.
  • The image capture device 100 has video or still camera optics and sensor 115, including an image sensor having an array of pixels. Each pixel has a microlens and, in the case of a color image sensor, color filters, including at least two types of phase detection (PD) pixels.
  • The video or still camera optics and sensor 115 can also have a primary focusing mechanism that is positionable based, at least partly, on data received from the image signal processor 120.
  • The focusing mechanism positions the optics to produce a focused image of a region of interest (ROI) in the target scene.
  • In some embodiments, video or still camera optics and sensor 115 includes multiple lenses and multiple sensors.
  • The lenses can include front-facing and/or back-facing lenses.
  • The sensors can include color and/or monochrome sensors.
  • The at least one processor 160 can include multiple processors, such as an image signal processor 120 and a device processor 150.
  • Alternatively, the processor 160 has a single central processing unit that performs image signal processing and other operations.
  • Image signal processor 120 can include one or more dedicated image signal processors (ISPs) or a software implementation programmed on a general purpose processor.
  • The image signal processor 120 performs phase detection operations.
  • Alternatively, an image sensor (e.g., image sensor 200 shown in FIGS. 2A-2D, described below) can be configured to perform the phase detection operations.
  • The image signal processor 120 can control image capture parameters, including but not limited to autofocus and auto-exposure.
  • The image signal processor 120 can also be configured to perform various image capture operations on received image data in order to execute phase-detection autofocus (PDAF), contrast autofocus, automatic exposure, and automatic white balance.
  • Image signal processor 120 can be a general purpose processing unit or a processor specially designed for imaging applications. Image signal processor 120 also performs several image processing operations including demosaicing, noise reduction and cross-talk reduction.
  • In some embodiments, the image signal processor 120 also performs post-processing functions, such as cropping, scaling (e.g., to a different resolution), image stitching, image format conversion, color processing, image filtering (e.g., spatial image filtering), lens artifact or defect correction, sharpening, or the like.
  • The image signal processor 120 is connected to an instruction memory 130 for storing instructions and to a working memory 105.
  • The instruction memory 130 stores a capture control block 135, an autofocus block 140, an automatic white balance block 141, an automatic exposure block 142, an image processing block 143, an electronic image stabilization block 144, a jitter estimation block 146, a region of interest (ROI) adjuster block 147, and an operating system block 145.
  • The instruction memory 130 can include a variety of additional modules that configure the image signal processor 120 or device processor 150 to perform various image processing and device management tasks.
  • Working memory 105 can be used by image signal processor 120 to store a working set of instructions loaded from the function blocks in the instruction memory 130 .
  • Working memory 105 can also be used by image signal processor 120 to store dynamic data created during the operation of image capture device 100 .
  • In some embodiments, the instruction memory 130 comprises flash memory and the working memory 105 comprises dynamic random access memory (DRAM).
  • The capture control block 135 can include instructions that configure the image signal processor 120 to adjust the lens position, set the exposure time and sensor gain, and configure the white balance filter of the image capture device 100, in response to instructions generated during an autofocus operation, for example.
  • Capture control block 135 can further include instructions that control the overall image capture functions of the image capture device 100 .
  • For example, capture control block 135 can call the autofocus block 140 to calculate lens or sensor movement to achieve a desired autofocus position and output a lens control signal to the lens.
  • Autofocus (AF) block 140 adjusts the lens position, so that a region of interest (ROI) within the field of view (FOV) of the sensor, corresponding to one or more focus point(s), is focused in the plane of the sensor.
  • The focus point (and thus the AF ROI) can be manually selected by the user, or the image capture device 100 can select one or more focus points based on which object(s) is (are) nearest to the image capture device 100.
  • AF block 140 stores instructions for executing phase detection autofocus (PDAF), contrast autofocus, or laser autofocus.
  • PDAF divides incoming light into pairs of images and captures the divided light rays coming from the opposite sides of the lens, creating a rangefinder.
  • The two images are then analyzed to find a separation (phase) error and determine whether the ROI is in focus in the sensor plane.
  • Contrast AF moves the lens through its focal range, stopping at the point where maximal contrast is detected between adjacent pixels.
  • Laser AF shines a laser or light emitting diode (LED) on the subject and calculates the distance based on the time of flight.
  • AF finds the optimal lens position for bringing a region of interest (ROI) into focus in the plane of the sensor.
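  • The hill-climbing contrast AF described above can be illustrated with a short sketch. The capture_frame(position) callback and the gradient-energy focus metric below are assumptions for illustration; the patent does not specify a particular focus metric.

```python
import numpy as np

def contrast_score(roi):
    """Focus metric: mean gradient energy of the AF ROI.
    Sharper (better focused) images score higher."""
    gy, gx = np.gradient(roi.astype(np.float64))
    return float(np.mean(gx**2 + gy**2))

def contrast_autofocus(capture_frame, lens_positions):
    """Hill climb: step the lens through its range and stop once the
    contrast between adjacent pixels passes its peak."""
    best_pos = lens_positions[0]
    best_score = contrast_score(capture_frame(best_pos))
    for pos in lens_positions[1:]:
        score = contrast_score(capture_frame(pos))
        if score < best_score:
            break                      # contrast peak passed; stop climbing
        best_pos, best_score = pos, score
    return best_pos
```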
  • The region of interest for AF ("the AF ROI") can be a small fraction of the image area (e.g., 10% to 15%).
  • The AF ROI can be manually selected (e.g., by tapping on an ROI in the display) or automatically selected (e.g., by face detection).
  • The AF ROI can be located at the position of any of the focus points of the image capture device 100.
  • Some mobile devices use information from imaging pixels for phase detection autofocus (PDAF), allowing thousands of different regions throughout the FOV of the lens to serve as AF ROI.
  • AWB block 141 determines the color correction to be applied globally to the image so that neutral objects are rendered as gray.
  • AWB block 141 determines the illuminating light source under which an image was captured, and scales the components (e.g., R, G, and B) of the image so they conform to the light in which the image is to be displayed or printed. Because the same color temperature is applied globally throughout the image, the ROI for AWB ("the AWB ROI") can include the entire field of view (FOV) of the sensor.
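  • The patent does not name a specific AWB algorithm; as one illustration, a gray-world estimate over the AWB ROI scales the R, G, and B channels so that neutral objects average to gray. The function names and the green-channel normalization below are assumptions.

```python
import numpy as np

def gray_world_gains(rgb_roi):
    """Estimate per-channel gains from the AWB ROI so that the channel
    means become equal (neutral objects render as gray)."""
    means = rgb_roi.reshape(-1, 3).mean(axis=0)   # mean R, G, B over the ROI
    return means[1] / means                       # normalize to the green mean

def apply_white_balance(rgb_image, gains):
    """Apply the gains globally, as AWB block 141 applies one color
    correction to the whole image."""
    return np.clip(rgb_image.astype(np.float64) * gains, 0, 255).astype(np.uint8)
```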
  • Automatic exposure block 142 determines the length of time that the sensing elements integrate light before being read out.
  • The ambient light is metered, and an exposure time is selected.
  • AE determines the length of time the shutter is open. As the ambient light level increases, the selected exposure time becomes shorter. As the ambient light level decreases, the selected exposure time becomes longer.
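  • As a sketch of this inverse relationship, an AE update can scale the exposure time by the ratio of a target luminance to the metered luminance. The 18% gray target and the clamp range below are illustrative assumptions, not values from the patent.

```python
def update_exposure(current_exposure_s, metered_luma, target_luma=0.18):
    """Brighter scenes (higher metered luma) get shorter exposures;
    darker scenes get longer ones."""
    new_exposure = current_exposure_s * (target_luma / max(metered_luma, 1e-6))
    return min(max(new_exposure, 1e-4), 0.5)   # clamp to a plausible shutter range
```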
  • The available ROI used by the AE block ("the AE ROI") depends on the metering mode. Most image capture devices include one or more of the following metering modes: spot metering, partial metering, center-weighted metering, or multi-zone metering (also referred to as "matrix metering" and "evaluative metering").
  • The AE ROI for spot metering can constitute from about 1% to about 5% of the FOV of the sensor, centered around either the center focus point or the current focus point selected by the user.
  • The autofocus system chooses a lens position that focuses light from the current focus point in the plane of the image sensor. Spot metering ensures correct exposure of a specific spot in the image.
  • The AE ROI for partial metering is approximately 6% to 8% of the FOV of the sensor, centered around the active focus point(s). Partial metering can be used if the FOV has specific bright and dark areas, and the user only wants a specific region to affect the exposure.
  • The partial metering AE ROI contains, and is concentric with, the spot metering AE ROI.
  • Center-weighted metering uses an AE ROI covering most of the FOV, but the light levels measured near the center region of the sensor are given more weight (e.g., 60% to 85%) than the light levels measured at the periphery of the sensor.
  • The center-weighted metering AE ROI is much larger than the spot metering AE ROI or the partial metering AE ROI.
  • The center-weighted metering AE ROI contains the spot metering AE ROI and the partial metering AE ROI at its centroid, if the center focus point is being used for spot metering or partial metering.
  • The AE ROI for center-weighted metering can be different from, or may overlap, the AE ROI for spot metering or partial metering.
  • The regions containing the spot metering AE ROI and partial metering AE ROI are given the greatest weight in center-weighted metering.
  • Multi-zone metering uses an AE ROI including multiple regions of the image, and can use proprietary algorithms to analyze the image content and select an optimum exposure setting based on the light levels and the image content. Matrix metering can use empirical data to select an exposure setting considered optimal for a reference image having similar characteristics. Multi-zone metering can be used if the scene is fairly evenly lit.
  • The multi-zone metering AE ROI can include the entire FOV of the sensor, and thus can be larger than the AE ROIs of all of the other metering modes and can include additional regions within the FOV beyond those ROIs.
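  • A minimal sketch of the center-weighted metering described above follows, assuming a luma image, a center region covering the middle half of the frame in each dimension, and a 75% center weight (within the 60% to 85% range mentioned); these specifics are illustrative assumptions.

```python
import numpy as np

def center_weighted_luma(luma, center_weight=0.75):
    """Meter the scene with the center region weighted more heavily
    than the periphery."""
    h, w = luma.shape
    cy, cx = h // 4, w // 4
    center = luma[cy:h - cy, cx:w - cx]          # middle 50% x 50% of the frame
    periphery_sum = luma.sum() - center.sum()
    periphery_n = luma.size - center.size
    return (center_weight * center.mean()
            + (1.0 - center_weight) * periphery_sum / periphery_n)
```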
  • The image capture device 100 is configured to use three different ROIs: an AF ROI for performing autofocus, an AE ROI for performing automatic exposure, and an AWB ROI for performing automatic white balance. These ROIs can be different from each other, as shown in FIG. 3C and described below.
  • The AF ROI can include one or more regions, and can be on the order of 1% of the FOV or less.
  • The AE ROI can vary between 1% and 100% of the FOV of the sensor and contains the AF ROI.
  • Because the center-weighted AE method assigns different weights to different parts of the FOV, it is possible for the AE ROI to give very little weight to the region containing the AF ROI (e.g., in spot or partial metering modes), effectively using an ROI that does not overlap with the AF ROI.
  • The AWB ROI can include 100% of the FOV of the sensor.
  • The AWB ROI contains the AF ROI and contains or coincides with the AE ROI.
  • The AWB ROI may give little weight to the region of the FOV containing the AF ROI (e.g., in spot or partial metering modes).
  • The three blocks AF block 140, AWB block 141 and AE block 142 are included in the "3A" engine 136.
  • The 3A engine 136 includes functions that operate on raw image sensor data measured by the image sensor prior to capturing an image. To enhance the speed of focusing the lens, the 3A engine 136 can be implemented in a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC), but other embodiments may perform the 3A functions using software code executed by image signal processor 120.
  • Image processing block 143 can include several processing functions including demosaicing, noise reduction, cross-talk reduction, color processing, gamma adjustment, image filtering (e.g., spatial image filtering), lens artifact or defect correction, image sharpening, or the like.
  • the electronic image stabilization (EIS) block 144 provides a form of post-capture image processing.
  • The EIS block 144 provides a means for outputting a first signal (319, FIG. 3B), discussed below, based on image data from the image capture device 100.
  • The first signal 319 can contain data representing a motion vector of the image capture device 100.
  • The first signal (motion vector 319) indicates that EIS block 144 is applying motion compensation to an image frame.
  • EIS block 144 positions a cropped pixel array (“the image window”) within the total array of pixels on the sensor, as described below in the discussion of FIGS. 2A-2D .
  • The image window includes the pixels that are used to capture images.
  • The image window can include all of the pixels in the sensor, except for a few rows and columns at the periphery of the sensor.
  • The image window is in the center of the sensor while the image capture device 100 is stationary.
  • The peripheral pixels surround the pixels of the image window and form a set of buffer pixel rows and buffer pixel columns around the image window.
  • EIS block 144 shifts the image window from frame to frame of video, so that the image window tracks the same scene over successive frames, assuming that the subject does not move. (If the subject moves, the scene has changed; if the camera is held still except for jitter, EIS tracks the background over successive frames and shifts the image window in a similar manner. EIS cannot remove motion blur if the image capture device is panned too quickly for a given exposure speed.)
  • The image window can include at least 95% (e.g., 95% to 99%) of the pixels on the sensor.
  • The first ROI (used for AE and/or AWB) may comprise the image data within the field of view of at least 95% (e.g., 95% to 99%) of the plurality of imaging pixels in the image sensor 200 (FIGS. 2A-2D) of the image capture device 100 (FIG. 1).
  • A plurality of buffer pixels at the periphery of the sensor (outside of the image window) are reserved as a buffer to allow the image window to shift to compensate for jitter.
  • The image window can be moved so that the subject remains at the same location within the adjusted image window, even though light from the subject impinges on a different region of the sensor.
  • For example, the buffer pixels can include the ten topmost rows, ten bottommost rows, ten leftmost columns and ten rightmost columns of pixels on the sensor.
  • The buffer pixels are not used for AF, AE or AWB, and are not included in the image output. If jitter moves the sensor to the left by twice the width of a column of pixels between frames, the EIS block 144 can shift the image window to the right by two columns of pixels, so the captured image shows the same scene in the next frame as in the current frame. EIS block 144 thus smooths the transition from one frame to the next.
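  • The window-shifting behavior can be sketched as follows. The (top, left, height, width) window representation, the sign convention (the window moves opposite to the sensor motion), and the clamping to the buffer border are assumptions for illustration.

```python
import numpy as np

def shift_image_window(sensor_frame, window, jitter_dx, jitter_dy):
    """Slide the cropped image window within the sensor's buffer border
    so the scene stays at the same place in the output frame."""
    top, left, height, width = window
    max_top = sensor_frame.shape[0] - height
    max_left = sensor_frame.shape[1] - width
    # If jitter moved the sensor left/down, shift the window right/up.
    new_top = int(np.clip(top - jitter_dy, 0, max_top))
    new_left = int(np.clip(left - jitter_dx, 0, max_left))
    cropped = sensor_frame[new_top:new_top + height, new_left:new_left + width]
    return (new_top, new_left, height, width), cropped
```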
  • Jitter estimation block 146 receives inputs from EIS block 144 indicating detection of motion between the most recent two image frames received. Jitter estimation block 146 provides a means for estimating a jitter or panning shift based on the first signal output by the EIS block 144 . Optionally, jitter estimation block 146 can receive measurements from one or more inertial sensors (e.g., motion sensors, such as a gyroscope 116 and/or an accelerometer 117 ). EIS block 144 provides a first signal to the jitter estimation block 146 .
  • The gyroscope 116 provides a second signal to the jitter estimation block 146 indicating a rotation of the image capture device 100.
  • The accelerometer 117 provides a second signal to the jitter estimation block 146 indicating a translation of the image capture device 100.
  • Alternatively, the accelerometer 117 provides a signal to the jitter estimation block 146 indicating a translation of the image capture device 100, and the jitter estimation block 146 uses data from the EIS block 144 to determine the magnitude and direction of the translation.
  • The jitter estimation block 146 can compute a more general transformation including translation and rotation of the subject.
  • The jitter estimation block 146 uses an algorithm that accounts for signals from the EIS block 144 and the one or more inertial sensors (e.g., gyroscope 116 and/or accelerometer 117) to determine the jitter and rotation.
  • ROI adjuster block 147 compares the magnitude of the jitter motion to the size of the region of buffer pixels available at the periphery of the sensor. ROI adjuster block 147 provides a means for computing an adjusted ROI based on the jitter or panning shift, for use in a white balance process and/or an automatic exposure process of the image capture device 100 . ROI adjuster block 147 determines whether enough buffer pixels are available at the appropriate edge of the sensor for shifting the AE ROI, AWB ROI (and optionally, the AF ROI), corresponding to the image window shift initiated by the EIS block 144 .
  • In response to a determination that enough buffer pixels are available to compensate for the jitter, the ROI adjuster adds buffer pixels to one side of the image window, and removes the same number of pixels from the opposite side of the image window, effectively moving the AE ROI and AWB ROI to track the shift in the image window.
  • If not enough buffer pixels are available, the ROI adjuster can instead add virtual pixels to the one side of the ROI. The virtual pixels have luminance and chroma values extrapolated from the pixels at the edge of the sensor.
  • The ROI adjuster block 147 then removes the same number of pixels from the opposite side of the image window, effectively moving the ROI.
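  • A sketch of the ROI adjuster's buffer check follows, assuming axis-aligned rectangular ROIs in sensor coordinates. Here, the out-of-bounds case is only flagged; a fuller implementation would extrapolate virtual-pixel luma and chroma values from the sensor edge, as described above.

```python
def adjust_roi(roi, shift_x, shift_y, sensor_w, sensor_h):
    """Shift an ROI to track the EIS image-window shift, clamped to the
    physical sensor. Returns the shifted ROI and whether virtual
    (extrapolated) pixels would be needed to fill an out-of-bounds region."""
    left, top, right, bottom = roi
    w, h = right - left, bottom - top
    new_left, new_top = left + shift_x, top + shift_y
    needs_virtual_pixels = (new_left < 0 or new_top < 0 or
                            new_left + w > sensor_w or new_top + h > sensor_h)
    new_left = max(0, min(new_left, sensor_w - w))   # clamp into the sensor
    new_top = max(0, min(new_top, sensor_h - h))
    return (new_left, new_top, new_left + w, new_top + h), needs_virtual_pixels
```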
  • The AE block 142 and AWB block 141 provide means for computing an exposure time and/or a color temperature for an image frame using image data corresponding to the adjusted ROI.
  • Alternatively, the 3A engine 136 provides the means for computing an exposure time and/or a color temperature for an image frame using image data corresponding to the adjusted ROI.
  • The 3A engine 136 can incorporate AF block 140, AWB block 141 and AE block 142 in a single processing engine.
  • The AF block 140 provides a means for selecting an autofocus ROI to be used by the autofocus control system of the image capture device 100.
  • The exemplary 3A engine 136 is configured so that the AWB ROI and AE ROI can be different from each other, different from the first ROI (the original AF ROI), and different from the adjusted AF ROI.
  • The ROI adjuster block 147 (the means for computing an adjusted ROI based on the jitter or panning shift) is configured to select the adjusted AE ROI and AWB ROI to be larger than the autofocus ROI.
  • For example, the AE ROI and AWB ROI can include the entire FOV, while the AF ROI may be a spot occupying 5% of the FOV.
  • In some embodiments, the ROI adjuster block 147 adjusts the boundary of the ROI every time the EIS block 144 processes an image frame. In other embodiments, the ROI adjuster block 147 adjusts the ROI boundary less frequently. For example, ROI adjuster block 147 can adjust the ROI every second, third, or fourth image frame. In still other embodiments, the ROI adjuster block 147 adjusts the boundary of the ROI every time the EIS block 144 shifts the image window.
  • Operating system block 145 acts as an intermediary between programs and the processor 160 .
  • The operating system can be "APPLE iOS"™ from Apple, Inc. of Cupertino, Calif., or any other operating system for a mobile phone or tablet.
  • Alternatively, the operating system block 145 can be "WINDOWS"™, sold by Microsoft Corporation of Redmond, Wash., or any other operating system for a computer.
  • Operating system block 145 can include device drivers to manage hardware resources such as the video or still camera optics and sensor 115 . Instructions contained in the image processing blocks discussed above interact with hardware resources indirectly, through standard subroutines or application program interfaces (APIs) in operating system block 145 . Instructions within operating system block 145 can then interact directly with these hardware components. Operating system block 145 can further configure the image signal processor 120 to share information with device processor 150 .
  • Device processor 150 can be configured to control the display 125 to display the captured image or a preview of the captured image to a user.
  • The display 125 can be external to the image capture device 100 or can be part of the image capture device 100.
  • The display 125 can be configured to provide a view finder displaying a preview image prior to capturing an image.
  • The display 125 can comprise a liquid crystal display (LCD), light emitting diode (LED), or organic light emitting diode (OLED) screen, and can be touch sensitive.
  • Device processor 150 can write data to storage device 110 .
  • The data can include data representing captured images, data generated during phase detection, and/or metadata, e.g., exchangeable image file format (EXIF) data.
  • Although storage device 110 is represented schematically as a disk device, storage device 110 can be configured as any type of storage media device.
  • The storage device 110 can include a disk drive, such as an optical disk drive or magneto-optical disk drive, or a solid state memory such as a FLASH memory, random access memory (RAM), read-only memory (ROM), and/or electrically-erasable programmable ROM (EEPROM).
  • The storage device 110 can also include multiple memory units.
  • Although FIG. 1 shows an image capture device 100 having separate components to implement a processor 160 and working memory 105, these separate components can be combined in a variety of ways.
  • For example, the memory components can be combined with processor components in a system on a chip (SOC).
  • FIGS. 2A to 2D schematically demonstrate ROI adjustment as performed by the ROI adjuster block 147 .
  • FIG. 2A shows light from a scene including face 225 and light bulb 201 focused on the image sensor 200 .
  • The face 225 is centered within the image window 202 within the image sensor 200.
  • The individual pixels in image sensor 200 are omitted for ease of viewing.
  • The peripheral region 203 of the image sensor 200 contains buffer pixels (not shown), which are outside of the image window 202 when the image capture device 100 is stationary.
  • In this example, the image capture device 100 uses multi-zone metering, and the AE ROI and AWB ROI both coincide with the entire image window 202.
  • The face 225 is in the center of the image window 202.
  • A bright object (e.g., a light bulb 201) inside the image window contributes to the metered light level and lowers the average color temperature for a "warm" white balance.
  • Red and orange light have lower color temperatures than blue light, but red and orange light are customarily referred to as “warm” and blue light is customarily referred to as “cool”.
  • FIG. 2B shows the same image sensor 200 after jitter causes the image sensor 200 to move downward and to the right, causing light from the scene to impinge on image sensor 200 beyond the upper left corner of the original image window 202 .
  • The face 225 is off center with respect to image window 202, and the light bulb 201 is outside the image window 202.
  • The EIS block 144 updates the image window with an X shift and a Y shift.
  • The X shift includes adding buffer pixels in region 207 to the image window for the next frame and removing pixels in region 208 from the image window for the next frame.
  • The Y shift includes adding buffer pixels in region 206 to the image window for the next frame and removing pixels in region 209 from the image window for the next frame.
  • Adjusted image window 204 includes the image window 202, plus the added region 207, minus the removed region 208.
  • The adjusted image window 204 is indicated by the entire dashed square.
  • The light bulb 201 is again located within the adjusted image window 204, contributing to the light level.
  • Without the ROI adjustment, the AE block 142 would set the exposure time based on the original AE ROI (coinciding with the original image window 202 in this example), which no longer contains the light bulb 201.
  • The exposure time would be too long, over-exposing the image.
  • Similarly, the AWB block 141 would not take the warm light from light bulb 201 into account, and would set the color temperature to a value suitable for cooler lighting.
  • To avoid this, the ROI adjuster block 147 adds buffer pixels in regions 206 and 207 to the AE ROI and AWB ROI, and removes pixels in regions 208 and 209 from the AE ROI and AWB ROI, as shown in phantom, consistent with the change in the image window.
  • The light bulb 201 is inside the adjusted AE ROI and AWB ROI, corresponding to the adjusted image window 204.
  • The exposure time is reduced for proper exposure, and the white balance again reflects warmer light. This is a simplified example; the AE ROI can be (and frequently is) different from the AWB ROI and the image window.
  • FIG. 2C shows the same image sensor 200 imaging the same subject (i.e., face 225 ) as shown in FIG. 2A .
  • The light bulb 201 is outside of the image window. Assuming the face 225 is not illuminated by the light bulb 201, the light from light bulb 201 does not contribute substantially to the light level metered in the image window 202. Without the "warm" light from the light bulb 201, the AWB block 141 sets the white balance for "cool" light.
  • FIG. 2D shows image sensor 200 after jitter causes the image sensor 200 to move upward and to the right, causing light from the scene to impinge on image sensor 200 beyond the lower left corner of the original image window 202 .
  • The face 225 is off-center, and the light bulb is now within the original image window 202.
  • The EIS block 144 updates the image window with an X shift and a Y shift.
  • The X shift includes adding buffer pixels in region 218 to the image window for the next frame and removing pixels in region 216 from the image window for the next frame.
  • The Y shift includes adding buffer pixels in region 217 to the image window for the next frame and removing pixels in region 219 from the image window for the next frame.
  • The image window shifts downward and leftward to keep the face 225 in the same location relative to the adjusted image window 214 (as shown in phantom).
  • The light bulb 201 is located outside the adjusted image window 214, reducing the light level and removing the "warm" light.
  • Without the ROI adjustment, the AE block 142 would set the exposure time based on the original AE ROI (coinciding with original image window 202), which now contains the light bulb 201.
  • The AE would set the exposure time too short, and the white balance setting would be suitable for "warm" light, even though the light bulb is not inside the adjusted image window 214.
  • To avoid this, the ROI adjuster block 147 adds buffer pixels in regions 216 and 217 to the AE ROI and AWB ROI, and removes pixels in regions 218 and 219 from the AE ROI and AWB ROI, as shown in phantom, consistent with the change in the image window 214.
  • As a result, the light bulb 201 is outside the adjusted AE ROI.
  • The AE block 142 sets the exposure properly.
  • The ROI adjuster block 147 also updates the AWB ROI, so the color temperature is measured for a cooler light source, without the warm light of the light bulb 201.
  • FIG. 3A is a flow chart of an exemplary method for adjusting the AE ROI or AWB ROI for image capture control.
  • The 3A engine 136 receives sensor data from at least one inertial sensor (e.g., gyroscope 116 and/or accelerometer 117) indicating that image capture device 100 is moving (e.g., translating or rotating).
  • EIS block 144 computes an electronic image stabilization (EIS) compensation for the current (first) image frame based on image data from the image capture device 100 .
  • The image data has a first region of interest (ROI) for AE and AWB operations.
  • The EIS block 144 adjusts the location of the first image window 202 within the image sensor 200, so that the subject in the first frame remains at the same location within the image window that the subject had in a previous frame, even though the sensor has moved relative to the subject between the previous frame and the first frame. Because the 3A engine 136 (FIG. 1) computes statistics before the first image frame is captured, and EIS block 144 (FIG. 1) computes its compensation from the captured first frame, the ROI shift computed based on the first frame is provided to the 3A engine 136 for use by AF block 140, AWB block 141 and AE block 142 (all shown in FIG. 1) in generating 3A statistics for the second frame. That is, the shift identified by EIS block 144 is used to compensate for motion in the image data of the current (first) frame and provided to 3A engine 136 (FIG. 1) to adjust the ROIs for 3A statistics generation for the next (second) frame.
  • Thus, the application of the shift to the ROI for 3A statistics can lag one frame behind the application of the same shift amount to the image window for motion compensation.
  • Jitter estimation block 146 computes a jitter or panning shift based on the EIS compensation.
  • Based on the EIS compensation, the motion of the image capture device 100 can be divided into a panning shift component and a jitter component by analyzing the acceleration components and/or the rotation components. For example, a rotation about a vertical axis without any horizontal acceleration indicates panning. The difference between the total detected motion and the panning rotation is due to jitter.
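  • One plausible realization of this decomposition, sketched below, low-pass filters a gyroscope yaw-rate trace: the slow moving-average trend approximates panning, and the oscillating residual approximates jitter. The moving-average filter and window length are assumptions; the patent only requires analyzing the acceleration and/or rotation components.

```python
import numpy as np

def split_pan_and_jitter(yaw_rate, window=30):
    """Separate a yaw-rate trace (samples from the gyroscope) into a slow
    panning component and a fast, oscillating jitter component."""
    kernel = np.ones(window) / window
    pan = np.convolve(yaw_rate, kernel, mode="same")   # low-frequency trend
    jitter = yaw_rate - pan                            # residual oscillation
    return pan, jitter
```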
  • The ROI adjuster block 147 adjusts the first ROI for use in an AWB process or an AE process of the image capture device 100, or for use in both the AWB process and the AE process.
  • The ROI adjuster block 147 can adjust the first ROI to compensate for the jitter component, without compensating for the panning shift component. This allows AWB block 141 and/or AE block 142 to change the color temperature and/or exposure time to match the change in scene due to panning, without introducing additional fluctuations in color temperature and/or exposure time due to jitter.
  • Alternatively, the ROI adjuster block 147 can adjust the first ROI to compensate for both the jitter component and the panning shift component.
  • The ROI adjustment may include adjusting a second ROI (e.g., the AF ROI) for use in AF control, where the first ROI (e.g., the AWB ROI for AWB and/or the AE ROI for AE) and the second ROI (for AF) for a single image frame are different from each other.
  • The first ROI is often larger than the second ROI.
  • For example, the first ROI (AWB ROI 370) can include the entire FOV of the image sensor.
  • The second ROI (e.g., AE ROI 372) depends on the metering mode, and may be about 6-8% of the FOV for partial metering, as shown in FIG. 3C.
  • The AF ROI 374 can be about 15% of the FOV, and can be located with the focus point 373 centered within the AF ROI.
  • In FIG. 3C, the AF ROI 374 and the AE ROI 372 are different sizes and do not overlap each other, while the AWB ROI 370 occupies the entire FOV of the image capture device 100 and contains the AF ROI 374 and the AE ROI 372.
  • FIG. 3B is a detailed schematic diagram of the operation of the image capture device 100 of FIG. 1 .
  • The user activates the focusing operation, e.g., by tapping a point on the display 125 of the image capture device 100 to select a focus ROI, or a camera application activates the focusing operation on its own by, for example, face detection.
  • In the case of image capture devices having a motion (inertial) sensor 302 (e.g., a gyroscope 116 or accelerometer 117, FIG. 1), the inertial sensor 302 provides a signal 303 to the 3A engine 308 and provides a signal 305 to the jitter estimation block 324.
  • The sensor driver 304 operates the image sensor 306 to collect sensor data for autofocusing the lens.
  • The sensor data can include PD pixel values and/or imaging pixel values.
  • The image sensor 306 outputs raw analog sensor data to front end processing block 310.
  • The front end processing block 310 conditions the analog signal received from the image sensor and performs analog-to-digital (A/D) conversion.
  • For example, the front end processing block 310 can clamp the input signal, sample the clamped input signal, filter out thermal noise, attenuate low frequency drift, and convert the raw sensor data from analog format to digital format in an analog-to-digital converter (ADC), not shown.
  • The raw digital data undergoes demosaicing (color filter array interpolation) to reconstruct a full red-green-blue (RGB) color image.
  • The front end processing block 310 can then perform a mathematical coordinate transformation from the RGB color space to an associated YCbCr (luminance, blue difference, red difference) color space.
  • The YCbCr space separates the luminance value from the color values.
  • The luminance signal (Y) can be transmitted at high bandwidth or stored with high resolution, and the two color components (Cb and Cr) can be bandwidth-reduced, subsampled, and compressed for more efficient transmission/storage.
  • The YCbCr data 311 are output from the front end processing block 310.
  • The front end processing block 310 outputs the YCbCr data 311 to the YCbCr buffers 314 and to the jitter stabilized 3A statistics engine 312.
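  • The RGB-to-YCbCr transformation can be illustrated with the standard BT.601 coefficients; the patent does not name a specific standard, so BT.601 is an assumption.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """BT.601 RGB -> YCbCr: separates luminance (Y) from the two chroma
    difference components (Cb, Cr), which can then be subsampled."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 + 0.564 * (b - y)       # blue-difference chroma
    cr = 128.0 + 0.713 * (r - y)       # red-difference chroma
    return np.stack([y, cb, cr], axis=-1)
```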
  • The image enhancements block 316 can perform operations such as cropping, scaling (e.g., to a different resolution), image format conversion, color filtering, sharpening, or the like.
  • The EIS block 318 compares successive frames to determine a camera motion vector for the current frame, and adjusts the boundaries of the image window on the imaging sensor based on two successive frames. For example, EIS block 318 can compute the shift for motion compensation of the image window for the current frame based on image data from the current frame and the previous frame. (As noted above, the application of the ROI shift to the 3A engine 308 lags one frame behind applying the shift to the image window for the corresponding motion compensation.) In general, the EIS compensation based on a first frame of image data is used to shift an image window of the first frame, and the adjusted ROI, which is adjusted based on the EIS compensation, is applied to a second frame of image data following the first frame of image data. EIS block 318 outputs the first signal (e.g., motion vector) 319 to the jitter estimation block 324.
  • The jitter estimation block 324 receives motion data 305 from the motion (inertial) sensor 302 (e.g., gyroscope 116 and/or accelerometer 117 of FIG. 1) and the first signal (e.g., calculated motion vector) 319 from EIS block 318.
  • The jitter estimation block 324 computes and outputs the estimated jitter 325 based on both the motion data 305 and the motion vector 319.
  • For example, jitter estimation block 324 can compute a weighted average of the motion data 305 from the inertial sensor 302 and the calculated motion vector 319 from the EIS block 318.
  • The ROI adjuster 320 receives the estimated jitter 325 and determines the adjusted AF ROI, AE ROI and AWB ROI to be applied to the next image frame.
  • The ROI adjuster outputs information 321, which can identify the new ROI boundaries, or identify the change in the ROI boundaries.
  • Block 312 receives the YCbCr image data from the front end processing block 310 and the new ROI boundaries (or the change in the ROI boundaries) from the ROI adjuster 320. Based on this information, jitter stabilized 3A statistics block 312 determines the pixels whose YCbCr image data are used by the 3A engine 308 for determining the AF, AE and AWB settings for the next image frame. Jitter stabilized 3A statistics block 312 computes the motion compensated image statistics (e.g., histogram data) using the adjusted image window and adjusted ROIs 321.
  • Video encoder block 322 converts raw (uncompressed) digital video to compressed video signals.
  • The video encoder block 322 can be an MPEG-4 (ISO/IEC 14496) encoder.
  • Video buffers and bitstream 326 store video or graphics information as it moves from the video encoder to the display screen or storage medium.
  • The video buffers gather all information before the information is to be displayed or stored, to accommodate any mismatch between the speed at which the encoded video is output from video encoder block 322 and the speed at which a display or storage medium can receive the video data.
  • Capture control block 135 of FIG. 1 is configured to allow the ROI adjuster to adjust the AF ROI, AE ROI and AWB ROI after each image frame. Capture control block 135 is also configured to initiate processing by 3A engine 308 before every succeeding image frame, based on the adjusted AF ROI, adjusted AE ROI and adjusted AWB ROI. As a result, the exposure and white balance can be maintained at proper levels for the scene, even while the camera is moving relative to the scene and the image window is shifting.
  • FIG. 4 is a flow chart of the jitter estimation block 324 .
  • Jitter estimation block 324 receives sensor data 305 from the inertial sensor 302 (e.g., gyroscope and/or accelerometer), shown in FIG. 3B.
  • Decision block 330 determines whether there is significant jitter present. Jitter is considered insignificant if the motion sensor output signals 305 cannot be distinguished from thermal noise.
  • If the jitter is insignificant, block 331 is executed, and the previous (uncompensated) ROI boundary is used.
  • If significant jitter is present, block 332 is executed, and a jitter-compensated ROI boundary is computed. Following execution of block 331 or 332, the adjusted ROI boundary 321 is output.
  • FIG. 5 is a flow chart of an example of a boundary change computation in the case where the jitter includes a translation without a rotation.
  • Jitter estimation block 324 receives sensor data 305 from the inertial sensor 302 (gyroscope and/or accelerometer), shown in FIG. 3B.
  • Decision block 330 determines whether there is significant jitter present.
  • If no significant jitter is present, block 341 is executed.
  • In block 341, the shift in the X direction is set to zero, and the shift in the Y direction is set to zero.
  • If significant jitter is present, block 342 is executed.
  • In block 342, the shift in the X direction is computed as a function f( ) based on the sensor data 305 and the first signal 319 output by EIS block 318.
  • The shift in the Y direction is computed as a function g( ) based on the sensor data 305 and the EIS block output (first signal 319).
  • The functions f( ) and g( ) are the techniques within the EIS block that take both the sensor data 305 and the EIS block output (first signal 319) into account in determining the motion compensation shift. Following execution of block 341 or 342, the adjusted ROI boundary 321 (including the shift in the X direction and the shift in the Y direction) is output.
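  • The patent leaves f( ) and g( ) unspecified. The sketch below treats them as a weighted blend of the inertial estimate and the EIS motion-vector estimate, gated by the significance test of decision block 330; the noise floor and blend weight are assumptions.

```python
def compute_roi_shift(inertial_dx, inertial_dy, eis_dx, eis_dy,
                      noise_floor=0.5, w_inertial=0.4):
    """Return (x_shift, y_shift) in pixels: zero when the motion cannot be
    distinguished from sensor noise, otherwise a blend of both sources."""
    if max(abs(inertial_dx), abs(inertial_dy),
           abs(eis_dx), abs(eis_dy)) < noise_floor:
        return 0, 0                                    # no significant jitter
    shift_x = w_inertial * inertial_dx + (1.0 - w_inertial) * eis_dx   # f( )
    shift_y = w_inertial * inertial_dy + (1.0 - w_inertial) * eis_dy   # g( )
    return round(shift_x), round(shift_y)
```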
  • FIG. 6 is a diagram of the jitter stabilized 3A statistics engine 312 of FIG. 3B .
  • the jitter stabilized 3A statistics engine 312 receives the YCbCr image data 311 for the image frame currently being captured and the adjusted ROI boundary 321 (including X shift and Y shift) for the next image frame to be captured.
  • the jitter stabilized 3A statistics engine 312 outputs the jitter compensated image statistics 313 to be used by the 3A (AE, AWB, AF) block 308 , which reflect the stabilized image window determined by EIS block 318 during processing of the current image frame.
  • FIG. 7 is a flow chart of an exemplary method for finding the X shift and Y shift in an image capture device 100 having a motion (inertial) sensor (e.g., gyroscope 116 and/or accelerometer 117 of FIG. 1 ).
  • a motion (inertial) sensor e.g., gyroscope 116 and/or accelerometer 117 of FIG. 1 .
  • the jitter estimation block 146 of FIG. 1 sums pixel values across each individual row of the imaging sensor for a first image frame to provide a first vector.
  • the jitter estimation block 146 of FIG. 1 sums pixel values across each individual column of the imaging sensor for the first image frame, to provide a second vector;
  • the jitter estimation block 146 of FIG. 1 sums pixel values across each individual row of the imaging sensor for a second image frame to provide a third vector.
  • the jitter estimation block 146 of FIG. 1 sums pixel values across each individual column of the imaging sensor for the second image frame, to provide a fourth vector.
  • the jitter estimation block 146 of FIG. 1 compares the first vector to the third vector.
  • the jitter estimation block 146 of FIG. 1 determines the Y shift from the difference vector between the first vector and the third vector.
  • the shift is indicated by the substantially non-zero values in the difference vector.
  • the jitter estimation block 146 of FIG. 1 compares the second vector to the fourth vector, to determine the jitter or panning shift.
  • the jitter estimation block 146 of FIG. 1 determines the X shift from the difference vector between the second vector and the fourth vector.
  • the shift is indicated by the substantially non-zero values in the difference vector.
  • the jitter estimation method of FIG. 7 is just one example, and other jitter estimation methods can be used.
  • the ROI adjuster block 147 can adjust the AE ROI, AWB ROI, AF ROI and image window to be different from each other.
  • the adjusted AWB ROI is set to be the same as the adjusted imaging window, to optimize the white balance across the entire image.
  • the methods described herein can provided jitter stabilized exposure and white balance for video and still images collected in burst mode.
  • the jitter stabilized AE adjustments and AWB adjustments ensure that the light levels and white balance of the collected image frames remain steady.
  • the methods and system described herein may be at least partially embodied in the form of computer-implemented processes and apparatus for practicing those processes.
  • the disclosed methods may also be at least partially embodied in the form of tangible, non-transitory machine readable storage media encoded with computer program code.
  • the media may include, for example, RAMs, ROMs, CD-ROMs, DVD-ROMs, BD-ROMs, hard disk drives, flash memories, or any other non-transitory machine-readable storage medium.
  • the computer program code When the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the method.
  • the methods may also be at least partially embodied in the form of a computer into which computer program code is loaded or executed, such that, the computer becomes a special purpose computer for practicing the methods.
  • the computer program code segments configure the processor to create specific logic circuits.
  • the methods may alternatively be at least partially embodied in application specific integrated circuits for performing the methods.

Abstract

A method for image capture control may include computing an electronic image stabilization (EIS) compensation based on image data from an image capture device, the image data having a first region of interest (ROI). A jitter or panning shift is computed based on the EIS compensation. The first ROI is adjusted for use in at least one of the group consisting of an automatic white balance process and an automatic exposure process of the image capture device based on the jitter or panning shift.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
None
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
None
BACKGROUND
Field
This disclosure relates to imaging devices generally, and more specifically to automated image capture control.
Description of Related Art
Digital image capture devices (e.g., cameras in cell phones, tablets, and laptop computers) use sophisticated image capture and signal processing techniques to render high quality video and still images in spite of their small sensor size. Properly capturing the image data is fundamental to video and still photography. For high quality images, digital image capture devices automatically focus the image capture device's lens for sharpness, set the exposure time for accurate light levels, and adjust the white balance so colors can be reproduced accurately regardless of the color temperature of the light source for the image.
Mobile devices, such as cell phones, tablets, and laptop computers, are often hand-held when capturing an image. Unlike digital single lens reflex (DSLR) cameras and other dedicated video and still cameras, mobile devices lack a mount for securing the mobile device on a tripod. Images captured by a hand-held image capture device (“hand-held images”) may be blurred or have artifacts due to camera motion, particularly in low-light settings.
SUMMARY
A method for image capture control may include computing an electronic image stabilization (EIS) compensation based on image data from an image capture device, the image data having a first region of interest (ROI). A jitter or panning shift is computed based on the EIS compensation. The first ROI is adjusted for use in at least one of the group consisting of an automatic white balance process and an automatic exposure process of the image capture device based on the jitter or panning shift.
An image capture control apparatus may include an electronic image stabilizer for outputting a first signal based on image data from an image capture device. The image data have a first region of interest (ROI). A jitter estimator is configured to compute a jitter or panning shift based on the first signal. An ROI adjuster is configured to compute an adjusted ROI based on the jitter or panning shift, for use in at least one of a white balance process or an automatic exposure process of the image capture device.
An image capture control apparatus may include means for outputting a first signal based on image data from an image capture device, the image data having a first region of interest (ROI). A means for estimating a jitter or panning shift is provided based on the first signal. A means for computing an adjusted ROI based on the jitter or panning shift is provided. The adjusted ROI is for use in at least one of the group consisting of a white balance process and an automatic exposure process of the image capture device.
A non-transitory, machine-readable storage medium can be encoded with program code for image capture control. The program code comprises: code for computing an electronic image stabilization (EIS) compensation based on image data from an image capture device, the image data having a first region of interest (ROI); code for computing a jitter or panning shift based on the EIS compensation; and code for adjusting the first ROI for use in at least one of the group consisting of an automatic white balance process and an automatic exposure process of the image capture device based on the jitter or panning shift.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a block diagram of an exemplary image capture device having stabilized automatic exposure or stabilized automatic white balance.
FIGS. 2A-2D are diagrams showing how jitter can affect automatic exposure or automatic white balance, and how the image capture device of FIG. 1 can compensate for the jitter.
FIG. 3A is a high level flow chart of a method for stabilizing automatic exposure or stabilizing automatic white balance using the image capture device of FIG. 1.
FIG. 3B is a detailed schematic diagram showing a method for stabilizing automatic exposure or stabilizing automatic white balance using the image capture device of FIG. 1.
FIG. 3C is a diagram showing an example of different regions of interest for automatic white balance, autofocus, and automatic exposure.
FIG. 4 is a high level flow chart of the jitter estimation method of FIGS. 3A and 3B.
FIG. 5 shows an example of the jitter estimation method of FIG. 4 applied to a jitter profile indicating the image capture device is translating without rotating.
FIG. 6 is a diagram of the jitter stabilized automatic white balance, automatic exposure, or automatic focus statistics engine of FIG. 3B.
FIG. 7 is a detailed flow chart of an exemplary jitter estimation computation.
DETAILED DESCRIPTION
This description of the exemplary embodiments is intended to be read in connection with the accompanying drawings, which are to be considered part of the entire written description. In the description, relative terms such as “lower,” “upper,” “horizontal,” “vertical,”, “above,” “below,” “up,” “down,” “top” and “bottom” as well as derivatives thereof (e.g., “horizontally,” “downwardly,” “upwardly,” etc.) refer to the orientation as then described or as shown in the drawing under discussion. These relative terms are for convenience of description and do not require that the apparatus be constructed or operated in a particular orientation.
This disclosure provides image capture devices with jitter stabilized automatic exposure (AE) and/or jitter stabilized automatic white balance (AWB).
As mobile devices have improved, many mobile devices now include cameras capable of capturing a series of image frames of the same scene or subject in a short period of time (e.g., video or still images in burst mode). In many image capture devices, a series of images of the same scene or subject captured in a short period of time show uneven light levels and color temperature. Color temperature is the temperature of an ideal black-body that radiates light having a color distribution comparable to the color distribution of the source of light reflected off the subject during image capture. Light level and color temperature may exhibit fluctuations, even though the time period was too short for the ambient lighting conditions to change.
Jitter is a camera movement due to camera shake (e.g., due to hand shake or tripod vibration). Jitter can be distinguished from panning motion by its smaller amplitude. Jitter may include translations in three directions and rotations in three directions. An example of jitter motion includes from one to three translations on the order of 1 cm or less, and from one to three rotations on the order of 5-10°. Jitter can change the scene at the edges of the field of view (FOV) of the image capture device. Jitter can also affect focus at slow shutter speeds, for a given effective focal length of the image capture device. The direction of jitter motion can oscillate with an irregular, variable frequency (e.g., from 25 to 150 Hz).
Panning changes the scene within the FOV of the image capture device, and may change the entire scene or a portion of the scene. In many cases, panning primarily includes a rotation about the yaw axis. The yaw axis is parallel to the short sides of the image capture device in landscape mode, and is parallel to the long sides of the image capture device in portrait mode. A panning motion may include any yaw rotation from zero to 360 degrees, such as 90° or 180°, for example. Panning often has a larger amplitude than jitter and usually is not a repeating motion. In some cases, panning includes rotation about both the pitch and yaw axes (e.g., for hand-held panning, or panning while the image capture device is mounted on a tripod having a ball-type head). The pitch axis is parallel to the long sides of the image capture device in landscape mode, and is parallel to the short sides of the image capture device in portrait mode. It is also possible for panning to include a rotation about the roll axis (e.g., switching between landscape and portrait mode) or translations in up to three directions. Jitter can also be present during a panning motion. If the image capture device is hand-held, up to six jitter components (three translation, three rotation) may be present.
Jitter can cause objects to enter or exit the edges of the image frame during a series of frames. If the object entering or exiting the image frame has a different light level or light source from other subjects in the frame, the entry or exit of the object at the edge changes the overall light level and distribution of red, green and blue light in the image (“the histogram”). The AE and AWB systems respond to the changes in the histogram by changing the exposure and white balance settings between images, even though lighting conditions have not changed. The change in the exposure and white balance setting between image frames can be sufficiently large to be detected by a user's unaided eyes.
As described below, hand-held images of a scene or subject captured in rapid succession have more consistent appearance when fluctuations in AE and AWB settings due to unintended image capture device motion (“jitter”) are reduced or eliminated. Jitter encompasses image capture device motion resulting from hand shake, floor vibrations, wind, loud sounds, vehicle motion, accidental contact with the image capture device or a tripod or selfie stick on which the image capture device is mounted, or other causes. The AE/AWB stabilization techniques described below can be applied in mobile devices and dedicated cameras. Although the examples below include both AE stabilization and AWB stabilization, the motion compensation can be applied to either AE or AWB alone, or to AE and/or AWB in combination with autofocus (AF). The AE stabilization and AWB stabilization can be applied to video or still images.
FIG. 1 is a schematic block diagram of an exemplary image capture device 100 including at least one processor 160 linked to video or still camera optics and sensor 115 for capturing images. The processor 160 is also in communication with a working memory 105, instruction memory 130, storage device 110 and an optional electronic display 125.
The image capture device 100 and AE/AWB stabilization techniques described herein can provide advantages in many different types of portable and stationary computing devices. The image capture device 100 can also be implemented in a special-purpose video or still camera or a multi-purpose device capable of performing imaging and non-imaging applications. For example, image capture device 100 can be a portable personal computing device such as a mobile phone, digital camera, tablet computer, laptop computer, personal digital assistant, or the like.
The image capture device 100 has video or still camera optics and sensor 115, including an image sensor having an array of pixels. Each pixel has a microlens and, in the case of a color image sensor, a color filter; the array includes at least two types of phase detection (PD) pixels. The video or still camera optics and sensor 115 can also have a primary focusing mechanism that is positionable based, at least partly, on data received from the image signal processor 120. The focusing mechanism positions the optics to produce a focused image of a region of interest (ROI) in the target scene. In some image capture devices 100, video or still camera optics and sensor 115 includes multiple lenses and multiple sensors. The lenses can include front-facing and/or back-facing lenses. The sensors can include color and/or monochrome sensors.
The at least one processor 160 can include multiple processors, such as an image signal processor 120 and a device processor 150. In other embodiments, the processor 160 has a single central processing unit that performs image signal processing and other operations. Image signal processor 120 can include one or more dedicated image signal processors (ISPs) or a software implementation programmed on a general purpose processor. In some examples, the image signal processor 120 performs phase detection operations. Alternatively, an image sensor 200 (shown in FIG. 2, described below) can be configured to perform the phase detection operations.
Referring again to FIG. 1, the image signal processor 120 can control image capture parameters, including but not limited to, autofocus and auto-exposure. The image signal processor 120 can also be configured to perform various image capture operations on received image data in order to execute phase-detection autofocus (PDAF), contrast autofocus, automatic exposure, and automatic white balance. Image signal processor 120 can be a general purpose processing unit or a processor specially designed for imaging applications. Image signal processor 120 also performs several image processing operations including demosaicing, noise reduction and cross-talk reduction. In some embodiments, the image signal processor 120 also performs post-processing functions, such as cropping, scaling (e.g., to a different resolution), image stitching, image format conversion, color processing, image filtering (e.g., spatial image filtering), lens artifact or defect correction, sharpening, or the like.
As shown in FIG. 1, the image signal processor 120 is connected to an instruction memory 130 for storing instructions and a working memory 105. The instruction memory 130 stores a capture control block 135, an autofocus block 140, an automatic white balance block 141, an automatic exposure block 142, an image processing block 143, an electronic image stabilization block 144, a jitter estimation block 146, a region of interest (ROI) adjuster block 147, and an operating system block 145. The instruction memory 130 can include a variety of additional modules that configure the image signal processor 120 of device processor 150 to perform various image processing and device management tasks. Working memory 105 can be used by image signal processor 120 to store a working set of instructions loaded from the function blocks in the instruction memory 130. Working memory 105 can also be used by image signal processor 120 to store dynamic data created during the operation of image capture device 100. For example, in some embodiments, the instruction memory 130 comprises flash memory, and the working memory 105 comprises dynamic random access memory (DRAM).
The capture control block 135 can include instructions that configure the image signal processor 120 to adjust the lens position, set the exposure time, sensor gain, and configure the white balance filter of the image capture device 100, in response to instructions generated during an auto focus operation, for example. Capture control block 135 can further include instructions that control the overall image capture functions of the image capture device 100. For example, capture control block 135 can call the autofocus block 140 to calculate lens or sensor movement to achieve a desired autofocus position and output a lens control signal to the lens.
Autofocus (AF) block 140 adjusts the lens position, so a region of interest (ROI) within the field of view (FOV) of the sensor, corresponding to one or more focus point(s), is focused in the plane of the sensor. The focus point (and thus the AF ROI) can be manually selected by the user, or the image capture device 100 can select one or more focus points based on which object(s) is (are) nearest to the image capture device 100. AF block 140 stores instructions for executing phase detection autofocus (PDAF), contrast autofocus, or laser autofocus. PDAF divides incoming light into pairs of images and captures the divided light rays coming from the opposite sides of the lens, creating a rangefinder. The two images are then analyzed to find a separation (phase) error and determine whether the ROI is in focus in the sensor plane. Contrast AF moves the lens through its focal range, stopping at the point where maximal contrast is detected between adjacent pixels. Laser AF shines a laser or light emitting diode (LED) on the subject and calculates the distance based on the time of flight.
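For illustration, the contrast AF sweep described above can be sketched as follows. This is a minimal sketch; the capture_at callback (returning a grayscale frame with the lens at a given position) and the adjacent-pixel contrast metric are assumptions introduced for the example, not part of this disclosure.

```python
import numpy as np

def contrast_metric(image: np.ndarray) -> float:
    # Sum of squared differences between horizontally adjacent pixels;
    # a sharply focused image yields larger adjacent-pixel differences.
    return float(np.sum(np.diff(image.astype(np.float64), axis=1) ** 2))

def contrast_autofocus(capture_at, lens_positions):
    # capture_at(p) is a hypothetical callback returning a frame captured
    # with the lens at position p. Sweep the focal range and stop at the
    # position where maximal contrast is detected.
    scores = [(contrast_metric(capture_at(p)), p) for p in lens_positions]
    return max(scores)[1]
```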
The region of interest for AF (“the AF ROI”) can be a small fraction of the image area (e.g. 10% to 15%). The AF ROI can be manually selected (e.g., by tapping on an ROI in the display), or automatically selected (e.g., by face detection). The AF ROI can be located at the position of any of the focus points of the image capture device 100. Some mobile devices use information from imaging pixels for phase detection autofocus (PDAF), allowing thousands of different regions throughout the FOV of the lens to serve as AF ROI.
Automatic white balance (AWB) block 141 determines the color correction to be applied globally to the image so that neutral objects can be rendered as gray. AWB block 141 determines the illuminating light source under which an image was captured, and scales the components (e.g., R, G, and B) of the image so they conform to the light in which the image is to be displayed or printed. Because the same color temperature is applied globally throughout the image, the ROI for AWB (“the AWB ROI”) can include the entire field of view (FOV) of the sensor.
Automatic exposure block 142 determines the length of time that the sensing elements integrate light before being read out. The ambient light is metered, and an exposure time is selected. In the case of a DSLR camera, AE determines the length of time the shutter is open. As the ambient light level increases, the selected exposure time becomes shorter. As the ambient light level decreases, the selected exposure time becomes longer. In many image capture devices 100, the available ROI used by the AE block (“the AE ROI”) depends on the metering mode. Most image capture devices include one or more of the following metering modes: spot metering, partial metering, center-weighted metering, or multi-zone metering (also referred to as “matrix metering” and “evaluative metering”).
The AE ROI for spot metering can constitute from about 1% to about 5% of the FOV of the sensor, either centered around the center focus point or the current focus point selected by the user. The autofocus system chooses a lens position that focuses light from the current focus point in the plane of the image sensor. Spot metering ensures correct exposure of a specific spot in the image.
The AE ROI for partial metering is approximately 6% to 8% of the FOV of the sensor, centered around the active focus point(s). Partial metering can be used if the FOV has specific bright and dark areas, and the user only wants a specific region to affect the exposure. The partial metering AE ROI contains, and is concentric with, the spot metering AE ROI.
Center-weighted metering uses an AE ROI covering most of the FOV, but the light levels measured near the center region of the sensor are given more weight (e.g., 60% to 85%) than the light levels measured at the periphery of the sensor. The center weighted metering AE ROI is much larger than the spot metering AE ROI or the partial metering AE ROI. The center weighted metering AE ROI contains the spot metering AE ROI and the partial metering AE ROI at its centroid, if the center focus point is being used for spot metering or partial metering. If spot metering or partial metering is using a focus point other than the centroid of the FOV, then the AE ROI for center-weighted metering can be different from, or may overlap, the AE ROI for spot metering or partial metering. The regions containing the spot metering AE ROI and partial metering AE ROI are given the greatest weight in center-weighted metering.
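For illustration, the center-weighted computation can be sketched as below. This is a minimal sketch, not the metering algorithm of any particular device; the 75% center weight and the half-height by half-width center window are assumed values within the ranges described above.

```python
import numpy as np

def center_weighted_luma(y_plane: np.ndarray, center_weight: float = 0.75) -> float:
    # y_plane: luminance (Y) samples covering the full FOV.
    # The central window (half the height and width of the frame) receives
    # center_weight of the total weight; the periphery shares the remainder.
    h, w = y_plane.shape
    top, left = h // 4, w // 4
    center = y_plane[top:top + h // 2, left:left + w // 2].astype(np.float64)
    periphery_sum = float(y_plane.sum()) - float(center.sum())
    periphery_mean = periphery_sum / (y_plane.size - center.size)
    return center_weight * float(center.mean()) + (1.0 - center_weight) * periphery_mean
```

The AE block would then map the weighted luminance to an exposure time, with lower metered values yielding longer exposures, as described above.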
Multi-zone metering uses an AE ROI including multiple regions of the image, and can use proprietary algorithms to analyze the image content and select an optimum exposure setting based on the light levels and the image content. Matrix metering can use empirical data to select an exposure setting considered optimal for a reference image having similar characteristics. Multi-zone metering can be used if the scene is fairly evenly lit. The multi-zone metering AE ROI can include the entire FOV, and thus can be larger than the AE ROIs of all of the other metering modes, as well as additional regions within the FOV of the sensor.
Thus, the image capture device 100 is configured to use three different ROIs, including an AF ROI for performing autofocus, an AE ROI for performing auto exposure and an AWB ROI for performing automatic white balance. These ROIs can be different from each other, as shown in FIG. 3C and described below. The AF ROI can include one or more regions, and can be on the order of 1% of the FOV or less. The AE ROI can vary between 1% and 100% of the FOV of the sensor and contains the AF ROI. Because the center-weighted AE method assigns different weights to different parts of the FOV, it is possible for the AE ROI to give very little weight to the region containing the AF ROI (e.g., in spot or partial metering modes), effectively using an ROI that does not overlap with the AF ROI. The AWB ROI can include 100% of the FOV of the sensor. The AWB ROI contains the AF ROI and contains or coincides with the AE ROI. The AWB ROI may give little weight to the region of the FOV containing the AF ROI (e.g., in spot or partial metering modes).
In many image capture devices 100, the three blocks AF block 140, AWB block 141 and AE block 142 are included in “3A” engine 136. The 3A engine 136 includes functions that operate on raw image sensor data measured by the image sensor prior to capturing an image. To enhance the speed of focusing the lens, the 3A engine 136 can be implemented in a field-programmable gate array (FPGA) or an application specific integrated circuitry (ASIC), but other embodiments may perform the 3A functions using software code executed by image signal processor 120.
Image processing block 143 can include several processing functions including demosaicing, noise reduction, cross-talk reduction, color processing, gamma adjustment, image filtering (e.g., spatial image filtering), lens artifact or defect correction, image sharpening, or the like.
The electronic image stabilization (EIS) block 144 provides a form of post-capture image processing. The EIS block 144 provides a means for outputting a first signal (319, FIG. 3B) discussed below, based on image data from the image capture device 100. The first signal 319 can contain data representing a motion vector of the image capture device 100. The first signal (motion vector 319) indicates that EIS block 144 is applying motion compensation to an image frame. EIS block 144 positions a cropped pixel array (“the image window”) within the total array of pixels on the sensor, as described below in the discussion of FIGS. 2A-2D. The image window includes the pixels that are used to capture images. The image window can include all of the pixels in the sensor, except for a few rows and columns at the periphery of the sensor. The image window is in the center of the sensor while the image capture device 100 is stationary. The peripheral pixels surround the pixels of the image window and form a set of buffer pixel rows and buffer pixel columns around the image window. EIS block 144 shifts the image window from frame to frame of video, so that the image window tracks the same scene over successive frames, assuming that the subject does not move. (If the subject moves, the scene has changed; if the camera is held still except for jitter, EIS tracks the background over successive frames, and shifts the image window in a similar manner. EIS cannot remove motion blur if the image capture device is panned too quickly for a given exposure speed.)
For example, the image window can include at least 95% (e.g., 95% to 99%) of the pixels on the sensor. The first ROI (used for AE and/or AWB) may comprise the image data within the field of view of at least 95% (e.g., 95% to 99%) of the plurality of imaging pixels in the image sensor 200 (FIG. 2) of the image capture device 100 (FIG. 1). A plurality of buffer pixels at the periphery of the sensor (outside of the image window) are reserved as a buffer to allow the image window to shift to compensate for jitter. The image window can be moved so that the subject remains at the same location within the adjusted image window, even though light from the subject impinges on a different region of the sensor. In another example, the buffer pixels can include the ten topmost rows, ten bottommost rows, ten leftmost columns and ten rightmost columns of pixels on the sensor. When the image capture device 100 is stationary, the buffer pixels are not used for AF, AE or AWB, and are not included in the image output. If jitter moves the sensor to the left by twice the width of a column of pixels between frames, the EIS block 144 can shift the image window to the right by two columns of pixels, so the captured image shows the same scene in the next frame as in the current frame. EIS block 144 thus smoothes the transition from one frame to the next.
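The window placement can be sketched as below, using the ten-pixel buffer from the example above. This is an illustrative sketch; the clamping policy at the edge of the buffer is an assumption.

```python
import numpy as np

BUFFER = 10  # buffer rows/columns on each side, per the example above

def shifted_window(sensor: np.ndarray, x_shift: int, y_shift: int) -> np.ndarray:
    # sensor: the full pixel array; the image window is the sensor minus
    # the buffer border. A jitter compensation of (x_shift, y_shift)
    # slides the window within the buffer, clamped to stay on the sensor.
    x = int(np.clip(BUFFER + x_shift, 0, 2 * BUFFER))
    y = int(np.clip(BUFFER + y_shift, 0, 2 * BUFFER))
    h, w = sensor.shape[0] - 2 * BUFFER, sensor.shape[1] - 2 * BUFFER
    return sensor[y:y + h, x:x + w]
```

Shifting the window two columns to the right, as in the example above, is then shifted_window(sensor, x_shift=2, y_shift=0).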
Jitter estimation block 146 receives inputs from EIS block 144 indicating detection of motion between the most recent two image frames received. Jitter estimation block 146 provides a means for estimating a jitter or panning shift based on the first signal output by the EIS block 144. Optionally, jitter estimation block 146 can receive measurements from one or more inertial sensors (e.g., motion sensors, such as a gyroscope 116 and/or an accelerometer 117). EIS block 144 provides a first signal to the jitter estimation block 146. In an image capture device 100 having a gyroscope 116, the gyroscope 116 provides a second signal to the jitter estimation block 146 indicating a rotation of the image capture device 100. In an image capture device 100 having an accelerometer 117, the accelerometer 117 provides a second signal to the jitter estimation block 146 indicating a translation of the image capture device 100.
In an example, the accelerometer 117 provides a signal to the jitter estimation block 146 indicating a translation of the image capture device 100, and the jitter estimation block 146 uses data from the EIS block 144 to determine the magnitude and direction of the translation.
The jitter estimation block 146 can compute a more general transformation including translation and rotation of the subject. In another example, the jitter estimation block 146 uses an algorithm that accounts for signals from the EIS block 144 and the one or more inertial sensors (e.g., gyroscope 116 and/or accelerometer 117) to determine the jitter and rotation.
ROI adjuster block 147 compares the magnitude of the jitter motion to the size of the region of buffer pixels available at the periphery of the sensor. ROI adjuster block 147 provides a means for computing an adjusted ROI based on the jitter or panning shift, for use in a white balance process and/or an automatic exposure process of the image capture device 100. ROI adjuster block 147 determines whether enough buffer pixels are available at the appropriate edge of the sensor for shifting the AE ROI, AWB ROI (and optionally, the AF ROI), corresponding to the image window shift initiated by the EIS block 144. In response to a determination that enough buffer pixels are available to compensate for the jitter, the ROI adjuster adds buffer pixels to one side of the image window, and removes the same number of pixels from the opposite side of the image window, effectively moving the AE ROI and AWB ROI to track the shift in the image window. In response to a determination that insufficient additional buffer pixels are available on one side to compensate for the jitter, the ROI adjuster can add virtual pixels to that side of the ROI. The virtual pixels have luminance and chroma values extrapolated from the pixels at the edge of the sensor. The ROI adjuster block 147 removes the same number of pixels from the opposite side of the image window, effectively moving the ROI.
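A minimal sketch of this boundary test follows, considering only a horizontal shift for brevity and assuming edge replication as the extrapolation for the virtual pixels (the particular extrapolation is not fixed here).

```python
import numpy as np

def adjust_roi_x(sensor: np.ndarray, roi, x_shift: int) -> np.ndarray:
    # roi = (row0, row1, col0, col1) in sensor coordinates. If enough
    # buffer columns exist on the side the ROI moves toward, the ROI is
    # simply shifted; otherwise the short side is padded with "virtual"
    # pixels extrapolated (here, replicated) from the sensor's edge.
    row0, row1, col0, col1 = roi
    new0, new1 = col0 + x_shift, col1 + x_shift
    if new0 >= 0 and new1 <= sensor.shape[1]:
        return sensor[row0:row1, new0:new1]            # enough buffer pixels
    patch = sensor[row0:row1, max(0, new0):min(sensor.shape[1], new1)]
    pad = (col1 - col0) - patch.shape[1]               # columns that ran off
    if new0 < 0:                                       # ran off the left edge
        return np.pad(patch, ((0, 0), (pad, 0)), mode="edge")
    return np.pad(patch, ((0, 0), (0, pad)), mode="edge")
```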
In an image capture device 100 having ROI adjuster block 147, the AE block 142 and AWB block 141 provide means for computing an exposure time and/or a color temperature for an image frame using image data corresponding to the adjusted ROI. In some image capture devices 100, a 3A engine 136 provides the means for computing an exposure time and/or a color temperature for an image frame using image data corresponding to the adjusted ROI.
The 3A engine 136 can incorporate AF block 140, AWB block 141 and AE block 142 in a single processing engine. The AF block 140 provides a means for selecting an autofocus ROI to be used by the autofocus control system of the image capture device 100. The exemplary 3A engine 136 is configured so that the AWB ROI and AE ROI can be different from each other, different from the first ROI (the original AF ROI), and different from the adjusted AF ROI. For example, the ROI adjuster block 147—the means for computing an adjusted ROI based on the jitter or panning shift—is configured to select the adjusted AE ROI and AWB ROI to be larger than the autofocus ROI. For example, the AE ROI and AWB ROI can include the entire FOV, while the AF ROI may be a spot occupying 5% of the FOV.
In some embodiments, the ROI adjuster block 147 adjusts the boundary of the ROI every time the EIS block 144 processes an image frame. In other embodiments, the ROI adjuster block 147 adjusts the ROI boundary less frequently. For example, ROI adjuster block 147 can adjust the ROI every second, third, or fourth image frame. In other embodiments, the ROI adjuster block 147 adjusts the boundary of the ROI every time the EIS block 144 shifts the image window.
Operating system block 145 acts as an intermediary between programs and the processor 160. For example, if the image capture device 100 is a mobile phone or tablet, the operating system can be “APPLE iOS”™, from Apple, Inc., Cupertino, Calif. or any other operating system for a mobile phone or tablet. If the image capture device 100 is a computer, the operating system block 145 can be ‘WINDOWS”™ sold by Microsoft Corporation of Redmond, Wash. or any other operating system for a computer. Operating system block 145 can include device drivers to manage hardware resources such as the video or still camera optics and sensor 115. Instructions contained in the image processing blocks discussed above interact with hardware resources indirectly, through standard subroutines or application program interfaces (APIs) in operating system block 145. Instructions within operating system block 145 can then interact directly with these hardware components. Operating system block 145 can further configure the image signal processor 120 to share information with device processor 150.
Device processor 150 can be configured to control the display 125 to display the captured image or a preview of the captured image to a user. The display 125 can be external to the image capture device 100 or can be part of the image capture device 100. The display 125 can be configured to provide a view finder displaying a preview image prior to capturing an image. The display 125 can comprise a liquid crystal display (LCD), light emitting diode (LED), or organic light emitting diode (OLED) screen, and can be touch sensitive.
Device processor 150 can write data to storage device 110. The data can include data representing captured images, data generated during phase detection and/or metadata, e.g., exchangeable image file format (EXIF) data. While storage device 110 is represented schematically as a disk device, storage device 110 can be configured as any type of storage media device. For example, the storage device 110 can include a disk drive, such as an optical disk drive or magneto-optical disk drive, or a solid state memory such as a FLASH memory, random access memory (RAM), read-only memory (ROM), and/or electrically-erasable programmable ROM (EEPROM). The storage device 110 can also include multiple memory units.
Although FIG. 1 shows an image capture device 100 having separate components to implement a processor 160 and working memory 105, in other examples these separate components can be combined in a variety of ways. For example, in an alternative example (not shown), the memory components can be combined with processor components in a system on a chip (SOC).
FIGS. 2A to 2D schematically demonstrate ROI adjustment as performed by the ROI adjuster block 147. FIG. 2A shows light from a scene including face 225 and light bulb 201 focused on the image sensor 200. The face 225 is centered within the image window 202 within the image sensor 200. The individual pixels in image sensor 200 are omitted for ease of viewing. The peripheral region 203 of the image sensor 200 contains buffer pixels (not shown) which are outside of the image window 202 when the image capture device 100 is stationary. For ease of illustration, assume that the image capture device 100 uses multi-zone metering, and the AE ROI and AWB ROI both coincide with the entire image window 202. The face 225 is in the center of the image window 202. A bright object (e.g., a light bulb 201) is located inside of the image window and inside the AE ROI for light metering. The light bulb 201 inside the image window contributes to the metered light level and lowers the average color temperature for a “warm” white balance. (Red and orange light have lower color temperatures than blue light, but red and orange light are customarily referred to as “warm” and blue light is customarily referred to as “cool”.)
FIG. 2B shows the same image sensor 200 after jitter causes the image sensor 200 to move downward and to the right, causing light from the scene to impinge on image sensor 200 beyond the upper left corner of the original image window 202. The face 225 is off center with respect to image window 202, and the light bulb 201 is outside the image window 202. To compensate for the sensor motion, the EIS block 144 updates the image window with an X shift and a Y shift. The X shift includes adding buffer pixels in region 207 to the image window for the next frame and removing pixels in region 208 from the image window for the next frame. The Y shift includes adding buffer pixels in region 206 to the image window for the next frame and removing pixels in region 209 from the image window for the next frame. As a result, the image window shifts upward and leftward to keep the face 225 in the same location with respect to the adjusted image window 204 (as shown in phantom). The left buffer pixels added to the image window are indicated by L-shaped region 207. The right buffer pixels removed from the image window are indicated by L-shaped region 208. Adjusted image window 204 includes the image window 202, plus the added region 207, minus the removed region 208. The adjusted image window 204 is indicated by the entire dashed square.
Following the image window shift, the light bulb 201 is again located within the adjusted image window 204, contributing to the light level. Without an AE ROI adjustment, the AE block 142 would set the exposure time based on the original AE ROI (coinciding with the original image window 202 in this example), which no longer contains the light bulb 201. The exposure time would be too long, over-exposing the image. The AWB block 141 would not take the warm light from light bulb 201 into account, and would set the color temperature to a value suitable for cooler lighting. To avoid over-exposure and incorrect white balance, the ROI adjuster block 147 adds buffer pixels in regions 206 and 207 to the AE ROI and AWB ROI, and removes pixels in regions 208 and 209 from the AE ROI and AWB ROI as shown in phantom, consistent with the change in the image window. Following the ROI adjustment, the light bulb 201 is inside the adjusted AE ROI and AWB ROI, corresponding to the adjusted image window 204. The exposure time is reduced for proper exposure, and the white balance again reflects the warmer light. This is a simplified example; the AE ROI can be (and frequently is) different from the AWB ROI and the image window.
FIG. 2C shows the same image sensor 200 imaging the same subject (i.e., face 225) as shown in FIG. 2A. In FIG. 2C, the light bulb 201 is outside of the image window. Assuming the face 225 is not illuminated by the light bulb 201, the light from light bulb 201 does not contribute substantially to the light level metered in the image window 202. Without the “warm” light from the light bulb 201, the AWB block 141 sets the white balance for “cool” light.
FIG. 2D shows image sensor 200 after jitter causes the image sensor 200 to move upward and to the right, causing light from the scene to impinge on image sensor 200 beyond the lower left corner of the original image window 202. The face 225 is off-center, and the light bulb is now within the original image window 202. To compensate for the sensor motion, the EIS block 144 updates the image window with an X shift and a Y shift. The X shift includes adding buffer pixels in region 218 to the image window for the next frame and removing pixels in region 216 from the image window for the next frame. The Y shift includes adding buffer pixels in region 217 to the image window for the next frame and removing pixels in region 219 from the image window for the next frame. As a result, the image window shifts downward and leftward to keep the face 225 in the same location relative to the adjusted image window 214 (as shown in phantom).
Following the image window shift, the light bulb 201 is located outside the adjusted image window 214, reducing the light level and the “cool” light. Without an AE ROI adjustment, the AE block 142 would set the exposure time based on the original AE ROI—coinciding with original image window 202—which now contains the light bulb 201. The AE will set the exposure time too short, and white balance setting will be suitable for “warm” light, even though the light bulb is not inside the adjusted image window 214. To avoid under-exposure, the ROI adjuster block 147 adds buffer pixels in regions 216 and 217 to the AE ROI and AWB ROI, and removes pixels in regions 218 and 219 from the AE ROI and AWB ROI as shown in phantom, consistent with the change in the image window 214. Following the AE ROI adjustment, the light bulb 201 is outside the adjusted AE ROI. The AE block 142 sets the exposure properly. Similarly, the ROI adjuster block 147 updates the AWB ROI, so the color temperature is measured for a cooler light source, without the warm light of the light bulb 201.
FIG. 3A is a flow chart of an exemplary method for adjusting the AE ROI or AWB ROI for image capture control.
At block 350, the 3A engine 136 receives sensor data from at least one inertial sensor (e.g., gyroscope 116 and/or accelerometer 117) indicating image capture device 100 is moving (e.g., translating or rotating).
At block 352, in response to the sensor data, EIS block 144 computes an electronic image stabilization (EIS) compensation for the current (first) image frame based on image data from the image capture device 100. The image data has a first region of interest (ROI), for AE and AWB operations. The EIS block 144 adjusts the location of the first image window 202 within the image sensor 200, so that the subject in the first frame remains at the same location within the image window that the subject had in a previous frame, even though the sensor has moved relative to the subject between the previous frame and the first frame. Because the 3A engine 136 (FIG. 1) computes statistics before the first image frame is captured, and EIS block 144 (FIG. 1) computes the ROI shift after the first image frame is captured, the ROI shift computed based on the first frame will be provided to the 3A engine 136 for use by AF block 140, AWB block 141 and AE block 142 (all shown in FIG. 1) for generating 3A statistics for the second frame. That is, the shift identified by EIS block 144 is used to compensate for motion in the image data of the current (first) frame and provided to 3A engine 136 (FIG. 1) to adjust the ROIs for 3A statistics generation for the next (second) frame. In other words, the application of the shift to the ROI for 3A's statistics can lag one frame behind the application of the same shift amount to the image window shift for motion compensation.
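The scheduling can be made concrete with a short sketch. The two callbacks stand in for the EIS block 144 and the jitter stabilized statistics computation; both names are hypothetical.

```python
def run_pipeline(frames, eis_compensation, compute_3a_stats):
    # eis_compensation(frame) -> (x, y) shift computed for that frame;
    # compute_3a_stats(frame, shift) -> 3A statistics. The shift computed
    # from frame N stabilizes frame N's image window immediately, but
    # only reaches the 3A ROIs when frame N+1 is processed.
    pending_shift = (0, 0)
    stats = []
    for frame in frames:
        stats.append(compute_3a_stats(frame, pending_shift))  # lags one frame
        pending_shift = eis_compensation(frame)  # applied to the next frame's 3A
    return stats
```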
At block 354, jitter estimation block 146 computes a jitter or panning shift based on the EIS compensation. In response to determining that both jitter and panning are occurring, the EIS compensation can divide the motion of the image capture device 100 into a panning shift component and a jitter component by analyzing the motion acceleration components and/or the rotation components. For example, a rotation about a vertical axis without any horizontal acceleration indicates panning. The difference between the total detected motion and the rotation is due to jitter.
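One way to realize this decomposition is sketched below. The pan component is taken as the image shift predicted from the gyroscope's yaw rotation over the frame interval, and the residual is attributed to jitter; px_per_radian is an assumed lens calibration constant, not a value from this disclosure.

```python
import numpy as np

def split_pan_and_jitter(total_shift_px: np.ndarray, yaw_rate: float,
                         frame_dt: float, px_per_radian: float):
    # total_shift_px: (x, y) frame-to-frame motion from the EIS compensation.
    # yaw_rate: gyroscope yaw rate (rad/s); frame_dt: frame interval (s).
    pan_shift = np.array([yaw_rate * frame_dt * px_per_radian, 0.0])
    jitter_shift = total_shift_px - pan_shift  # total motion minus rotation
    return pan_shift, jitter_shift
```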
At block 356, based on the jitter or panning shift, the ROI adjuster block 147 adjusts the first ROI for use in an AWB process or an AE process of the image capture device 100, or for use in both the AWB process and the AE process. In the case where the jitter estimation block 146 determines that both jitter and panning are present, the ROI adjuster block 147 can adjust the first ROI to compensate for the jitter component, without compensating for the panning shift component. This allows AWB block 141 and/or AE block 142 to change the color temperature and/or exposure time to match the change in scene due to panning, without introducing additional fluctuations in color temperature and/or exposure time due to jitter. Alternatively, if the amplitude of the panning is on the same order of magnitude as the jitter motion, the ROI adjuster block 147 can adjust the first ROI to compensate for both the jitter component and the panning.
At block 358, in some image capture devices 100, the ROI adjustment may include adjusting a second ROI (e.g., the AF ROI) for use in AF control, where the first ROI (e.g., the AWB ROI for AWB and/or the AE ROI for AE) and the second ROI (for AF) for a single image frame are different from each other. For example, the first ROI is often larger than the second ROI. As shown in FIG. 3C, for instance, the first ROI (AWB ROI 370) can include the entire FOV of the image sensor. The second ROI (e.g., AE ROI 372) depends on the metering mode, and may be about 6-8% of the FOV for partial metering as shown in FIG. 3C. The AF ROI 374 can be about 15% of the FOV, and can be located with the focus point 373 centered within the AF ROI. Thus, in the example of FIG. 3C, the AF ROI 374 and the AE ROI 372 are different sizes and do not overlap each other, while the AWB ROI 370 occupies the entire FOV of the image capture device 100, and contains the AF ROI 374 and the AE ROI 372.
FIG. 3B is a detailed schematic diagram of the operation of the image capture device 100 of FIG. 1.
The user activates the focusing operation, e.g., by tapping a point on the display 125 of the image capture device 100 to select a focus ROI, or a camera application activates the focusing operation on its own by, for example, face detection.
In the case of image capture devices having a motion (inertial) sensor 302 (e.g., gyroscope 116 and/or accelerometer 117 of FIG. 1), the inertial sensor block 302 provides a signal 303 to the 3A engine 308 and provides a signal 305 to the jitter estimation block 324.
In block 304, the sensor driver operates the image sensor 306 to collect sensor data for autofocusing the lens. The sensor data can include PD pixel values and/or imaging pixel values. The image sensor 306 outputs raw analog sensor data to front end processing block 310.
The front end processing block 310 conditions the analog signal received from the image sensor and performs analog-to-digital (A/D) conversion. The front end processing block 310 can clamp the input signal, sample the clamped input signal, filter out thermal noise, attenuate low frequency drift, and convert the raw sensor data from analog format to digital format in an analog-to-digital converter (ADC), not shown. The raw digital data undergoes demosaicing (color filter array interpolation) to reconstruct a full red-green-blue (RGB) color image. Then, the front end processing block 310 can perform a mathematical coordinate transformation from the red-green-blue (RGB) color space to an associated YCbCr (luminance, blue difference, red difference) color space. The YCbCr space separates the luminance value from the color values. The luminance signal (Y) can be transmitted at high bandwidth or stored with high resolution, and two color components (Cb and Cr) can be bandwidth-reduced, subsampled, and compressed for more efficient transmission/storage. The YCbCr data 311 are output from the front end processing block 310.
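For example, the widely used full-range BT.601 transform is one concrete form of this coordinate transformation; the coefficients below are standard, and nothing here mandates this particular variant.

```python
import numpy as np

def rgb_to_ycbcr(rgb: np.ndarray) -> np.ndarray:
    # Full-range BT.601 conversion of an (..., 3) RGB array to YCbCr.
    m = np.array([[ 0.299,     0.587,     0.114   ],
                  [-0.168736, -0.331264,  0.5     ],
                  [ 0.5,      -0.418688, -0.081312]])
    ycc = rgb.astype(np.float64) @ m.T
    ycc[..., 1:] += 128.0  # offset Cb and Cr into the 0..255 range
    return ycc
```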
The front end processing block 310 outputs the YCbCr data 311 to the YCbCr buffers 314 and to the jitter stabilized 3A statistics engine 312. The image enhancements block 316 can perform operations such as cropping, scaling (e.g., to a different resolution), image format conversion, color filtering, sharpening, or the like.
The EIS block 318 compares successive frames to determine a camera motion vector for the current frame, and adjusts the boundaries of the image window on the imaging sensor based on two successive frames. For example, EIS block 318 can compute the shift for motion compensation of the image window for the current frame based on image data from the current frame and the previous frame. (As noted above, the application of the ROI shift to the 3A engine 308 lags one frame behind applying the shift to the image window for the corresponding motion compensation.) In general, the EIS compensation based on a first frame of image data is used to shift an image window of the first frame, and the adjusted ROI—which is adjusted based on the EIS compensation—is applied to a second frame of image data following the first frame of image data. EIS block 318 outputs the first signal (e.g., motion vector) 319 to the jitter estimation block 324.
The jitter estimation block 324 receives motion data 305 from the motion (inertial) sensor 302 (e.g., gyroscope 116 and/or accelerometer 117 from FIG. 1) and the first signal (e.g., calculated motion vector) 319 from EIS block 318. The jitter estimation block 324 computes and outputs the estimated jitter 325 based on both motion data 305 and motion vector 319. For example, jitter estimation block 324 can compute a weighted average of the motion data 305 from the inertial sensor 302 and the calculated motion vector 319 from the EIS block 318.
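A minimal sketch of such a fusion, with an illustrative 0.4/0.6 weighting (the weights are assumptions; they are not specified here):

```python
import numpy as np

def fuse_jitter(inertial_shift: np.ndarray, eis_vector: np.ndarray,
                w_inertial: float = 0.4) -> np.ndarray:
    # Weighted average of the shift implied by the inertial sensor data
    # 305 and the motion vector 319 calculated by the EIS block.
    return w_inertial * inertial_shift + (1.0 - w_inertial) * eis_vector
```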
The ROI adjuster 320 receives the estimated jitter 325 and determines the adjusted AF ROI, AE ROI and AWB ROI to be applied to the next image frame. The ROI adjuster outputs information 321, which can identify the new ROI boundaries, or identify the change in the ROI boundaries.
Block 312 receives the YCbCr image data from the front end processing block 310 and the new ROI boundaries, or the change in the ROI boundaries, from the ROI adjuster 320. Based on the new or changed ROI boundaries, jitter stabilized 3A statistics block 312 determines the pixels whose YCbCr image data are used by the 3A engine 308 for determining the AF, AE and AWB settings for the next image frame. Jitter stabilized 3A statistics block 312 computes the motion compensated image statistics (e.g., histogram data) using the adjusted image window and adjusted ROIs 321.
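A sketch of the kind of computation block 312 performs is shown below, assuming 8-bit luminance data and a simple histogram plus mean over the adjusted ROI; the actual statistics and their layout are not specified here.

```python
import numpy as np

def roi_statistics(y_plane: np.ndarray, roi) -> dict:
    # roi = (row0, row1, col0, col1): the jitter-adjusted ROI boundary 321.
    # Returns motion compensated statistics of the sort the 3A engine
    # consumes: a luminance histogram and the mean level within the ROI.
    row0, row1, col0, col1 = roi
    patch = y_plane[row0:row1, col0:col1]
    hist, _ = np.histogram(patch, bins=256, range=(0, 256))
    return {"histogram": hist, "mean_luma": float(patch.mean())}
```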
Video encoder block 322 converts raw (uncompressed) digital video to compressed video signals. For example, the video encoder block 322 can be an (ISO/IEC 14496) MPEG-4 encoder.
Video buffers and bitstream 326 store video or graphics information as it moves from the video encoder to the display screen or storage medium. The video buffers gather all information before the information is to be displayed or stored, to accommodate any mismatch between the speed at which the encoded video is output from video encoder block 322 and the speed at which a display or storage medium can receive the video data.
To support jitter stabilized AE and AWB, capture control block 135 of FIG. 1 is configured to allow the ROI adjuster to adjust the AF ROI, AE ROI and AWB ROI after each image frame. Capture control block 135 is also configured to initiate processing by 3A engine 308 before every succeeding image frame, based on the adjusted AF ROI, adjusted AE ROI and adjusted AWB ROI. As a result, the exposure speed and white balance can be maintained at proper levels for the scene, even while the camera is moving relative to the scene, and the image window is shifting.
FIG. 4 is a flow chart of the jitter estimation block 324. Jitter estimation block 324 receives sensor data 305 from the inertial sensor 302 (e.g., gyroscope and/or accelerometer), shown in FIG. 3.
Decision block 330 determines whether there is significant jitter present. Jitter is considered insignificant if the motion sensor output signals 305 cannot be distinguished from thermal noise. In response to a determination that there is no significant jitter present, block 331 is executed, and the previous (uncompensated) ROI boundary is used. In response to a determination that there is significant jitter present, block 332 is executed, and a jitter-compensated ROI boundary is computed. Following execution of block 331 or 332, the adjusted ROI boundary 321 is output.
FIG. 5 is a flow chart of an example of a boundary change computation in the case where the jitter includes a translation without a rotation. Jitter estimation block 324 receives sensor data 305 from the inertial sensor 302 (gyroscope and/or accelerometer, shown in FIG. 3).
Decision block 330 determines whether there is significant jitter present. In response to a determination that there is no significant jitter present, block 341 is executed. The shift in the X direction is set to zero, and the shift in the Y direction is set to zero. In response to a determination that there is significant jitter present, block 342 is executed. The shift in the X direction is computed as a function f( ) based on sensor data 305 and the first signal 319 output by EIS block 318, and the shift in the Y direction is computed as a function g( ) based on sensor data 305 and EIS block output (first signal 319). The functions f( ) and g( ) are the techniques within the EIS block that take both the sensor data 305 and EIS block output (first signal 319) into account in determining the motion compensation shift. Following execution of block 341 or 342, the adjusted ROI boundary 321 (including the shift in the X direction and the shift in the Y direction) is output.
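A sketch of the FIG. 5 decision follows, with an assumed noise floor standing in for the thermal-noise test of decision block 330, and a simple average standing in for the internals of f( ) and g( ), which are not detailed here.

```python
import numpy as np

def fig5_shift(sensor_shift: np.ndarray, eis_vector: np.ndarray,
               noise_floor: float = 0.5):
    # sensor_shift: (x, y) shift implied by the inertial sensor data 305;
    # eis_vector: the first signal 319 from the EIS block.
    if np.linalg.norm(sensor_shift) < noise_floor:
        return 0.0, 0.0                    # block 341: no significant jitter
    # Block 342: f() and g() combine both inputs; an average is used here
    # as an illustrative stand-in.
    x_shift = 0.5 * (sensor_shift[0] + eis_vector[0])
    y_shift = 0.5 * (sensor_shift[1] + eis_vector[1])
    return x_shift, y_shift
```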
FIG. 6 is a diagram of the jitter stabilized 3A statistics engine 312 of FIG. 3B. The jitter stabilized 3A statistics engine 312 receives the YCbCr image data 311 for the image frame currently being captured and the adjusted ROI boundary 321 (including X shift and Y shift) for the next image frame to be captured. The jitter stabilized 3A statistics engine 312 outputs the jitter compensated image statistics 313 to be used by the 3A (AE, AWB, AF) block 308, which reflect the stabilized image window determined by EIS block 318 during processing of the current image frame.
FIG. 7 is a flow chart of an exemplary method for finding the X shift and Y shift in an image capture device 100 having a motion (inertial) sensor (e.g., gyroscope 116 and/or accelerometer 117 of FIG. 1).
At block 702, the jitter estimation block 146 of FIG. 1 sums pixel values across each individual row of the imaging sensor for a first image frame to provide a first vector.
At block 704, the jitter estimation block 146 of FIG. 1 sums pixel values across each individual column of the imaging sensor for the first image frame, to provide a second vector.
At block 706, the jitter estimation block 146 of FIG. 1 sums pixel values across each individual row of the imaging sensor for a second image frame to provide a third vector.
At block 708, the jitter estimation block 146 of FIG. 1 sums pixel values across each individual column of the imaging sensor for the second image frame to provide a fourth vector.
At block 710, the jitter estimation block 146 of FIG. 1 compares the first vector to the third vector.
At block 712, the jitter estimation block 146 of FIG. 1 determines the Y shift from the difference vector between the first vector and the third vector. The shift is indicated by the substantially non-zero values in the difference vector.
At block 714, the jitter estimation block 146 of FIG. 1 compares the second vector to the fourth vector, to determine the jitter or panning shift.
At block 716, the jitter estimation block 146 of FIG. 1 determines the X shift from the difference vector between the second vector and the fourth vector. The shift is indicated by the substantially non-zero values in the difference vector.
The jitter estimation method of FIG. 7 is just one example, and other jitter estimation methods can be used.
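For concreteness, the comparisons of blocks 710 through 716 can be realized by sliding one projection past the other and taking the lag that minimizes the absolute difference; the Python sketch below follows that common variant rather than the difference-vector wording above, and the search range max_shift is an assumption, not a value from the disclosure.

```python
import numpy as np

def projection_shift(frame_a, frame_b, max_shift=32):
    """FIG. 7 (sketch): row and column sums of two frames (blocks 702-708)
    are compared to recover the Y and X shift. Assumes grayscale frames of
    equal shape, larger than max_shift in both dimensions."""
    rows_a, cols_a = frame_a.sum(axis=1), frame_a.sum(axis=0)  # blocks 702/704
    rows_b, cols_b = frame_b.sum(axis=1), frame_b.sum(axis=0)  # blocks 706/708

    def best_lag(a, b):
        # Mean absolute difference over the overlapping region at each lag;
        # the minimizing lag aligns the two projections (blocks 710-716).
        def mad(lag):
            if lag >= 0:
                return np.abs(a[lag:] - b[:len(b) - lag]).mean()
            return np.abs(a[:lag] - b[-lag:]).mean()
        return min(range(-max_shift, max_shift + 1), key=mad)

    x_shift = best_lag(cols_a, cols_b)   # blocks 714/716
    y_shift = best_lag(rows_a, rows_b)   # blocks 710/712
    return x_shift, y_shift
```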
In the examples described above, the ROI adjuster block 147 can adjust the AE ROI, AWB ROI, AF ROI and image window to be different from each other. In other examples, the adjusted AWB ROI is set to be the same as the adjusted imaging window, to optimize the white balance across the entire image.
The methods described herein can provide jitter stabilized exposure and white balance for video and for still images collected in burst mode. When electronic image stabilization shifts the image window, the jitter stabilized AE adjustments and AWB adjustments ensure that the light levels and white balance of the collected image frames remain steady.
The methods and systems described herein may be at least partially embodied in the form of computer-implemented processes and apparatus for practicing those processes. The disclosed methods may also be at least partially embodied in the form of tangible, non-transitory machine-readable storage media encoded with computer program code. The media may include, for example, RAMs, ROMs, CD-ROMs, DVD-ROMs, BD-ROMs, hard disk drives, flash memories, or any other non-transitory machine-readable storage medium. When the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the method. The methods may also be at least partially embodied in the form of a computer into which computer program code is loaded or executed, such that the computer becomes a special purpose computer for practicing the methods. When implemented on a general-purpose processor, the computer program code segments configure the processor to create specific logic circuits. The methods may alternatively be at least partially embodied in application specific integrated circuits for performing the methods.
Although the subject matter has been described in terms of exemplary embodiments, it is not limited thereto. Rather, the appended claims should be construed broadly, to include other variants and embodiments, which may be made by those skilled in the art.

Claims (25)

What is claimed is:
1. A method for image capture control, comprising:
computing an electronic image stabilization (EIS) compensation based on image data from an image capture device, the image data having a first region of interest (ROI);
receiving first sensor data from a first inertial sensor;
computing a jitter or panning shift based on the first sensor data and the EIS compensation, wherein computing the jitter or panning shift comprises sensing a motion of the image capture device based on the first sensor data, and dividing the motion of the image capture device into a panning shift component and a jitter component based on the first sensor data and the EIS compensation; and
adjusting the first ROI based on the jitter or panning shift, wherein adjusting the first ROI compensates for the jitter component without compensating for the panning shift component.
2. The method for image capture control according to claim 1, further comprising adjusting a second ROI for use in autofocus control, wherein the first ROI and the second ROI for a single image frame are different from each other.
3. The method for image capture control according to claim 1, further comprising receiving second sensor data from at least a second inertial sensor for use in computing the jitter or panning shift.
4. The method for image capture control according to claim 1, further comprising computing histogram data used to calculate an exposure time and/or a color temperature for an image frame based on the adjusted ROI, in response to the adjusted ROI.
5. The method for image capture control according to claim 4, wherein the first ROI comprises image data within a field of view of at least 95% of a plurality of imaging pixels in an image sensor of the image capture device.
6. The method for image capture control according to claim 1, wherein:
the EIS compensation based on a first frame of image data is used to shift an image window of the first frame; and
the adjusted ROI is applied to a second frame of image data following the first frame of image data.
7. The method for image capture control according to claim 1, wherein the image capture device has an imaging sensor, and computing the jitter or panning shift includes:
summing pixel values across each individual row of the imaging sensor for a first image frame to provide a first vector;
summing pixel values across each individual column of the imaging sensor for the first image frame to provide a second vector;
summing pixel values across each individual row of the imaging sensor for a second image frame to provide a third vector;
summing pixel values across each individual column of the imaging sensor for the second image frame to provide a fourth vector; and
comparing the first vector to the third vector and comparing the second vector to the fourth vector to determine the jitter or panning shift.
8. An image capture control apparatus, comprising:
a memory; and
at least one processor coupled to the memory, the at least one processor configured to:
compute an electronic image stabilization (EIS) compensation based on image data received from an image capture device, the image data having a first region of interest (ROI);
receive first sensor data from at least one inertial sensor;
compute a jitter or panning shift based on the first sensor data and the EIS compensation, wherein computing the jitter or panning shift comprises sensing a motion of the image capture device based on the first sensor data, and dividing the motion into a panning shift component and a jitter component based on the first sensor data and the EIS compensation; and
compute an adjusted ROI based on the jitter or panning shift, wherein the adjusted ROI compensates for the jitter component without compensating for the panning shift component.
9. The image capture control apparatus of claim 8, wherein the processor is further configured to:
shift an image window of a first frame of image data based on a first frame of image data; and
apply the adjusted ROI to a second frame of image data following the first frame of image data.
10. The image capture control apparatus of claim 8, wherein the processor is further configured to:
compute histogram data used to calculate an exposure time or a color temperature for an image frame based on the adjusted ROI.
11. The image capture control apparatus of claim 8, wherein the processor is further configured to:
select an autofocus ROI to be used by an autofocus control system of the image capture device, such that the autofocus ROI is different from the first ROI and different from the adjusted ROI.
12. The image capture control apparatus of claim 11, wherein the processor is further configured to:
select an adjusted ROI larger than the autofocus ROI.
13. An image capture control apparatus, comprising:
means for computing an electronic image stabilization (EIS) compensation based on image data from an image capture device, the image capture device including an imaging sensor comprising a plurality of buffer pixel rows and buffer pixel columns at a periphery of the imaging sensor, the image data having a first region of interest (ROI);
means for receiving first sensor data from a first inertial sensor;
means for estimating a jitter or panning shift based on the first sensor data and the EIS compensation, comprising sensing a motion of the image capture control apparatus based on the first sensor data, and dividing the motion into a panning shift component and a jitter component based on the first sensor data and the EIS compensation; and
means for computing an adjusted ROI based on the jitter or panning shift wherein the adjusted ROI compensates for the jitter component without compensating for the panning shift component.
14. The image capture control apparatus of claim 13, further comprising means for computing an exposure time and a color temperature for an image frame using image data corresponding to the adjusted ROI.
15. The image capture control apparatus of claim 13, wherein the image capture device comprises means for selecting an autofocus ROI to be used by an autofocus control system of the image capture device, such that the autofocus ROI is different from the first ROI and different from the adjusted ROI.
16. The image capture control apparatus of claim 15, wherein the means for computing the adjusted ROI is configured to select the adjusted ROI to be larger than the autofocus ROI.
17. A non-transitory, machine-readable storage medium encoded with program code for image capture control, wherein the program code comprises:
code for computing an electronic image stabilization (EIS) compensation based on image data from an image capture device, the image data having a first region of interest (ROI);
code for receiving first sensor data from a first inertial sensor;
code for computing a jitter or panning shift based on the first sensor data and the EIS compensation, including sensing a motion of the image capture device based on the first sensor data, and dividing the motion of the image capture device into a panning shift component and a jitter component based on the first sensor data and the EIS compensation; and
code for adjusting the first ROI based on the jitter or panning shift, wherein adjusting the first ROI compensates for the jitter component without compensating for the panning shift component.
18. The non-transitory, machine-readable storage medium according to claim 17, further comprising code for adjusting a second ROI for use in autofocus control, wherein the first ROI and the second ROI for a single image frame are different from each other.
19. The non-transitory, machine-readable storage medium according to claim 18, wherein the first ROI is larger than the second ROI.
20. The non-transitory, machine-readable storage medium according to claim 18, wherein the first ROI is a center region of the single image frame.
21. The method for image capture control according to claim 1, wherein the image capture device includes an imaging sensor comprising a plurality of buffer pixel rows and buffer pixel columns at a periphery of the imaging sensor, and wherein the adjusted first ROI includes at least one of the buffer pixel rows or at least one of the buffer pixel columns.
22. The method for image capture control according to claim 1, wherein the adjusting comprises adjusting the first ROI for use in at least one of the group consisting of an automatic white balance process and an automatic exposure process of the image capture device.
23. The image capture control apparatus of claim 9, wherein:
the image capture device includes an imaging sensor comprising a plurality of buffer pixel rows and buffer pixel columns at a periphery of the imaging sensor; and
the at least one processor is further configured to compute the adjusted ROI for use in at least one of the group consisting of an automatic white balance process and an automatic exposure process of the image capture device, the adjusted ROI comprising at least one of the buffer pixel rows or at least one of the buffer pixel columns.
24. The image capture control apparatus of claim 13, wherein:
the image capture device includes an imaging sensor comprising a plurality of buffer pixel rows and buffer pixel columns at a periphery of the imaging sensor; and
the means for computing the adjusted ROI is configured to compute the adjusted ROI for use in at least one of the group consisting of an automatic white balance process and an automatic exposure process of the image capture device, the adjusted ROI comprising at least one of the buffer pixel rows or at least one of the buffer pixel columns.
25. The non-transitory, machine-readable storage medium according to claim 17, wherein:
the image capture device includes an imaging sensor comprising a plurality of buffer pixel rows and buffer pixel columns at a periphery of the imaging sensor; and
the non-transitory, machine-readable storage medium further comprises code for adjusting the first ROI for use in at least one of the group consisting of an automatic white balance process and an automatic exposure process of the image capture device, the adjusted first ROI comprising at least one of the buffer pixel rows or at least one of the buffer pixel columns.
US15/678,610 2017-08-16 2017-08-16 Image capture device with stabilized exposure or white balance Active US10491832B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US15/678,610 US10491832B2 (en) 2017-08-16 2017-08-16 Image capture device with stabilized exposure or white balance
CN201880052452.0A CN111034170A (en) 2017-08-16 2018-06-22 Image capturing apparatus with stable exposure or white balance
PCT/US2018/039142 WO2019036112A1 (en) 2017-08-16 2018-06-22 Image capture device with stabilized exposure or white balance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/678,610 US10491832B2 (en) 2017-08-16 2017-08-16 Image capture device with stabilized exposure or white balance

Publications (2)

Publication Number Publication Date
US20190058821A1 US20190058821A1 (en) 2019-02-21
US10491832B2 true US10491832B2 (en) 2019-11-26

Family

ID=62904678

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/678,610 Active US10491832B2 (en) 2017-08-16 2017-08-16 Image capture device with stabilized exposure or white balance

Country Status (3)

Country Link
US (1) US10491832B2 (en)
CN (1) CN111034170A (en)
WO (1) WO2019036112A1 (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110324528A (en) * 2018-03-28 2019-10-11 富泰华工业(深圳)有限公司 Photographic device, image processing system and method
JP7071204B2 (en) * 2018-04-27 2022-05-18 キヤノン株式会社 Imaging system, lens device, imaging device, and its control method
US10412306B1 (en) * 2018-08-21 2019-09-10 Qualcomm Incorporated Optical image stabilization method and apparatus
CN110248048B (en) * 2019-06-21 2021-11-09 苏宁云计算有限公司 Video jitter detection method and device
CN110536064B (en) * 2019-07-22 2021-04-06 杭州电子科技大学 Method for removing jitter of pixel-level precision video image of fixed scene
CN112313941A (en) * 2019-09-20 2021-02-02 深圳市大疆创新科技有限公司 Control device, imaging device, control method, and program
CN111815531B (en) * 2020-07-09 2024-03-01 Oppo广东移动通信有限公司 Image processing method, device, terminal equipment and computer readable storage medium
CN112218065B (en) * 2020-09-14 2022-05-10 深圳英飞拓科技股份有限公司 Image white balance method, system, terminal device and storage medium
CN112616031B (en) * 2020-12-16 2022-11-04 天津大学合肥创新发展研究院 High-speed target tracking method and system based on pulse array image sensor
US11729505B2 (en) * 2021-02-10 2023-08-15 Samsung Electronics Co., Ltd. Image signal processor, electronic device, and image stabilization method
KR20240029000A (en) * 2021-07-07 2024-03-05 퀄컴 인코포레이티드 Local motion detection to improve image capture and/or processing operations
CN113473028A (en) * 2021-07-15 2021-10-01 Oppo广东移动通信有限公司 Image processing method, image processing device, camera assembly, electronic equipment and medium
CN114140478B (en) * 2022-01-30 2022-06-03 电子科技大学 Federal learning method, system, device and medium for medical image segmentation
CN115134521B (en) * 2022-04-22 2023-09-22 咪咕视讯科技有限公司 Video shooting anti-shake method, device, equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1377040A1 (en) * 2002-06-19 2004-01-02 STMicroelectronics S.r.l. Method of stabilizing an image sequence
US9973677B2 (en) * 2013-10-14 2018-05-15 Qualcomm Incorporated Refocusable images

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6556246B1 (en) * 1993-10-15 2003-04-29 Canon Kabushiki Kaisha Automatic focusing device
US6567265B1 (en) 1995-11-20 2003-05-20 Matsushita Electric Industrial Co., Ltd. Apparatus having flexible mounting mechanism
US6717816B1 (en) 2000-03-30 2004-04-06 Fujitsu Limited Shock absorbing apparatus for internal component assembled within electronic apparatus
US6810451B2 (en) 2001-09-28 2004-10-26 Kabushiki Kaisha Toshiba Card-shaped electronic apparatus
US20070291152A1 (en) * 2002-05-08 2007-12-20 Olympus Corporation Image pickup apparatus with brightness distribution chart display capability
US20050057666A1 (en) * 2003-09-15 2005-03-17 Hu Shane Ching-Feng Region-based auto gain control and auto exposure control method and apparatus
US20060066744A1 (en) 2004-09-29 2006-03-30 Stavely Donald J Implementing autofocus in an image capture device while compensating for movement
US20060215378A1 (en) 2004-11-10 2006-09-28 Industrial Origami, Llc Card guide for mounting printed circuit boards and the like to electronic equipment housings
US20070160359A1 (en) * 2005-12-27 2007-07-12 Casio Computer Co., Ltd. Image pick-up apparatus with a multi-area AF function
US20080122940A1 (en) 2006-11-27 2008-05-29 Sanyo Electric Co., Ltd. Image shooting apparatus and focus control method
WO2009064810A1 (en) 2007-11-12 2009-05-22 Qualcomm Incorporated Block-based image stabilization
US8600189B2 (en) 2007-11-12 2013-12-03 Qualcomm Incorporated Block-based image stabilization
US20100182460A1 (en) * 2009-01-20 2010-07-22 Sanyo Electric Co., Ltd. Image processing apparatus
EP2211304A2 (en) 2009-01-20 2010-07-28 SANYO Electric Co., Ltd. Image processing apparatus
US20100309321A1 (en) * 2009-06-05 2010-12-09 Ralph Brunner Image capturing devices using orientation detectors to implement automatic exposure mechanisms
US20160014359A1 (en) * 2013-02-27 2016-01-14 Nikon Corporation Image sensor and electronic device
US20160155241A1 (en) * 2013-06-17 2016-06-02 Huawei Device Co., Ltd. Target Detection Method and Apparatus Based On Online Training
WO2015183438A1 (en) 2014-05-30 2015-12-03 Apple Inc. Realtime capture exposure adjust gestures
US20150350533A1 (en) * 2014-05-30 2015-12-03 Apple Inc. Realtime capture exposure adjust gestures
US20160234423A1 (en) * 2015-02-05 2016-08-11 Olympus Corporation Focus adjustment device and focus adjustment method
US20160374222A1 (en) 2015-06-19 2016-12-22 Tyco Electronics Corporation Card guide for a printed circuit board
US20170324909A1 (en) * 2016-05-04 2017-11-09 Samsung Electronics Co., Ltd. Electronic device and method for controlling the electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
International Search Report and Written Opinion - PCT/US2018/039142 - ISA/EPO - dated Sep. 20, 2018.

Also Published As

Publication number Publication date
US20190058821A1 (en) 2019-02-21
WO2019036112A1 (en) 2019-02-21
CN111034170A (en) 2020-04-17

Similar Documents

Publication Publication Date Title
US10491832B2 (en) Image capture device with stabilized exposure or white balance
US8767085B2 (en) Image processing methods and apparatuses to obtain a narrow depth-of-field image
US9706120B2 (en) Image pickup apparatus capable of changing priorities put on types of image processing, image pickup system, and method of controlling image pickup apparatus
JP5163031B2 (en) Electronic camera
JP4703710B2 (en) Apparatus and method for correcting image blur of digital image using object tracking
US20090310885A1 (en) Image processing apparatus, imaging apparatus, image processing method and recording medium
US20140071318A1 (en) Imaging apparatus
CN102300049A (en) Image signal processing system
JP2012019397A (en) Image processing apparatus, image processing method and image processing program
US8731327B2 (en) Image processing system and image processing method
US20240022702A1 (en) Foldable electronic device for multi-view image capture
US8570407B2 (en) Imaging apparatus, image processing program, image processing apparatus, and image processing method
US20190068868A1 (en) Phase disparity engine with low-power mode
JP5335964B2 (en) Imaging apparatus and control method thereof
JP5387341B2 (en) Imaging device
JP7442989B2 (en) Imaging device, control method for the imaging device, and program
US10469761B2 (en) Image capturing apparatus
JP2011078137A (en) Image processing apparatus and method, and program
JP2015118274A (en) Imaging device, imaging device control method, and program
JP2008270983A (en) Camera shake correction device and camera shake correction method
US11838645B2 (en) Image capturing control apparatus, image capturing control method, and storage medium
JP2014066959A (en) Imaging apparatus
JP6025555B2 (en) Image processing apparatus, image processing method, and program
US20230326163A1 (en) Imaging apparatus
US8121473B2 (en) Imaging apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LEE, JISOO;REEL/FRAME:043436/0916

Effective date: 20170828

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4