US20130021484A1 - Dynamic computation of lens shading - Google Patents

Dynamic computation of lens shading

Info

Publication number
US20130021484A1
Authority
US
United States
Prior art keywords
lens shading
image
captured
captured image
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/330,047
Inventor
Noam Sorek
Ilia Vitsnudel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avago Technologies General IP Singapore Pte Ltd
Original Assignee
Broadcom Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US 61/509,747, filed Jul. 20, 2011
Application filed by Broadcom Corp
Priority to US 13/330,047
Assigned to BROADCOM CORPORATION. Assignors: SOREK, NOAM; VITSNUDEL, ILIA
Publication of US20130021484A1
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT (patent security agreement). Assignor: BROADCOM CORPORATION
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. Assignor: BROADCOM CORPORATION
Assigned to BROADCOM CORPORATION (termination and release of security interest in patents). Assignor: BANK OF AMERICA, N.A., AS COLLATERAL AGENT
Application status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/001Image restoration
    • G06T5/003Deblurring; Sharpening
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • H04N19/137Motion inside a coding unit, e.g. average field, frame or block difference
    • H04N19/139Analysis of motion vectors, e.g. their magnitude, direction, variance or reliability
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/189Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding
    • H04N19/192Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding the adaptation method, adaptation tool or adaptation type being iterative or recursive
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/42Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • H04N19/436Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation using parallelised computational arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/537Motion estimation other than block-based
    • H04N19/54Motion estimation other than block-based using feature points or meshes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/56Motion estimation with initialisation of the vector search, e.g. estimating a good candidate to initiate a search
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225Television cameras ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/232Devices for controlling television cameras, e.g. remote control ; Control of cameras comprising an electronic image sensor
    • H04N5/23248Devices for controlling television cameras, e.g. remote control ; Control of cameras comprising an electronic image sensor for stable pick-up of the scene in spite of camera body vibration
    • H04N5/23251Motion detection
    • H04N5/23254Motion detection based on the image signal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225Television cameras ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/232Devices for controlling television cameras, e.g. remote control ; Control of cameras comprising an electronic image sensor
    • H04N5/23248Devices for controlling television cameras, e.g. remote control ; Control of cameras comprising an electronic image sensor for stable pick-up of the scene in spite of camera body vibration
    • H04N5/23264Vibration or motion blur correction
    • H04N5/23267Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20201Motion blur correction

Abstract

Embodiments of the present disclosure utilize captured image information to dynamically determine a lens shading surface being experienced by an imaging device or camera under current conditions. The lens shading surface is then used to apply a correction to the pixels of captured images to compensate for effects of lens shading.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims priority to copending U.S. provisional application entitled, “Image Capture Device Systems and Methods,” having Ser. No. 61/509,747, filed Jul. 20, 2011, which is entirely incorporated herein by reference.
  • TECHNICAL FIELD
  • The present disclosure is generally related to lens shading correction for imaging devices.
  • BACKGROUND
  • An increasing number of devices are being produced that are enabled to capture and display images. For example, mobile devices, such as cell phones, are increasingly being equipped with digital cameras to capture images, including still snapshots and motion video images.
  • One of the critical problems in small form factor cameras, such as those in cellular phones, is lens shading: the variation in light transmission through the opto-electrical system of the camera such that the same light source, imaged by the camera at different angles or at different places on the image, is read by the camera as different values rather than the same value.
  • As a result, lens shading can cause pixel cells in a pixel array of an image sensor located farther away from the center of the pixel array to have a lower pixel signal value when compared to pixel cells located closer to the center of the pixel array, even when all pixel cells are exposed to the same illuminant condition. Moreover, pixels with different spectral characteristics have different responses to the lens shading, which may cause the appearance of color patches even if the scene is monochromatic. In order to correct the lens shading, a long and expensive calibration process is performed per camera or mobile product. In many cases, calibration errors are the main source of reduced image quality in these devices.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
  • FIG. 1 is a block diagram of an exemplary mobile device with image capture and processing capability in accordance with embodiments of the present disclosure.
  • FIG. 2 is a diagram representation of an intensity profile under uniform illumination in accordance with embodiments of the present disclosure.
  • FIG. 3 is a diagram representation of intensity profiles of multiple images under non-uniform illumination in accordance with embodiments of the present disclosure.
  • FIGS. 4-6 are flow chart diagrams depicting exemplary processes of estimating lens shading in accordance with the disclosed embodiments.
  • FIG. 7 is a diagram representation of a surface profile that may be created depending on the scene being photographed and the particular illumination characteristics in accordance with the disclosed embodiments.
  • FIGS. 8-11 are block diagrams illustrating examples of a mobile device employing the image processing circuitry of FIG. 1.
  • DETAILED DESCRIPTION
  • Embodiments of the present disclosure utilize captured image information to determine a lens shading surface being experienced by an imaging device or camera under current conditions (e.g., optical conditions, lighting conditions, etc.). The lens shading surface is then used to apply a correction to the pixels of captured images to compensate for effects of lens shading.
  • Embodiments of the present disclosure relate to image processing performed in devices. For example, embodiments include mobile devices where image processing must be performed with limited resources. Types of such mobile devices include mobile phones (e.g., cell phones), handheld computing devices (e.g., personal digital assistants (PDAs), BLACKBERRY devices, PALM devices, etc.), handheld music players (e.g., APPLE IPODs, MP3 players, etc.), and further types of mobile devices. Such mobile devices may include a camera or image sensor used to capture images, such as still images and video images. The captured images are processed internal to the mobile device.
  • FIG. 1 shows a block diagram of an exemplary mobile device 100 with image capture and processing capability. Mobile device 100 may be a mobile phone, a handheld computing device, a music player, etc. The implementation of mobile device 100 shown in FIG. 1 is provided for purposes of illustration, and is not intended to be limiting. Embodiments of the present disclosure are intended to cover mobile devices having additional and/or alternative features to those shown for mobile device 100 in FIG. 1.
  • As shown in FIG. 1, mobile device 100 includes, but is not limited to including, an image sensor device 102, an analog-to-digital (A/D) converter 104, an image processor 106, a speaker 108, a microphone 110, an audio codec 112, a central processing unit (CPU) 114, a radio frequency (RF) transceiver 116, an antenna 118, a display 120, a battery 122, a storage 124, and a keypad 126. These components are typically mounted to or contained in a housing. The housing may further contain a circuit board mounting integrated circuit chips and/or other electrical devices corresponding to these components. Each of these components of mobile device 100 is described as follows.
  • Battery 122 provides power to the components of mobile device 100 that require power. Battery 122 may be any type of battery, including one or more rechargeable and/or non-rechargeable batteries.
  • Keypad 126 is a user interface device that includes a plurality of keys enabling a user of mobile device 100 to enter data, commands, and/or to otherwise interact with mobile device 100. Mobile device 100 may include additional and/or alternative user interface devices to keypad 126, such as a touch pad, a roller ball, a stick, a click wheel, and/or voice recognition technology.
  • Image sensor device 102 is an image capturing device. For example, image sensor device 102 may include an array of photoelectric light sensors, such as a charge coupled device (CCD) or a CMOS (complementary metal-oxide-semiconductor) sensor device. Image sensor device 102 typically includes a two-dimensional array of sensor elements or pixel sensors organized into rows and columns. Each pixel sensor may be identified using pixel sensor coordinates, where “x” is a row number, and “y” is a column number, for any pixel sensor in the array of sensor elements. In embodiments, each pixel sensor of image sensor device 102 is configured to be sensitive to a specific color, or color range. In one example, three types of pixel sensors are present, including a first set of pixel sensors that are sensitive to the color red, a second set of pixel sensors or photo-detectors that are sensitive to green, and a third set of pixel sensors that are sensitive to blue. Image sensor device 102 receives light (from optical system 101) corresponding to an image, and generates an analog image signal corresponding to the captured image. Analog image signal includes analog values for each of the pixel sensors.
  • Optical system 101 can be a single lens, as shown, but may also be a set of lenses. An image of a scene is formed in visible optical radiation through a shutter onto a two-dimensional surface of the image sensor 102. An electrical output of the sensor carries an analog signal resulting from scanning individual photo-detectors of the surface of the sensor 102 onto which the image is projected. Signals proportional to the intensity of light striking the individual photo-detectors or pixel sensors are obtained in the output in time sequence, typically by scanning them in a raster pattern, where the rows of photo-detectors are scanned one at a time from left to right, beginning at the top row, to generate a frame of video data from which the image may be reconstructed.
  • A/D 104 receives the analog image signal, converts it to digital form, and outputs a digital image signal. The digital image signal includes digital representations of each of the analog values generated by the pixel sensors or photo-detectors, and thus includes a digital representation of the captured image.
  • Image processor 106 performs image processing of the digital pixel sensor data received in digital image signal. For example, image processor 106 may be used to generate pixels of all three colors at all pixel positions when a Bayer pattern image is output by image sensor device 102.
  • Note that in an embodiment, two or more of image sensor device 102, A/D 104, and image processor 106 may be included together in a single IC chip, such as a CMOS chip, particularly when image sensor device 102 is a CMOS sensor, or may be in two or more separate chips.
  • CPU 114 is shown in FIG. 1 as coupled to each of image processor 106, audio codec 112, RF transceiver 116, display 120, storage 124, and keypad 126. CPU 114 may be individually connected to these components, or one or more of these components may be connected to CPU 114 in a common bus structure.
  • Microphone 110 and audio codec 112 may be present in some applications of mobile device 100, such as mobile phone applications and video applications (e.g., where audio corresponding to the video images is recorded). Microphone 110 captures audio, including any sounds such as voice, etc. Microphone 110 may be any type of microphone. Microphone 110 generates an audio signal that is received by audio codec 112. The audio signal may include a stream of digital data, or analog information that is converted to digital form by an analog-to-digital (A/D) converter of audio codec 112. Audio codec 112 encodes (e.g., compresses) the audio of the received audio signal. Audio codec 112 generates an encoded audio data stream that is received by CPU 114.
  • CPU 114 receives image processor output signal from image processor 106 and receives the audio data stream from audio codec 112. In some embodiments, CPU 114 may include an additional image processor. In one embodiment, the additional image processor performs image processing (e.g., image filtering) functions for CPU 114. In an embodiment, CPU 114 includes a digital signal processor (DSP), which may be included in the additional image processor. When present, the DSP may apply special effects to the received audio data (e.g., an equalization function) and/or to the video data. CPU 114 may store and/or buffer video and/or audio data in storage 124. Storage 124 may include any suitable type of storage, including one or more hard disc drives, optical disc drives, FLASH memory devices, etc. In an embodiment, CPU 114 may stream the video and/or audio data to RF transceiver 116, to be transmitted from mobile device 100.
  • When present, RF transceiver 116 is configured to enable wireless communications for mobile device 100. For example, RF transceiver 116 may enable telephone calls, such as telephone calls according to a cellular protocol. RF transceiver 116 may include a frequency up-converter (transmitter) and down-converter (receiver). For example, RF transceiver 116 may transmit RF signals to antenna 118 containing audio information corresponding to voice of a user of mobile device 100. RF transceiver 116 may receive RF signals from antenna 118 corresponding to audio information received from another device in communication with mobile device 100. RF transceiver 116 provides the received audio information to CPU 114. In another example, RF transceiver 116 may be configured to receive television signals for mobile device 100, to be displayed by display 120. In another example, RF transceiver 116 may transmit images captured by image sensor device 102, including still and/or video images, from mobile device 100. In another example, RF transceiver 116 may enable a wireless local area network (WLAN) link (including an IEEE 802.11 WLAN standard link), and/or other types of wireless communication links.
  • CPU 114 provides audio data received by RF transceiver 116 to audio codec 112. Audio codec 112 performs bit stream decoding of the received audio data (if needed) and converts the decoded data to an analog signal. Speaker 108 receives the analog signal, and outputs corresponding sound.
  • Image processor 106, audio codec 112, and CPU 114 may be implemented in hardware, software, firmware, and/or any combination thereof. For example, CPU 114 may be implemented as a proprietary or commercially available processor that executes code to perform its functions. Audio codec 112 may be configured to process proprietary and/or industry standard audio protocols. Image processor 106 may be a proprietary or commercially available image signal processing chip, for example.
  • Display 120 receives image data from CPU 114, such as image data generated by image processor 106. For example, display 120 may be used to display images captured by image sensor device 102. Display 120 may include any type of display mechanism, including an LCD (liquid crystal display) panel or other display mechanism. In some embodiments, the display may show a preview of images currently being received by the sensor 102, whereby a user may select a control (e.g., shutter button) to begin saving captured image(s) to storage 124.
  • Depending on the particular implementation, image processor 106 formats the image data output in image processor output signal according to a proprietary or known video data format. Display 120 is configured to receive the formatted data, and to display a corresponding captured image. In one example, image processor 106 may output a plurality of data words, where each data word corresponds to an image pixel. A data word may include multiple data portions that correspond to the various color channels for an image pixel. Any number of bits may be used for each color channel, and the data word may have any length.
  • In some implementations, display 120 has a display screen that is not capable of displaying the full resolution of the images captured by image sensor device 102. Image sensor device 102 may have various sizes, including numbers of pixels in the hundreds of thousands or millions, such as 1 megapixel (Mpel), 2 Mpels, 4 Mpels, 8 Mpels, etc. Display 120 may be capable of displaying only relatively smaller image sizes.
  • To accommodate such differences between a size of display 120 and a size of captured images, CPU 114 may down-size a captured image received from image processor 106 before providing the image to display 120, in some embodiments. Such image downsizing may be performed by a subsampling process. In computer graphics, subsampling is a process used to reduce an image size. Subsampling is a type of image scaling, and may alter the appearance of an image or reduce the quantity of information required to store an image. Two types of subsampling are replacement and interpolation. The replacement technique selects a single pixel from a group and uses it to represent the entire group. The interpolation technique uses a statistical sample of the group (such as a mean) to create a new representation of the entire group.
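  • As a rough illustration of the two subsampling approaches described above, the following sketch downsizes an image by an integer factor using either replacement (one pixel represents each block) or interpolation (the block mean). The function name, block factor, and use of NumPy are illustrative assumptions, not details from the disclosure.

```python
import numpy as np

def subsample(image, factor, method="interpolation"):
    """Downsize a 2-D image by an integer factor.

    "replacement" keeps the top-left pixel of each factor x factor block;
    "interpolation" uses the block mean as the new pixel value.
    """
    h, w = image.shape
    h, w = h - h % factor, w - w % factor          # crop to a multiple of factor
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor)
    if method == "replacement":
        return blocks[:, 0, :, 0]                  # one pixel represents the block
    return blocks.mean(axis=(1, 3))                # statistical sample (mean) of the block

# Example: shrink a high resolution frame to a preview-sized image.
frame = np.random.rand(2448, 3264)
preview = subsample(frame, 8)                      # 306 x 408 preview
```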
  • As stated above, image processor 106 performs processing of digital image signal and generates an image processor output signal. Image processing may include lens shading correction processes performed by a lens shading sub-module 107 of an image processor 106, in one embodiment.
  • Due to lens shading, along a given radius, the farther away from the center of the image sensor, the more attenuated the signal from a given pixel circuit becomes. Moreover, pixels with different spectral characteristics have different responses to the lens shading, which may cause the appearance of color patches even if the scene is monochromatic. As such, correction is applied in order to reduce the spatial variation. By applying a gain to attenuated signals according to position, embodiments of the present disclosure perform positional gain adjustment. A function that maps pixel position into a desired correction amount is referred to herein as a lens shading gain adjustment surface. In one embodiment, the surface may consist of an interleaving of several smooth surfaces, one for every pixel type or color. Such a surface may be generated in a CMOS circuit, in one embodiment, and then used to correct the spatially non-uniform sensitivity caused by lens shading across pixel positions in the sensor array.
  • In conventional processes, shading correction factors for an optical photo system (e.g., lens, image sensor, and/or housing) of a mobile device 100 are derived by imaging a scene of uniform intensity onto the image sensor 102 employed by the device being calibrated. Data of the resulting circular, hyperbolic or other variation across the image sensor device (see FIG. 2) are derived by prior measurement of image sensor photo detector signals, and a compensating mathematical function or functions are calculated and stored under optimal lab conditions, where imaging a scene of uniform intensity is possible.
  • Accordingly, by capturing an image of a scene that is known to be a flat illumination field, an actual response may be measured from the image. In some embodiments, a response may be measured in each of the color planes—red, green, blue. The measured response is strongest at the center of the sensor and weaker at its edges. Accordingly, pixels corresponding to the edges of the pixel array may be multiplied by a relative corrective factor so that the response is flat after correction. These correction factors may be used for captured image(s) acquired under similar illumination conditions.
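  • A minimal sketch of this conventional flat-field calibration is shown below, assuming a single captured color plane of a uniformly lit scene is available as an array. The function name and the simple max-ratio formulation are illustrative assumptions, not the exact method of the disclosure.

```python
import numpy as np

def flat_field_gains(flat_image, eps=1e-6):
    """Per-pixel corrective factors from an image of a uniform (flat) scene.

    The gain is the ratio of the peak (center) response to the local response,
    so that multiplying a captured image by these gains flattens the shading.
    """
    response = flat_image.astype(np.float64)
    return response.max() / np.maximum(response, eps)

# One gain surface per color plane (red, green, blue), applied to images
# captured under similar illumination conditions.
flat_r = np.random.rand(480, 640) * 0.5 + 0.5   # stand-in for a captured flat field
gain_r = flat_field_gains(flat_r)
corrected_r = flat_r * gain_r                    # approximately flat after correction
```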
  • However, illumination conditions change as the environment of the mobile device changes. Further, different lens positions within the optical system 101 produce different lens shading effects. Accordingly, correction factors may need to be adjusted to compensate for different light sources within a scene being photographed and/or lens positioning or qualities (e.g., changes in zoom or focus, particular manufacturing accuracies, mounting of lens, filter consistencies, etc.). Therefore, a tuning process that attempts to premeasure all the different combinations of potential lens positions and light sources is complicated and inaccurate, since for each combination the actual response measured from a captured image is different.
  • In contrast, with embodiments of the present disclosure, shading correction factors for an optical photo system (e.g., the lens, image sensor, and/or housing) of a digital camera or other imaging device are derived by capturing multiple images of a scene in succession. By analyzing the differences between intensity values of the captured images and the shift in detected intensity values with respect to the pixels, the lens shading effect on the camera may be better understood and represented. Therefore, in one embodiment, capturing two images of the same scene with a slight camera shift between image captures provides the reference from which the lens shading gain adjustment can be estimated, as represented in FIG. 3. In particular, a lens shading curve or surface may be determined that caused the differences between the captured images. In some embodiments, preview images captured for display on a viewfinder of a mobile device 100 may be used to determine the lens shading effect, where upon capturing of an image (e.g., after selecting a shutter button or control), the captured image may be corrected to compensate for the current lens shading effect. For example, a series of low resolution images may be used to preview the image to the photographer before actually taking a high resolution image. Then, data of the resulting circular, hyperbolic or other variation across the image sensor 102 are derived by dynamic measurement of image sensor photo detector signals, and a compensating mathematical function or functions are calculated.
  • By implementing such a process, the lens shading phenomenon may be estimated dynamically and on the fly. Accordingly, during manufacturing and assembly of a camera or mobile device equipped with a camera, resources used for corrective lens shading calibration may be eliminated or significantly reduced.
  • While conditions may often exist that allow for capturing of images that can be used to estimate lens shading, in some situations, conditions may not be present to capture an image that allows for sufficient estimation of lens shading. As an example, an image may be captured where an object in a scene is moving (as opposed to the camera moving). Accordingly, a subsequent capture of the scene is going to be quite dissimilar, since the scene is not static. Further, the lens shading sub-module 107 and/or the image processor 106 may detect that illumination types for the captured images are not the same. Accordingly, the lens shading sub-module 107 and/or image processor 106 may attempt to detect that lighting conditions are stable during capturing of the images that are used to derive the lens shading surface. For example, in a subsequent image, someone may have turned out the lights in the room where the scene is being captured. In such a situation, the mobile device 100 may rely on prestored lens shading correction factors that are suited for a similar illumination type. The lens shading sub-module 107 and/or the image processor 106 may periodically add dynamically generated lens shading correction factors to a reference database 125 of storage 124 when those factors are determined to be suitable for future use in situations where conditions do not allow suitable correction factors to be newly generated. Also, in some embodiments, lens shading correction factors may be preloaded or stored in the reference database 125 at a manufacturing facility so that the camera is equipped with preliminary lens shading correction factors that can be used, as needed.
  • In one embodiment, the correction factors may be initially generated responsive to capturing a scene in a flat field (e.g., a white wall with desired illumination) within a closed environment. Also, since the correction factors captured at the manufacturing facility are used as a secondary measure and are not intended to be used as a primary tool for estimating the lens shading, the scene does not necessarily need to be a perfectly flat field, in one embodiment. Therefore, the motion based calculation may be performed in the manufacturing stage with relatively flat surfaces, which makes measurements faster to obtain and less dependent on measurement conditions. In other words, a wider range of manufacturing conditions is available to be used with systems and processes of the present disclosure.
  • As stated above, the lens shading sub-module 107 and/or the image processor 106 may detect conditions that do not allow for sufficient measuring of lens shading. Accordingly, in a case where good conditions are not present to measure lens shading, the mobile device 100 takes advantage of the lens shading correction factors stored in the reference database 125. Alternatively, in a case where good conditions are present, the lens shading sub-module 107 and/or the image processor 106 compares the differences between recently captured images and defines a lens shading surface (on the fly) by considering the differences between the captured images. Then, a lens shading gain adjustment surface may be chosen that matches the intensity variation or variations reflected by the lens shading surface across the captured images that are to be corrected. The mobile device 100 may also store the lens shading gain adjustment surface in the reference database 125 for later use, as needs arise.
  • Additionally, one embodiment of dynamic lens shading calculation utilizes gradients or differences between captured image areas to define the lens shading surface. From the lens shading surface, corrections may be prepared to compensate for the effects of lens shading. To determine the lens shading surface, inter image consideration and/or intra image consideration are evaluated. For inter image consideration, in one embodiment, two images are captured and a ratio is calculated between a pixel value in an object in one image and the pixel value of the same place on the same object in a second image (that was taken after camera motion). The calculated ratio represents a local gradient of the lens shading at the direction of the camera or mobile device movement.
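  • A hedged sketch of this inter-image ratio is shown below. It assumes the two captures have already been geometrically aligned and that the camera shift (dx, dy) is known from motion estimation; the log-ratio normalization and the local-motion threshold are illustrative choices rather than details from the disclosure.

```python
import numpy as np

def inter_image_shading_gradient(img1, img2_aligned, dx, dy, eps=1e-6):
    """Directional gradient of (log) lens shading along the camera-motion direction.

    img2_aligned is the second capture warped back onto the first image's geometry,
    so each pixel pair views the same object point.  The intensity ratio then equals
    shading(x, y) / shading(x + dx, y + dy); its log, divided by the shift distance,
    approximates the local shading gradient in the direction of camera motion.
    """
    ratio = img1 / np.maximum(img2_aligned, eps)
    distance = np.hypot(dx, dy)
    local_gradient = np.log(np.maximum(ratio, eps)) / max(distance, 1.0)
    # Exclude pixels where the content itself changed (e.g., local object motion).
    valid = np.abs(img1 - img2_aligned) < 0.25 * img1.max()
    return local_gradient, valid
```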
  • For intra image consideration, areas with similar colors in the image and/or similar intensities may be used to estimate a portion of the lens shading surface from differences in the areas of the same image. Accordingly, in one embodiment, via the lens shading sub-module 107 and/or the image processor 106, a second ratio is calculated between a pixel value of an object in an image and the pixel value of another object that has similar luminosity in the image. The calculated second ratio represents the gradient of the lens shading between these two points.
  • For inter image consideration, in order to find matching pixel values in the two images, the images are geometrically matched with one another. In one embodiment, global motion is detected from the two images, where motion parameters may include translation and transformation (e.g., affine or perspective). Then, areas having local motion that is different from the global motion are determined. These areas may have had an object moving in the camera field or scene being photographed. In one embodiment, areas having local motion are not analyzed for gradients. In such a situation, a correction factor may be determined by extrapolation from other areas (not subject to the local motion) in single image data. Also, if intra image analysis is not available on the single image (e.g., the size of the image exceeds a threshold), a captured image may be compensated using stored lens shading correction factors in the reference database 125 instead of determining correction factors dynamically, in one embodiment.
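  • One way the global-motion step above could be realized is phase correlation for a pure translation, followed by flagging blocks whose residual difference suggests local motion. The disclosure also contemplates affine or perspective transforms; the block size and thresholds below are assumed tuning values, not values from the disclosure.

```python
import numpy as np

def global_translation(img1, img2):
    """Estimate the global (camera) translation between two frames by phase correlation."""
    f1, f2 = np.fft.fft2(img1), np.fft.fft2(img2)
    cross = f1 * np.conj(f2)
    corr = np.fft.ifft2(cross / np.maximum(np.abs(cross), 1e-12))
    dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # Wrap large indices back to negative displacements.
    if dy > img1.shape[0] // 2:
        dy -= img1.shape[0]
    if dx > img1.shape[1] // 2:
        dx -= img1.shape[1]
    return dy, dx

def local_motion_mask(img1, img2, dy, dx, block=32, thresh=0.1):
    """Mark blocks whose content still differs after compensating the global shift."""
    shifted = np.roll(img2, (dy, dx), axis=(0, 1))
    diff = np.abs(img1 - shifted)
    h, w = diff.shape
    mask = np.ones((h, w), dtype=bool)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            if diff[y:y + block, x:x + block].mean() > thresh * img1.mean():
                mask[y:y + block, x:x + block] = False   # exclude from gradient analysis
    return mask
```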
  • The foregoing processes may be iterative in some embodiments, where after an estimation of lens shading, processes may be repeated to determine a new estimation of lens shading. In some embodiments, the motion detection is performed on full resolution image(s). After the motion is estimated, the first image is transformed to match the second image geometrically, and an attempt is made to find matching pixel values.
  • Gradients and pixel ratios may be affected by noise and inaccuracies. For example, possible sources of noise include pixel noise (i.e., electronic and photonic noise in the process of converting luminance to a digital luminance count), errors in estimation of motion, changing light conditions between the two images, an object reflecting light differently toward the camera at different positions, an incorrect assumption of similar luminance where in reality the two objects being compared have different luminance, etc. To avoid or reduce such inaccuracies, measures may be taken to calculate the gradients in ‘flat’ areas where there are no rapid changes in luminance (e.g., edges). For example, areas near edges in the captured images may be masked out.
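  • The ‘flat area’ restriction might be implemented by masking out pixels near strong luminance edges before any ratios or gradients are computed, for example as sketched below; the gradient threshold and dilation radius are assumptions for illustration.

```python
import numpy as np
from scipy import ndimage

def flat_area_mask(image, grad_thresh=0.05, dilate=5):
    """True where the local luminance is smooth enough for reliable ratio measurements."""
    gy, gx = np.gradient(image.astype(np.float64))
    grad_mag = np.hypot(gx, gy)
    edges = grad_mag > grad_thresh * image.max()
    # Grow the edge regions so pixels adjacent to edges are also excluded.
    near_edges = ndimage.binary_dilation(edges, iterations=dilate)
    return ~near_edges
```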
  • Then, the lens shading surface may be calculated from the local gradients, in some embodiments. In one embodiment, a model of a lens shading surface may be computed or estimated that matches the measured gradients in the captured images. Accordingly, parameters of the model may be adjusted until an optimal result is determined from all the tested results or trials.
  • For example, one possible technique determines an optimized analytical parametric surface by selecting a surface model equation (e.g., polynomial, Gaussian, etc.) and calculating the parameters of the lens shading surface model that yield minimal difference between the surface gradient (according to the model) and measured gradients. Another possible technique, among others, determines an optimized singular value decomposition (SVD) surface composition by selecting the largest surface eigenvectors and calculating the coefficients for the surface composition that yield minimal difference between the surface gradient (according to the model) and measured gradients.
  • To illustrate, a Gaussian model may be used to model the lens shading being experienced by the mobile device 100 and values of parameters for the model may be adjusted until an optimal match is found between the model and the measured values. Instead of a Gaussian model, other models may be considered, such as a polynomial surface model, in some embodiments. Alternatively, an SVD process may be used.
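  • The parameter search outlined above might look like the following sketch, which fits a radially symmetric Gaussian shading model so that its gradients best match the measured local gradients. The specific model form, optimizer, and measurement layout are assumptions made for illustration rather than requirements of the disclosure.

```python
import numpy as np
from scipy.optimize import minimize

def gaussian_shading(params, ys, xs):
    """Shading value at (ys, xs) for a Gaussian fall-off centered at (cy, cx)."""
    cy, cx, sigma = params
    r2 = (ys - cy) ** 2 + (xs - cx) ** 2
    return np.exp(-r2 / (2.0 * sigma ** 2))

def fit_shading_surface(ys, xs, meas_gy, meas_gx, shape, init=None):
    """Find Gaussian parameters whose surface gradients best match measured gradients."""
    h, w = shape
    if init is None:
        init = np.array([h / 2.0, w / 2.0, max(h, w) / 2.0])

    def cost(params):
        # Numerical gradients of the model at the measurement points.
        eps = 1.0
        s = gaussian_shading(params, ys, xs)
        gy = (gaussian_shading(params, ys + eps, xs) - s) / eps
        gx = (gaussian_shading(params, ys, xs + eps) - s) / eps
        return np.sum((gy - meas_gy) ** 2 + (gx - meas_gx) ** 2)

    result = minimize(cost, init, method="Nelder-Mead")
    return result.x
```

An SVD-based variant would replace the Gaussian evaluation with a weighted sum of the largest surface eigenvectors and optimize the weights inside the same cost function.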
  • To match the pixel values, color layers may be estimated using a variety of techniques, including direct layer estimation, independent color layer estimation, and normalized color domain estimation. For instance, measurements may be made in normalized color domain(s) where possible, since, typically, luminosity changes more rapidly than normalized color in images. Additional measures include calculating a small number of surface model parameters from a large number of measurements; limiting the parameter space to a predefined space according to measurements of sample units under different light (spectra and intensity) conditions; averaging measurements before calculating gradients (e.g., by downsampling the image); calculating global gradients rather than using only local gradients; and segmenting the image into a small number of color segments and estimating global gradients on each one. Also, in some embodiments, possible effects of light flickering during capturing of the images may be addressed and removed from the images.
  • Accordingly, in one embodiment, one technique of matching pixels involves direct layer estimation, where local gradients are calculated. In particular, in the inter image consideration, differences between the images represent the local gradients. In the intra image consideration, color segments are derived and differences between like color segments are representative of local gradients. An optimized lens shading surface is modeled which matches the local gradients at the measured points. Accordingly, a model surface may be computed that fits the local gradients of each of the color segments. From inter and/or intra image considerations, information may be obtained on the gradients at each corresponding sensor point of the image, where the gradients are representative of the lens shading phenomenon. By taking gradients from inter image and/or intra image calculations and optimizing according to the two respective sets, a lens shading surface can be estimated and applied to a captured image.
  • In some embodiments, lens shading correction for color image sensors may be defined for each of a plurality of color channels in order to correct for lens shading variations across color channels. Further, different techniques or models may be used by the lens shading sub-module 107 and/or the image processor 106 for the different color channels. As an example, a green channel may use a best fit of color plane parameters for an SVD model, and a red/blue channel may utilize direct layer optimization. In general, once a lens shading surface has been determined, then lens shading can be corrected using standard correction methods.
  • Further, with estimation of the lens shading surface or curve, other image quality processes may benefit. For example, by knowing the lens shading surface, accurate white balancing may be computed. As discussed above, different light sources create different lens shading. Therefore, by determining the lens shading correctly, an unbiased measurement for the white balance can be provided by the mobile device 100.
  • To illustrate, a white balance may be selected that is appropriate to generate the estimated lens shading curve, where different illuminants have different optical wavelength responses and hence may result in different lens shading surfaces. The image processor 106 or an auto-white-balance (AWB) sub-module of the image processor 106 may then determine the type of illuminant used to generate the lens shading curve that has been estimated and subsequently use this information to correct white balance levels in captured image(s).
  • In addition to performing accurate white balancing, more robust motion estimation may also be implemented responsive to the lens shading estimation by the lens shading sub-module 107 and/or the image processor 106. For example, from analysis performed in determining the lens shading phenomenon, global motion can be estimated by calculating a mean difference between image areas in the two images captured in a sequence, where the difference corresponds to the same object moving across one image to a different place in the second image. Since the second image has different lens shading characteristics as compared to the first, it also has a different mean brightness as compared to the first image. Accordingly, instead of examining correlations between the images in order to determine a motion vector that can be used to estimate camera motion, statistics used to determine the lens shading can also be used to estimate the camera motion. Therefore, differences in the statistics between the images may be used to calculate the camera or global motion.
  • As an example, the first image may feature a white ball at a left corner of the frame. The second image may feature the ball at a position to the right of the left corner, where the ball has a brighter intensity than in the first frame. The lens shading for the mobile device 100 has been determined, where the lens shading is found to traverse along one side of the image sensor 102 to the other side. Accordingly, at a pixel sensor corresponding to the left corner of the image, the average intensity value is going to be lower than an average intensity value at a pixel sensor to the right. Therefore, based on the lens shading statistics, it is expected that the intensity values of pixels corresponding to the ball will change based on the lens shading as the ball moves to the right in subsequent images. Therefore, by considering the global and local statistics compiled on the captured images, an object having a different intensity value than a prior value in a prior frame may be determined to be the same object in motion due to the lens shading phenomenon (that has been previously computed). As a result, motion can be analyzed and determined.
  • FIG. 4 illustrates a flow chart depicting a process of estimating lens shading in accordance with the disclosed embodiments. Lens shading estimation, in accordance with FIG. 4, is performed by a pixel processing pipeline of image processor 106 (FIG. 1) (e.g., lens shading module 107) dynamically and, if necessary, using stored reference surface(s) acquired during a calibration operation. The image processor 106 has access to the stored gain adjustment surface(s) and scene adjustment surface(s) in, for example, reference database 125 (FIG. 1) or other memory storage.
  • When an image is captured by a digital camera, it is generally not captured under a known illumination type, and a reference may not be available for the current illumination type. The captured image is a natural image, for which the lens shading sub-module 107 of the image processor 106 does not have any preset knowledge of the illumination type and may therefore not have a reference correction surface prestored according to the current illumination type. While in conventional processes, lens shading correction factors may be solely derived from capturing a scene of a flat field to create an image that contains even color and intensity values except for effects from lens shading, natural images taken by the camera normally have no such flat areas in the image. Accordingly, embodiments of the present disclosure analyze the differences in light transmission from natural images captured by the mobile device 100.
  • As such, embodiments of the present disclosure take advantage of capturing multiple images in succession and determining a lens shading correction or gain adjustment surface for the present illumination conditions. In particular, since the images are captured by the same image sensor 102 of the mobile device 100, the images are captured using the same optics. Accordingly, intensity values of pixels for the multiple images should ideally be the same, and illumination levels for the captured images should also be the same, since the images are captured within fractions of a second of one another, in some embodiments. In practice, the mobile device 100 may move or shift between the capture of one image and the next. Also, due to lens shading, the intensity values of the pixels may not be exactly the same.
  • Accordingly, by analyzing the differences between intensity values of the captured images and the shift in detected intensity values with respect to the pixels of the captured images, the lens shading effect on the mobile device 100 may be better understood and represented. Therefore, in one embodiment, capturing two images of a same scene with slight camera shift between image captures provides the reference from which the corrective lens shading surface can be estimated. In particular, a lens shading curve or surface may be determined that caused the differences between the captured images. In some embodiments, preview images captured for display on a viewfinder of a camera may be used to determine the lens shading effect, where upon capturing of an image (e.g., after selecting a shutter button or control), the captured image may be corrected to compensate for the current lens shading effect.
  • Lens shading estimation begins with capturing a sequence of images at step 402. At step 404, local gradients of the captured images are determined. As noted above, the local gradients may be determined in a number of different ways. In some embodiments, techniques estimate the local gradients from inter image consideration and/or intra image consideration.
  • For example, multiple images may be captured and inter image analysis may be performed on the captured images. In addition, intra image analysis may be performed on each captured image. The intra image analysis may be performed in concert with the inter image analysis or apart from the inter image analysis, in some embodiments, based on recognition of a particular condition. For instance, a sequence of images may have been subjected to a level of local motion in the scene being photographed that does not allow for adequate statistics to be obtained. Alternatively, adequate statistics for global motion may not be obtainable, which prevents one or both approaches from being used or causes prestored statistics or factors in the reference database 125 to be used instead. As an example, a single image may not contain multiple areas with similar colors or intensity.
  • Referring back to FIG. 4, at step 406, a model of a lens shading surface is compared to the measured gradients from the captured images and the deviation between the two is saved for later comparison. The process proceeds to step 408 where the model is adjusted and compared again with the measured gradients and new deviation(s) are computed and compared with the saved values. The model having the set of smallest deviation values is maintained as the optimum model for the trials previously computed. The process then repeats until an optimum model is determined.
  • At step 410, a lens shading gain adjustment surface is calculated from the lens shading surface. For embodiments that derived a lens shading surface for each color channel of the image sensor, the lens shading gain adjustment surface may also be determined for each color channel. In other words, lens shading correction for color image sensors may be defined for each of a plurality of color channels in order to correct for lens shading variations across color channels. For these color image sensors, the lens shading gain adjustment surface is applied to the pixels of the corresponding color channel during post-image capture processing to correct for variations in pixel value due to the spatial location of the pixels in the pixel array. In some embodiments, monochrome image sensors, on the other hand, apply a single gain adjustment surface to all pixels of a pixel array. Likewise, color image sensors may use a single lens shading gain adjustment surface across all color channels, in some embodiments.
  • To illustrate, a pixel value located at x, y pixel coordinates may be multiplied by the lens shading gain adjustment value at the x, y pixel coordinates on the lens shading gain adjustment surface. Accordingly, at step 412, lens shading correction is performed on the pixel values of the captured image using the lens shading gain adjustment surface(s).
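  • Applying the correction of step 412 then reduces to an element-wise multiplication of each color plane by its gain surface; a brief sketch follows, assuming the gain surfaces have already been derived as described above (function and variable names are illustrative).

```python
import numpy as np

def apply_lens_shading_correction(image, gain_surfaces):
    """Multiply each color channel by its lens shading gain adjustment surface.

    image:          H x W x C captured image
    gain_surfaces:  dict mapping channel index -> H x W gain surface
    """
    corrected = image.astype(np.float64).copy()
    for channel, gains in gain_surfaces.items():
        corrected[:, :, channel] *= gains
    return np.clip(corrected, 0, None)
```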
  • In some embodiments discussed above, a lens shading module 107 is provided to estimate the effects of lens shading and to possibly correct the gain of individual pixels in captured images. The lens shading module 107 may, for example, be implemented as software or firmware.
  • The lens shading module 107 may be implemented in image processor 106 as software designed to implement lens shading correction, in one embodiment. Alternatively, lens shading module 107 may be implemented in image sensor 102, in one embodiment.
  • In some embodiments, the lens shading module 107 utilizes lens shading correction surfaces to determine gain correction for individual pixels to account for lens shading. An individual correction or gain adjustment surface may, for example, comprise parameters to calculate gain correction, although it will also be understood that in some cases a correction table may be stored. Positional gain adjustments across the pixel array can be provided as digital gain values, one corresponding to each of the pixels. It may happen that the further away a pixel is from the center of the pixel array, the more gain needs to be applied to the pixel value. The set of digital gain values for the entire pixel array forms a lens shading gain adjustment surface.
  • In some embodiments, only relatively few gain values are preferably stored, in order to minimize the amount of memory required to store correction data, and values between the stored values are obtained, during the image modification process, by a form of interpolation. In order to avoid noticeable discontinuities in the image intensity, these few data values are preferably fit to a smooth curve or curves that are chosen to match the intensity variation or variations across the image that are to be corrected.
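  • Storing only a coarse grid of gain values and interpolating between them could be sketched as follows; the 64-pixel grid spacing and the bilinear scheme are illustrative choices, not requirements of the disclosure.

```python
import numpy as np

def gain_at(sparse_gains, y, x, step):
    """Bilinearly interpolate a gain value from a coarse grid stored every `step` pixels."""
    gy, gx = y / step, x / step
    y0, x0 = int(gy), int(gx)
    y1 = min(y0 + 1, sparse_gains.shape[0] - 1)
    x1 = min(x0 + 1, sparse_gains.shape[1] - 1)
    fy, fx = gy - y0, gx - x0
    top = (1 - fx) * sparse_gains[y0, x0] + fx * sparse_gains[y0, x1]
    bot = (1 - fx) * sparse_gains[y1, x0] + fx * sparse_gains[y1, x1]
    return (1 - fy) * top + fy * bot

# Example: a 31 x 41 grid of stored gains covers a 1920 x 2560 sensor at 64-pixel spacing.
sparse = 1.0 + 0.5 * np.random.rand(31, 41)
g = gain_at(sparse, y=1000, x=1500, step=64)
```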
  • Also, in some embodiments, the digital gain values are computed from an expression that approximates the desired lens shading gain adjustment surface, since the number of parameters needed to generate an approximate surface is generally significantly lower than the number of parameters needed to store the digital gain values for every pixel location. Some image sensors 102 have a built-in lens shading operation on-chip, while other image sensors rely on a separate image processing chip for this operation.
  • FIG. 5 is a flowchart representation of a method in accordance with one embodiment of the present disclosure. In particular, a method is presented for use in conjunction with one or more of the functions and features described in conjunction with FIGS. 1-3. In step 502, a lens shading surface is continually calculated for preview images being displayed by a mobile device 100. When the calculated lens shading surface is determined to be satisfactory (e.g., no local motion detected, illumination of the scene deemed to be stable, etc.), the lens shading surface is stored in a reference database 125, in step 504. Accordingly, upon selection to capture an image, the newly calculated lens shading surface is used to compensate for lens shading effects in the captured image in step 506, if that surface was determined to be satisfactory. Otherwise, a lens shading surface prestored in the reference database 125 is used to compensate for lens shading effects in the captured image, in step 508.
  • Next, FIG. 6 is a flowchart representation of a method in accordance with one embodiment of the present disclosure. In particular, a method is presented for use in conjunction with one or more of the functions and features described in conjunction with FIGS. 1-3. In step 602, two or more images of the same scene, containing objects, are captured, where the images have some displacement between themselves. In step 604, the relative displacement between the images is analyzed based on tracking of image areas with details or discernible objects. In step 606, for each point and for each color plane in the image, a ratio between an intensity level at the first image and the level of the same object point at the second image is calculated. If there were no lens shading, the values would be the same. In step 608, the values are normalized, and in step 610, a difference surface profile is created from the calculated ratios by filtering the results and interpolating data points or values, as needed, to produce a smooth lens shading surface profile. As a point of reference, FIG. 7 is a representative lens shading surface profile that may be created depending on the scene being photographed and the particular illumination characteristics. In step 612, after the lens shading surface is extracted or generated from a current scene, the lens shading surface is used to indicate the light source illuminating the scene and supply an unbiased measurement for the white balance.
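  • Steps 606 through 610 might be sketched as follows for one color plane: per-point ratios between the aligned captures are computed, invalid points are filled, and the result is filtered into a smooth, normalized surface profile. The Gaussian filter and the normalization by the maximum are assumptions made for illustration, not details from the disclosure.

```python
import numpy as np
from scipy import ndimage

def ratio_surface_profile(img1, img2_aligned, valid_mask, sigma=25.0, eps=1e-6):
    """Build a smooth lens shading surface profile from per-point intensity ratios."""
    ratios = img1 / np.maximum(img2_aligned, eps)
    # Fill invalid points (local motion, edges) with the mean valid ratio,
    # then filter to interpolate across them and suppress pixel noise.
    filled = np.where(valid_mask, ratios, ratios[valid_mask].mean())
    smooth = ndimage.gaussian_filter(filled, sigma=sigma)
    return smooth / smooth.max()          # normalize so the profile peaks at 1.0
```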
  • While conventional lens correction processes are characterized by poor performance; inaccurate estimation of spectra from white balance (e.g., different spectra can have the same white balance but different lens shading); inaccurate measurement extrapolation during manufacturing; costly tuning or calibration processes; limited applicability to fixed focus lenses (e.g., fixed optical patterns); etc., the dynamic lens shading estimation and correction methods disclosed herein improve upon the foregoing drawbacks. As flexible focusing or zoom controls gain popularity with digital cameras and become more complicated, dynamic estimation of lens shading based on current image captures, rather than preset measurements, will provide improved accuracy over current conventional processes. Contemplated advantages include improved image quality with low quality lenses in cellular phones and other camera applications; shorter time to market; a shorter calibration process for the camera in the product development stage; and reduced manufacturing cost to the camera vendor due to a shorter calibration process, or none at all, per sensor.
  • Mobile device 100 may comprise a variety of platforms in various embodiments. To illustrate, a smart phone electronic device 100 a is represented in FIG. 8, where the smart phone 100 a includes an optical system 101, at least one imaging device or sensor 102, at least one image processor 106 with lens shading sub-module 107, a power source 122, among other components (e.g., display 120, processor 114, etc.). Further, a tablet electronic device 100 b is represented in FIG. 9, where the tablet 100 b includes an optical system 101, at least one imaging device or sensor 102, at least one image processor 106 with lens shading sub-module 107, a power source 122, among other components (e.g., display 120, processor 114, etc.). Then, a laptop computer 100 c is represented in FIG. 10, where the laptop computer 100 c includes an optical system 101, at least one imaging device or sensor 102, at least one image processor 106 with lens shading sub-module 107, a power source 122, among other components (e.g., display 120, processor 114, etc.). Also, a digital camera electronic device 100 d is represented in FIG. 11, where the digital camera 100 d includes an optical system 101, at least one imaging device or sensor 102, at least one image processor 106 with lens shading sub-module 107, a power source 122, among other components (e.g., display 120, processor 114, etc.). Therefore, a variety of platforms of electronic mobile devices may be integrated with the image processor 106 and/or lens shading sub-module 107 of the various embodiments.
  • Embodiments of the present disclosure can be implemented in hardware, software, firmware, or a combination thereof. In some embodiments, the lens shading sub-module 107 is implemented in software or firmware that is stored in a memory and that is executed by a suitable instruction execution system. In some embodiments, the lens shading sub-module 107 comprises an ordered listing of executable instructions for implementing logical functions and can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “computer-readable medium” can be any means that can contain, store, communicate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-readable medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a nonexhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic) having one or more wires, a portable computer diskette (magnetic), a random access memory (RAM) (electronic), a read-only memory (ROM) (electronic), an erasable programmable read-only memory (EPROM or Flash memory) (electronic), an optical fiber (optical), and a portable compact disc read-only memory (CDROM) (optical).
  • If implemented in hardware, as in an alternative embodiment, the lens shading sub-module 107 can be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.
  • The flowcharts of FIGS. 4-6 show the architecture, functionality, and operation of a possible implementation of the image processor 106 and relevant sub-modules. In this regard, each block represents a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the blocks may occur out of the order noted in FIGS. 4-6. For example, two blocks shown in succession in FIGS. 4-6 may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved, as will be further clarified hereinbelow.
  • It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiments without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.

Claims (20)

1. An image processing method, comprising:
capturing a sequence of images via an image sensor, wherein the captured images have a global motion shift between them;
tracking a relative global motion displacement between the captured images;
calculating a ratio between an intensity level at a first image of the captured images and an intensity level of a same object point at a second image of the captured images; and
based on the ratios calculated, determining a lens shading surface profile.
2. The method of claim 1, further comprising:
determining a level of light source illuminating a scene recorded in the captured images and providing a measurement for white balance in the captured images.
3. An image processing system, comprising:
an image sensor to capture image data; and
at least one image processor that receives a plurality of captured image data and detects a global motion that is present during capturing of the captured image data, wherein the global motion is used to estimate effects of lens shading on the captured image data.
4. The system of claim 3, wherein the captured image data is not characterized by a flat illumination field.
5. The system of claim 3, wherein the at least one image processor dynamically corrects subsequently captured image data for the effects of lens shading.
6. The system of claim 5, wherein the at least one image processor corrects newly captured image data for the effects of lens shading using prestored corrective factors that are not based on the captured image data when a condition in which the captured image data is captured is determined to not be conducive to using corrective factors based on the captured image data.
7. The system of claim 5, wherein the at least one image processor corrects newly captured image data for the effects of lens shading using corrective factors that are based on the captured image data from a single image when a condition in which the captured image data is captured is determined to not be conducive to using corrective factors based on the captured image data from multiple images.
8. The system of claim 3, wherein the at least one image processor determines a white balance level of the captured image data based on an estimation of the effects of lens shading.
9. The system of claim 3, wherein the at least one image processor determines a motion estimation of the image sensor that captured the captured image data based on an estimation of the effects of lens shading.
10. The system of claim 3, wherein the at least one image processor continually captures the captured image data as part of a preview mode and continually determines the effects of lens shading during the preview mode.
11. An image processing method, comprising:
receiving at least one captured image via an image sensor;
detecting a global motion that is present during capturing of the at least one captured image; and
computing an estimate of effects of lens shading on the at least one captured image from changes in intensity values of pixels in the at least one captured image during a global motion shift.
12. The method of claim 11, wherein responsive to selecting to forgo computing the estimate of lens shading using inter image analysis of a plurality of captured image data, the estimate of the effects of lens shading is computed using intra image analysis of a single captured image, wherein the at least one captured image is the single captured image.
13. The method of claim 11, wherein the estimate of the effects of lens shading is computed using at least inter image analysis of a sequence of captured image data, wherein the at least one captured image is the sequence of captured image data.
14. The method of claim 11, further comprising:
dynamically correcting subsequently captured image data for the effects of lens shading that has been previously computed for current lighting conditions.
15. The method of claim 14, further comprising:
correcting newly captured image data for the effects of lens shading using prestored corrective factors that are not based on the at least one captured image when a condition in which the at least one captured image is captured is determined to not be conducive to using corrective factors based on the at least one captured image.
16. The method of claim 15, wherein the condition comprises a changing illumination level in the at least one captured image, wherein the at least one captured image comprises a plurality of captured images.
17. The method of claim 15, wherein the condition comprises a local motion being detected in a scene that is a subject of the at least one captured image.
18. The method of claim 11, further comprising:
determining a white balance level of the at least one captured image based on an estimation of the effects of lens shading.
19. The method of claim 11, further comprising:
determining a motion estimation of the image sensor that captured the at least one captured image based on an estimation of the effects of lens shading.
20. The method of claim 11, wherein the at least one captured image comprises a plurality of captured images that are continuously captured as part of a preview mode and the effects of lens shading are continuously determined during the preview mode.
US13/330,047 2011-07-20 2011-12-19 Dynamic computation of lens shading Abandoned US20130021484A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201161509747P true 2011-07-20 2011-07-20
US13/330,047 US20130021484A1 (en) 2011-07-20 2011-12-19 Dynamic computation of lens shading

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/330,047 US20130021484A1 (en) 2011-07-20 2011-12-19 Dynamic computation of lens shading

Publications (1)

Publication Number Publication Date
US20130021484A1 true US20130021484A1 (en) 2013-01-24

Family

ID=47555520

Family Applications (9)

Application Number Title Priority Date Filing Date
US13/232,045 Abandoned US20130021488A1 (en) 2011-07-20 2011-09-14 Adjusting Image Capture Device Settings
US13/232,052 Abandoned US20130021512A1 (en) 2011-07-20 2011-09-14 Framing of Images in an Image Capture Device
US13/235,975 Abandoned US20130021504A1 (en) 2011-07-20 2011-09-19 Multiple image processing
US13/245,941 Abandoned US20130021489A1 (en) 2011-07-20 2011-09-27 Regional Image Processing in an Image Capture Device
US13/281,521 Abandoned US20130021490A1 (en) 2011-07-20 2011-10-26 Facial Image Processing in an Image Capture Device
US13/313,352 Active 2032-01-11 US9092861B2 (en) 2011-07-20 2011-12-07 Using motion information to assist in image processing
US13/313,345 Abandoned US20130022116A1 (en) 2011-07-20 2011-12-07 Camera tap transcoder architecture with feed forward encode data
US13/330,047 Abandoned US20130021484A1 (en) 2011-07-20 2011-12-19 Dynamic computation of lens shading
US13/413,863 Abandoned US20130021491A1 (en) 2011-07-20 2012-03-07 Camera Device Systems and Methods

Country Status (1)

Country Link
US (9) US20130021488A1 (en)

Families Citing this family (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101796481B1 (en) * 2011-11-28 2017-12-04 삼성전자주식회사 Method of eliminating shutter-lags with low power consumption, camera module, and mobile device having the same
US9118876B2 (en) * 2012-03-30 2015-08-25 Verizon Patent And Licensing Inc. Automatic skin tone calibration for camera images
US9472005B1 (en) * 2012-04-18 2016-10-18 Amazon Technologies, Inc. Projection and camera system for augmented reality environment
US9619036B2 (en) * 2012-05-11 2017-04-11 Comcast Cable Communications, Llc System and methods for controlling a user experience
US9438805B2 (en) * 2012-06-08 2016-09-06 Sony Corporation Terminal device and image capturing method
US8957973B2 (en) * 2012-06-11 2015-02-17 Omnivision Technologies, Inc. Shutter release using secondary camera
TWI498771B (en) * 2012-07-06 2015-09-01 Pixart Imaging Inc Gesture recognition system and glasses with gesture recognition function
KR101917650B1 (en) * 2012-08-03 2019-01-29 삼성전자 주식회사 Method and apparatus for processing a image in camera device
US9554042B2 (en) * 2012-09-24 2017-01-24 Google Technology Holdings LLC Preventing motion artifacts by intelligently disabling video stabilization
JP2014086849A (en) * 2012-10-23 2014-05-12 Sony Corp Content acquisition device and program
JP2014176034A (en) * 2013-03-12 2014-09-22 Ricoh Co Ltd Video transmission device
US9552630B2 (en) * 2013-04-09 2017-01-24 Honeywell International Inc. Motion deblurring
US9595083B1 (en) * 2013-04-16 2017-03-14 Lockheed Martin Corporation Method and apparatus for image producing with predictions of future positions
US9916367B2 (en) 2013-05-03 2018-03-13 Splunk Inc. Processing system search requests from multiple data stores with overlapping data
US8738629B1 (en) 2013-05-03 2014-05-27 Splunk Inc. External Result Provided process for retrieving data stored using a different configuration or protocol
US10003792B2 (en) 2013-05-27 2018-06-19 Microsoft Technology Licensing, Llc Video encoder for images
US20140368514A1 (en) * 2013-06-12 2014-12-18 Infineon Technologies Ag Device, method and system for processing an image data stream
US9529513B2 (en) * 2013-08-05 2016-12-27 Microsoft Technology Licensing, Llc Two-hand interaction with natural user interface
DE112014004664T5 (en) * 2013-10-09 2016-08-18 Magna Closures Inc. Display control for vehicle window
US9251594B2 (en) * 2014-01-30 2016-02-02 Adobe Systems Incorporated Cropping boundary simplicity
US9245347B2 (en) * 2014-01-30 2016-01-26 Adobe Systems Incorporated Image Cropping suggestion
US10121060B2 (en) * 2014-02-13 2018-11-06 Oath Inc. Automatic group formation and group detection through media recognition
KR20150098094A (en) * 2014-02-19 2015-08-27 삼성전자주식회사 Image Processing Device and Method including a plurality of image signal processors
CN103841328B (en) * 2014-02-27 2015-03-11 深圳市中兴移动通信有限公司 Low-speed shutter shooting method and device
CN105359531B (en) 2014-03-17 2019-08-06 微软技术许可有限责任公司 Method and system for determining for the coder side of screen content coding
US20150297986A1 (en) * 2014-04-18 2015-10-22 Aquifi, Inc. Systems and methods for interactive video games with motion dependent gesture inputs
US10051196B2 (en) * 2014-05-20 2018-08-14 Lenovo (Singapore) Pte. Ltd. Projecting light at angle corresponding to the field of view of a camera
CA2955973C (en) * 2014-08-06 2018-05-22 Patrick Gooi Orientation system for image recording devices
US10116839B2 (en) * 2014-08-14 2018-10-30 Atheer Labs, Inc. Methods for camera movement compensation for gesture detection and object recognition
KR20160048532A (en) * 2014-10-24 2016-05-04 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN105549302B (en) 2014-10-31 2018-05-08 国际商业机器公司 Photographic imaging range of the imaging equipment presentation device
KR20160077861A (en) * 2014-12-24 2016-07-04 삼성전자주식회사 Electronic device and method for providing an information related to communication in electronic device
WO2016183380A1 (en) * 2015-05-12 2016-11-17 Mine One Gmbh Facial signature methods, systems and software
US20160316220A1 (en) * 2015-04-21 2016-10-27 Microsoft Technology Licensing, Llc Video encoder management strategies
US10165186B1 (en) * 2015-06-19 2018-12-25 Amazon Technologies, Inc. Motion estimation based video stabilization for panoramic video from multi-camera capture device
US10136132B2 (en) 2015-07-21 2018-11-20 Microsoft Technology Licensing, Llc Adaptive skip or zero block detection combined with transform size decision
EP3136726B1 (en) * 2015-08-27 2018-03-07 Axis AB Pre-processing of digital images
US9648223B2 (en) * 2015-09-04 2017-05-09 Microvision, Inc. Laser beam scanning assisted autofocus
US9456195B1 (en) * 2015-10-08 2016-09-27 Dual Aperture International Co. Ltd. Application programming interface for multi-aperture imaging systems
US9578221B1 (en) * 2016-01-05 2017-02-21 International Business Machines Corporation Camera field of view visualizer
JP6514140B2 (en) * 2016-03-17 2019-05-15 株式会社東芝 Imaging support apparatus, method and program
US9639935B1 (en) 2016-05-25 2017-05-02 Gopro, Inc. Apparatus and methods for camera alignment model calibration
WO2017205597A1 (en) * 2016-05-25 2017-11-30 Gopro, Inc. Image signal processing-based encoding hints for motion estimation
US10140776B2 (en) * 2016-06-13 2018-11-27 Microsoft Technology Licensing, Llc Altering properties of rendered objects via control points
US9851842B1 (en) * 2016-08-10 2017-12-26 Rovi Guides, Inc. Systems and methods for adjusting display characteristics
US10366122B2 (en) * 2016-09-14 2019-07-30 Ants Technology (Hk) Limited. Methods circuits devices systems and functionally associated machine executable code for generating a searchable real-scene database
US20180109722A1 (en) * 2016-10-18 2018-04-19 Light Labs Inc. Methods and apparatus for receiving, storing and/or using camera settings and/or user preference information
KR20190075988A (en) * 2016-10-26 2019-07-01 오캠 테크놀로지스 리미티드 Wearable devices and methods that analyze images and provide feedback
CN106550227B (en) * 2016-10-27 2019-02-22 成都西纬科技有限公司 A kind of image saturation method of adjustment and device

Family Cites Families (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100325253B1 (en) * 1998-05-19 2002-03-04 미야즈 준이치롯 Motion vector search method and apparatus
US20010047517A1 (en) * 2000-02-10 2001-11-29 Charilaos Christopoulos Method and apparatus for intelligent transcoding of multimedia data
JP2001245303A (en) * 2000-02-29 2001-09-07 Toshiba Corp Moving picture coder and moving picture coding method
US6407680B1 (en) * 2000-12-22 2002-06-18 Generic Media, Inc. Distributed on-demand media transcoding system and method
US7034848B2 (en) * 2001-01-05 2006-04-25 Hewlett-Packard Development Company, L.P. System and method for automatically cropping graphical images
KR100582628B1 (en) * 2001-05-31 2006-05-23 마쯔시다덴기산교 가부시키가이샤 Information storing apparatus and method therefor
US7801215B2 (en) * 2001-07-24 2010-09-21 Sasken Communication Technologies Limited Motion estimation technique for digital video encoding applications
US20030126622A1 (en) * 2001-12-27 2003-07-03 Koninklijke Philips Electronics N.V. Method for efficiently storing the trajectory of tracked objects in video
KR100850705B1 (en) * 2002-03-09 2008-08-06 삼성전자주식회사 Method for adaptive encoding motion image based on the temperal and spatial complexity and apparatus thereof
US20130107938A9 (en) * 2003-05-28 2013-05-02 Chad Fogg Method And Apparatus For Scalable Video Decoder Using An Enhancement Stream
JP4275358B2 (en) * 2002-06-11 2009-06-10 株式会社日立製作所 Picture information converting apparatus and the bit stream converter, and an image information converting transmission method
US20040131276A1 (en) * 2002-12-23 2004-07-08 John Hudson Region-based image processor
EP1577705B1 (en) * 2002-12-25 2018-08-01 Nikon Corporation Blur correction camera system
KR100566290B1 (en) * 2003-09-18 2006-03-30 삼성전자주식회사 Image Scanning Method By Using Scan Table and Discrete Cosine Transform Apparatus adapted it
JP4123171B2 (en) * 2004-03-08 2008-07-23 ソニー株式会社 The method of manufacturing the vibration type gyro sensor element, vibrating gyroscopic sensor element and the vibration direction adjusting method
WO2005094270A2 (en) * 2004-03-24 2005-10-13 Sharp Laboratories Of America, Inc. Methods and systems for a/v input device to diplay networking
US8315307B2 (en) * 2004-04-07 2012-11-20 Qualcomm Incorporated Method and apparatus for frame prediction in hybrid video compression to enable temporal scalability
US20060109900A1 (en) * 2004-11-23 2006-05-25 Bo Shen Image data transcoding
JP2006203682A (en) * 2005-01-21 2006-08-03 Nec Corp Converting device of compression encoding bit stream for moving image at syntax level and moving image communication system
TW200816798A (en) * 2006-09-22 2008-04-01 Altek Corp Method of automatic shooting by using an image recognition technology
US7843824B2 (en) * 2007-01-08 2010-11-30 General Instrument Corporation Method and apparatus for statistically multiplexing services
US7924316B2 (en) * 2007-03-14 2011-04-12 Aptina Imaging Corporation Image feature identification and motion compensation apparatus, systems, and methods
JP4983917B2 (en) * 2007-05-23 2012-07-25 日本電気株式会社 Moving image distribution system, conversion device, and moving image distribution method
KR20100031755A (en) * 2007-07-30 2010-03-24 닛본 덴끼 가부시끼가이샤 Connection terminal, distribution system, conversion method, and program
US20090060039A1 (en) * 2007-09-05 2009-03-05 Yasuharu Tanaka Method and apparatus for compression-encoding moving image
US8098732B2 (en) * 2007-10-10 2012-01-17 Sony Corporation System for and method of transcoding video sequences from a first format to a second format
US8063942B2 (en) * 2007-10-19 2011-11-22 Qualcomm Incorporated Motion assisted image sensor configuration
US8170342B2 (en) * 2007-11-07 2012-05-01 Microsoft Corporation Image recognition of content
JP2009152672A (en) * 2007-12-18 2009-07-09 Samsung Techwin Co Ltd Recording apparatus, reproducing apparatus, recording method, reproducing method, and program
JP5242151B2 (en) * 2007-12-21 2013-07-24 セミコンダクター・コンポーネンツ・インダストリーズ・リミテッド・ライアビリティ・カンパニー Vibration correction control circuit and imaging apparatus including the same
JP2009159359A (en) * 2007-12-27 2009-07-16 Samsung Techwin Co Ltd Moving image data encoding apparatus, moving image data decoding apparatus, moving image data encoding method, moving image data decoding method and program
US20090217338A1 (en) * 2008-02-25 2009-08-27 Broadcom Corporation Reception verification/non-reception verification of base/enhancement video layers
US20090323810A1 (en) * 2008-06-26 2009-12-31 Mediatek Inc. Video encoding apparatuses and methods with decoupled data dependency
US7990421B2 (en) * 2008-07-18 2011-08-02 Sony Ericsson Mobile Communications Ab Arrangement and method relating to an image recording device
JP2010039788A (en) * 2008-08-05 2010-02-18 Toshiba Corp Image processing apparatus and method thereof, and image processing program
JP2010147808A (en) * 2008-12-18 2010-07-01 Olympus Imaging Corp Imaging apparatus and image processing method in same
US8311115B2 (en) * 2009-01-29 2012-11-13 Microsoft Corporation Video encoding using previously calculated motion information
US9009338B2 (en) * 2009-03-03 2015-04-14 Viasat, Inc. Space shifting over return satellite communication channels
US8520083B2 (en) * 2009-03-27 2013-08-27 Canon Kabushiki Kaisha Method of removing an artefact from an image
US20100309975A1 (en) * 2009-06-05 2010-12-09 Apple Inc. Image acquisition and transcoding system
JP5473536B2 (en) * 2009-10-28 2014-04-16 京セラ株式会社 Portable imaging device with projector function
US20110170608A1 (en) * 2010-01-08 2011-07-14 Xun Shi Method and device for video transcoding using quad-tree based mode selection
US8681255B2 (en) * 2010-09-28 2014-03-25 Microsoft Corporation Integrated low power depth camera and projection device
US9007428B2 (en) * 2011-06-01 2015-04-14 Apple Inc. Motion-based image stitching
US8554011B2 (en) * 2011-06-07 2013-10-08 Microsoft Corporation Automatic exposure correction of images

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030063816A1 (en) * 1998-05-27 2003-04-03 Industrial Technology Research Institute, A Taiwanese Corporation Image-based method and system for building spherical panoramas
US20080074500A1 (en) * 1998-05-27 2008-03-27 Transpacific Ip Ltd. Image-Based Method and System for Building Spherical Panoramas
US20060268131A1 (en) * 2002-06-21 2006-11-30 Microsoft Corporation System and method for camera calibration and images stitching
US20100194851A1 (en) * 2009-02-03 2010-08-05 Aricent Inc. Panorama image stitching

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120249799A1 (en) * 2011-03-30 2012-10-04 Yukiko Shibata Image capture device, pixel output level compensation method for same, infrared camera system, and interchangeable lens system
US9170161B2 (en) * 2011-03-30 2015-10-27 Nippon Avionics Co., Ltd. Image capture device, pixel output level compensation method for same, infrared camera system, and interchangeable lens system
US9066015B2 (en) 2011-04-28 2015-06-23 Nippon Avionics Co., Ltd. Image capture device, method for generating image, infrared camera system, and interchangeable lens system
US9270959B2 (en) 2013-08-07 2016-02-23 Qualcomm Incorporated Dynamic color shading correction
US9973672B2 (en) 2013-12-06 2018-05-15 Huawei Device (Dongguan) Co., Ltd. Photographing for dual-lens device using photographing environment determined using depth estimation

Also Published As

Publication number Publication date
US20130021512A1 (en) 2013-01-24
US20130022116A1 (en) 2013-01-24
US20130021488A1 (en) 2013-01-24
US20130021490A1 (en) 2013-01-24
US20130021491A1 (en) 2013-01-24
US20130021483A1 (en) 2013-01-24
US9092861B2 (en) 2015-07-28
US20130021504A1 (en) 2013-01-24
US20130021489A1 (en) 2013-01-24

Legal Events

Date Code Title Description
AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SOREK, NOAM;VITSNUDEL, ILIA;REEL/FRAME:027410/0777

Effective date: 20111218

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001

Effective date: 20160201

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001

Effective date: 20170120

AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001

Effective date: 20170119