US20100225783A1 - Temporally Aligned Exposure Bracketing for High Dynamic Range Imaging - Google Patents

Temporally Aligned Exposure Bracketing for High Dynamic Range Imaging

Info

Publication number
US20100225783A1
US20100225783A1 (application US 12/717,765)
Authority
US
Grant status
Application
Prior art keywords
image
system
exposure
prism
temporally aligned
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12717765
Inventor
Paul A. Wagner
Original Assignee
Wagner Paul A
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; studio devices; studio equipment; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225 Television cameras; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/2251 Constructional details
    • H04N5/2254 Mounting of optical parts, e.g. lenses, shutters, filters or optical parts peculiar to the presence or use of an electronic image sensor
    • H04N5/235 Circuitry or methods for compensating for variation in the brightness of the object, e.g. based on electric image signals provided by an electronic image sensor
    • H04N5/2355 Compensation by increasing the dynamic range of the final image compared to the dynamic range of the electronic image sensor, e.g. by adding correctly exposed portions of short and long exposed images
    • H04N5/2356 Bracketing, i.e. taking a series of images with varying exposure conditions

Abstract

The invention provides an optical imaging system for temporally aligning bracketed exposures of a single image, the system comprising a light aperture, a prism and an image capturing device, where the prism is capable of splitting an incoming image from the light aperture into at least two temporally aligned images, and where the image capturing device captures the temporally aligned images at different levels of exposure.

Description

    REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of U.S. Provisional Patent Application No. 61/157,494, filed Mar. 4, 2009, the complete disclosure of which is incorporated herein in its entirety.
  • COPYRIGHT NOTICE
  • A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the United States Patent and Trademark Office patent files and records, but otherwise reserves all other copyright rights.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • This invention relates generally to imaging systems, and more particularly, to imaging systems that provide varying exposures for production of high dynamic range images.
  • 2. Description of Related Art
  • High dynamic range imaging (HDRI) is a term applied in image processing, computer graphics and photography, and generally relates to systems or techniques for providing a greater dynamic range of exposures. HDRI is most commonly employed in situations where the range between light and dark areas is great, and consequently a normal exposure, or even a digitally enhanced exposure, is not adequate to resolve all of the image area.
  • HDRI manipulates images and exposures to accurately represent the wide range of intensity levels found in real scenes, from direct sunlight to shadows. With HDRI, the user employs multiple exposures and bracketing with photo merging, to get greater detail throughout the tonal range.
  • More particularly, HDRI processing involves merging several exposures of a given scene into a (typically 32-bit) HDRI source file, which is then “tone mapped” to produce an image in which adjustments to light and contrast are applied locally to the HDRI source image.
  • HDRI images are best captured originally in a digital format with a much higher bit depth than the current generation of digital imaging devices provides. Current devices are built around an 8-bit-per-channel architecture, meaning that both the cameras and the output displays have a maximum tonal range of 8 bits per RGB color channel.
  • HDRI formats are typically 32 bits per channel. A few next-generation cameras and displays are capable of handling this kind of imagery natively. It will probably be quite a few years until HDRI displays become common, but HDRI cameras and acquisition techniques are already emerging.
  • HDRI images are typically tone-mapped back to 8 bits per channel, essentially compressing the extended information into the smaller dynamic range. This is typically done automatically with a variety of existing software algorithms, or manually with artistic input through programs like Adobe Photoshop.
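  • As an illustration of this compression step, one widely used global operator simply maps each pixel's luminance L to L/(1+L) before re-encoding to 8 bits. The following minimal sketch assumes a linear floating-point HDR image; the operator choice and the function names are illustrative only and are not prescribed by this application.

    # Minimal sketch: global Reinhard-style tone mapping of a linear HDR image.
    import numpy as np

    def tone_map_global(hdr_rgb):
        """Compress a float HDR image (H x W x 3, linear light) to 8 bits per channel."""
        # Luminance via Rec. 709 weights
        lum = 0.2126 * hdr_rgb[..., 0] + 0.7152 * hdr_rgb[..., 1] + 0.0722 * hdr_rgb[..., 2]
        lum_mapped = lum / (1.0 + lum)                       # compress extended range into [0, 1)
        scale = lum_mapped / np.maximum(lum, 1e-12)          # per-pixel scale applied to all channels
        ldr = np.clip(hdr_rgb * scale[..., None], 0.0, 1.0)
        return (ldr ** (1.0 / 2.2) * 255.0).astype(np.uint8)  # gamma-encode to 8-bit output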
  • So in a typical HDRI workflow the artist first captures the HDRI image, and then the image is tone-mapped to the desired output device, such as ink on paper, an 8-bit RGB monitor, or even a 32-bit HDRI monitor (requiring no tone mapping).
  • The real challenge with HDRI is not the file formats or the computer algorithms that tone map them to 8-bit displays; those challenges have already been largely met. For example, OpenEXR is a robust open-source HDRI format developed by Industrial Light and Magic. The hardest part of HDR imaging is the physical device used to capture the imagery. So far only two ways of capturing HDR images are available.
  • The first is to use exotic high-end cameras with special imaging chips (CMOS or CCD) like the Spheron HDR. Both CCD (charge-coupled device) and CMOS (complementary metal-oxide semiconductor) image sensors convert light into electrons, though CMOS sensors are much less expensive to manufacture than CCD sensors. These types of cameras are typically used by professionals in controlled environments for the primary purpose of creating spherical photos to illuminate computer generated images (another important use of HDRI). They are not point-and-shoot cameras and are not capable of motion photography.
  • The second is shooting multiple varying exposures in rapid succession (known as exposure bracketing) and then combining those images, taking the highlights from the underexposed images, midtones from the normally exposed images, and shadows from the overexposed images, to create a composite HDR image that retains massive detail in the highlights and shadows where normal cameras would lose detail.
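  • A minimal sketch of such a merge is shown below; it assumes the bracketed frames have already been linearized, normalized to [0, 1] and spatially aligned, and it uses an illustrative mid-tone weighting (the names and the weighting scheme are this example's assumptions, not a prescribed algorithm).

    # Minimal sketch: weighted merge of bracketed exposures into one linear HDR image.
    import numpy as np

    def merge_brackets(frames, exposures):
        """frames: list of float arrays in [0, 1]; exposures: relative exposure times."""
        acc = np.zeros_like(frames[0], dtype=np.float64)
        weight_sum = np.zeros_like(acc)
        for img, t in zip(frames, exposures):
            w = 1.0 - np.abs(2.0 * img - 1.0)   # favor well-exposed (mid-tone) pixels
            acc += w * (img / t)                # rescale each frame to a common radiance estimate
            weight_sum += w
        return acc / np.maximum(weight_sum, 1e-6)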
  • Both of these techniques have substantial disadvantages. The second technique can be done with conventional hardware, but it is time-consuming and takes substantial expertise to pull off. In addition, because the images are not temporally aligned, meaning they were taken one after another at different moments in time, there can be changes in the scene that produce artifacts when the HDRI software attempts to eliminate or synthesize the objects in motion across the frame. An example would be a car moving through the frame.
  • Even a slight movement of the camera between exposures will be noticeable in the resulting combined image, and moving objects will be “ghosted” in the HDRI image. As such, this technique is totally useless for motion photography and can be used with substantial success only in still photography applications.
  • For this reason, exposure-bracketed HDRI is typically restricted to still subjects; anything that is shifting within the frame (animals, cars, pedestrians, moving leaves or litter, clouds, and so on) will preclude HDRI, or at the very least lead to unhappy results.
  • Further, producing HDRI from multiple images can be a time-consuming and frustrating task. HDRI requires multiple huge files, multiple steps, and typically specialized and complicated software.
  • The first technique is very expensive and requires exotic hardware or sophisticated electronic and software systems. While imaging chips are moving ever forward in sensitivity and dynamic range, they still do not produce the dramatic results that the second technique of changing exposures does. In addition, these special cameras are not capable of the higher frame rates required to shoot motion pictures. These products are used for narrow specialized purposes.
  • Proposed solutions to the problems associated with the second technique are reflected in various published patents at the United States Patent and Trademark Office. For example, United States Patent Application No. 20060221209, to McGuire, et al., published Oct. 5, 2006, teaches an apparatus and method for acquiring and combining images of a scene with multiple optical characteristics at multiple resolutions. Disclosed therein is a camera system that acquires multiple optical characteristics at multiple resolutions of a scene. The camera system includes multiple optical elements arranged as a tree having multiple nodes connected by edges. The system employs filters at the end of the chain, and lenses are placed in front of each of the sensors, creating additional sources of optical distortion.
  • United States Patent Application No. 20070126918, to Lee, published Jun. 7, 2007, discloses cameras that can provide improved images by combining several shots of a scene taken with different exposure and focus levels. In addition, cameras are provided which have pixel-wise exposure control means so that high quality images are obtained for a scene with a high level of contrast. The system is complicated, and employs light-reducing filters to create exposures of varying intensity. Much of the light is lost, reducing clarity and introducing sources of distortion and noise to the images.
  • United States Patent Application No. 20080149812, to Ward, et al., published Jun. 26, 2008, discloses an electronic camera comprising two or more image sensor arrays. At least one of the image sensor arrays has a high dynamic range. The camera also comprises a shutter for selectively allowing light to reach the two or more image sensor arrays, readout circuitry for selectively reading out pixel data from the image sensor arrays, and, a controller configured to control the shutter and the readout circuitry. The controller comprises a processor and a memory having computer-readable code embodied therein which, when executed by the processor, causes the controller to open the shutter for an image capture period to allow the two or more image sensor arrays to capture pixel data, and, read out pixel data from the two or more image sensor arrays. This is essentially a total digital solution to the problem of controlling exposure levels for different images for high dynamic range processing.
  • Finally, United States Patent Application No. 20070177004, to Kolehmainen, et al., published Aug. 2, 2007, is directed to an image creating method and imaging device comprising at least two image capturing apparatus, each apparatus being arranged to produce an image. The apparatus is configured to utilize at least a portion of the images produced with the different image capturing apparatus to produce an image with an enhanced image quality. Multiple lenses are required to implement this method, which is expensive and creates parallax and optical image distortions with each added lens.
  • None of the prior approaches provides a simple means for capturing multiple images that overcomes the difficulties of temporal misalignment and that allows the images to be simply and quickly resolved into a high dynamic range image.
  • What is needed is an inexpensive solution that can be easily integrated into products with conventional form factors. This solution would ideally be easy to use, compact, and able to shoot at high frame rates with no introduction of temporal alignment problems and associated artifacts.
  • SUMMARY OF THE INVENTION
  • By this invention is provided an optical imaging system for temporally aligning bracketed exposures of a single image, the system comprising a light aperture, a prism and an image capturing device, where the prism is capable of splitting an incoming image from the light aperture into at least two temporally aligned images, and where the image capturing device captures the temporally aligned images at different levels of exposure.
  • In one embodiment of the invention, the prism splits the intensity of said incoming image to achieve a desired EV output interval between temporally aligned images.
  • In a different embodiment, the capturing device further comprises image detection sensors, and the ISO of the sensors is adjusted to achieve a desired EV output interval between said images.
  • In another aspect of the invention, the system comprises an image processing device connected to said image capturing device.
  • In one embodiment, the image processing device comprises a computer processor.
  • In a different embodiment, the device further comprises a tone-mapping processor.
  • In a different aspect, the system comprises an eyepiece for viewing the image to be captured by the lens.
  • In a still further aspect, the system comprises a digital readout monitor.
  • In another embodiment, the prism is capable of splitting the image into three or more levels of exposure.
  • In a different embodiment, the three levels of exposure are about 14%, about 29% and about 57%, respectively, of the exposure level of the original image.
  • In a different embodiment, the three levels of exposure are about 5%, about 19% and about 76%, respectively, of the exposure level of the original image.
  • In a different embodiment, the three levels of exposure are about 1%, about 11% and about 88%, respectively, of the exposure level of the original image.
  • In a still different embodiment, the prism is capable of splitting the image into four or more levels of exposure.
  • In another embodiment, the prism is capable of splitting the image into five or more levels of exposure.
  • In a different aspect, the invention provides a method for temporally aligning bracketed exposures of a single image, the method comprising the steps of a) using a prism to split an incoming image from a light aperture into at least two temporally aligned images, and b) using an image capturing device to capture the temporally aligned images at different levels of exposure.
  • These and other features and advantages of this invention are described in, or are apparent from, the following detailed description of various exemplary embodiments of the apparatus and methods according to this invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A more complete understanding of the present invention and the attendant features and advantages thereof may be had by reference to the following detailed description when considered in conjunction with the accompanying drawings wherein:
  • FIG. 1 shows a diagrammatic view of the system produced according to the invention, demonstrating how variations to exposure intervals can be achieved using different combinations of prism splits and sensor sensitivity settings.
  • FIG. 2 shows a diagrammatic view of a system of FIG. 1 and further showing additional components of the system for processing the images.
  • FIG. 3 shows a perspective drawing of a two-way prism that could be utilized with the invention.
  • FIG. 4 shows a perspective drawing of a three-way prism that could be utilized with the invention.
  • FIG. 5 shows a perspective drawing of a four-way prism that could be utilized with the invention.
  • FIG. 6 shows a perspective drawing of a five-way prism that could be utilized with the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The optical imaging system of the present invention provides an improvement to high dynamic range imaging, and assemblies therefor, that allows temporally aligned exposure bracketing. The system is simple, elegant, leverages existing technologies, allows for motion capture with no temporal distortion, and is relatively inexpensive to implement.
  • The present optical imaging system allows the user to capture light with confidence that the under and over exposed regions in the image will be imaged properly. The user simply captures all the available light with an image capturing device, and determines later how to map that information to the output device. With the optical imaging system the user can create stunning imagery that is otherwise impossible to capture, even with the most sophisticated of the current generation of normal photography equipment.
  • Before the present invention is described in greater detail, it is to be understood that this invention is not limited to particular embodiments described, as such may, of course, vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting, since the scope of the present invention will be limited only by the appended claims.
  • Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Although any materials similar or equivalent to those described herein can also be used in the practice or testing of the present invention, the preferred materials are now described.
  • All publications and patents cited in this specification are herein incorporated by reference as if each individual publication or patent were specifically and individually indicated to be incorporated by reference and are incorporated herein by reference to disclose and describe the materials in connection with which the publications are cited. The citation of any publication is for its disclosure prior to the filing date and should not be construed as an admission that the present invention is not entitled to antedate such publication by virtue of prior invention. Further, the dates of publication provided may be different from the actual publication dates which may need to be independently confirmed.
  • It must be noted that as used herein and in the appended claims, the singular forms “a,” “an”, and “the” include plural referents unless the context clearly dictates otherwise. It is further noted that the claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as “solely,” “only” and the like in connection with the recitation of claim elements, or use of a “negative” limitation.
  • As will be apparent to those of skill in the art upon reading this disclosure, each of the individual embodiments described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several embodiments without departing from the scope or spirit of the present invention.
  • For example, although the foregoing drawings and references refer to color images and processors, the system and methods work equally well for black and white (grayscale) images and sensors. For instance, some applications for scientific or industrial use may prefer grayscale imagery.
  • Further, while unusual in present day camera art, it is possible to build an imaging apparatus without a primary lens (i.e., a pinhole camera or a slit scanner). These applications are more likely in industrial or scientific applications. The invention can easily be adapted for designs that don't include a front end lens, but rather a simple aperture or the like.
  • Generally speaking, the systems and methods use a prism to split the full spectrum into bracketed exposures delivered to several image detecting sensors of an image capturing device. The system eliminates exotic image sensors as a necessary feature. It allows multiple simultaneous exposures from existing commodity sensors by simply dividing the incoming light into multiple, different levels of exposure of the same image.
  • The temporally aligned imaging system can be analogized to Technicolor. Before color film stock was developed, Hollywood was in search of a way to shoot films in color. Technicolor, Inc. was the first company to develop a way to create color pictures from black and white film stock. It utilized three rolls of black and white film exposed simultaneously through a special set of beam splitters with red, green, and blue filters on them.
  • Simply put, each black and white film negative recorded just the red, green, or blue information. This process was done in reverse with a projector that ran all three rolls of film simultaneously with the correct color filter in front of each. When the images are aligned properly, a full color picture is realized.
  • As better color film stocks emerged, this process fell out of favor, until video cameras emerged. In the early days of video, color sensors were not very sharp, and had difficulty producing high resolution images, or good color saturation and reproduction. Black and white sensors were far sharper and had a higher dynamic range. So the Technicolor principle of using three image sensors and a beam splitter to feed each an identical simultaneous image was dusted off and put into use for a new generation of imaging products. Three black and white CCDs were used with a new and vastly improved beam splitter called a trichroic prism.
  • This technique is used to this day in professional level video cameras, sometimes referred to as 3CCD cameras. The three red, green and blue sensors not only allow for sharper, more saturated colors but also help enhance the dynamic range of the images they help create. But just as better color film stocks helped to usher out the era of the Technicolor process, better CMOS and CCD sensors are ushering out the era of 3CCD sensor systems in favor of full color single sensor systems. In fact, some of the highest-end professional cameras, like the lineup from RED Digital Cinema Camera Company, as well as every professional digital SLR, use only one full color sensor. It is quite apparent that sensor technology has progressed to the point where a single color sensor can replace and even outperform 3CCD sensor systems.
  • In one aspect, the temporally aligned exposure bracketing system employs trichroic prisms adapted to split the entire spectrum to each of multiple full color sensors, at different exposure levels, rather than splitting out the spectrum into different colors.
  • The system allows a color neutral change in the amount, rather than the spectrum, of light going to each sensor, by the application of such prisms for the temporal alignment of images for HDRI. By “color neutral”, it is meant that while the temporally aligned images created by the prism may vary in intensity between themselves, or between themselves and the incoming image, they are not substantially different from one another in color spectrum, i.e., the prism creates split images that are similar in color spectrum, or spectrally neutral, even if differing substantially in intensity.
  • All of the commonly understood color separation prism layouts may also be used for neutral separation. In reference now to FIG. 1, the system 10 comprises an optical imaging system having an aperture 20 for capturing incoming light 30. Internal to the system is a neutral prism 100 that is used to reflect the captured light to generate a color-neutral separation.
  • In FIG. 1, the neutral film prism 100 is depicted as a three-way prism that splits the light to three separate full color image sensors 101, 102 and 103. Various means can be employed to adjust the EV (Exposure Value, commonly referred to as a “stop”) up and down within the intensity spectrum, and a camera can then capture the images simultaneously. In FIG. 1, two consecutive neutral films 104 and 105 are used, capturing 57.1429% (4/7) of the incoming light and then 33.33% (1/3) of the remaining light, respectively. The neutral prism thus fractionates a captured image into three temporally aligned exposures 106, 107 and 108 that have relative light intensities of 1/7, 2/7 and 4/7 of the incoming light.
  • The film coatings 104 and 105 for the prism 100 may be any of numerous coatings known to the art that are capable of achieving a color neutral split, or separation, of the image by reflection of the incoming light 30. Two examples of such spectrally neutral films are a thin-film metallic coating, typically aluminum or silver, with or without a set of dielectric layers, and a stack of dielectric layers of alternating high and low refractive index materials designed to reflect a certain percentage of the incident light over the visible wavelength range. These and related types of thin film coatings 104 and 105 shall be termed “spectrally neutral film” or, alternatively, “neutral film.”
  • The following table demonstrates the calculation of the percentages for such a system, using a prism to split a captured image into temporally aligned exposures 106, 107 and 108 at levels of 14.2857%, 28.5714% and 57.1429%, respectively.
  • TABLE 1
    sensor         percent light   ratio light
    +1 EV          14.2857%        1/7
    standard EV    28.5714%        2/7
    −1 EV          57.1429%        4/7

    neutral film percent   neutral film ratio
    57.1429%               4/7
    33.3333%               1/3
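  • These percentages follow from simple arithmetic on the light remaining after each film. The short sketch below (helper name assumed for illustration) reproduces the film reflectances of Tables 1, 3 and 5 from the target sensor fractions, listed in the order the prism peels them off, with the last fraction passing straight through.

    # Sketch: successive neutral-film reflectances from target per-sensor light fractions.
    def film_reflectances(sensor_fractions):
        remaining = 1.0
        films = []
        for frac in sensor_fractions[:-1]:      # the last sensor receives whatever is left
            films.append(frac / remaining)      # reflectance relative to the light still remaining
            remaining -= frac
        return films

    print(film_reflectances([4/7, 1/7, 2/7]))      # [0.5714..., 0.3333...]  (Table 1)
    print(film_reflectances([16/21, 1/21, 4/21]))  # [0.7619..., 0.2]        (Table 3)
    print(film_reflectances([64/73, 1/73, 8/73]))  # [0.8767..., 0.1111...]  (Table 5)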
  • Thus, with color image sensors that do not need the RGB color split, the prism is harnessed for the purpose of splitting out different exposures of the same image that are temporally aligned (taken at the same moment).
  • Various means can be employed to adjust the EV (Exposure Value, commonly referred to as a “stop”) up and down across the sensors while still allowing the camera to capture the images simultaneously. For instance, this can be accomplished by splitting the incoming light into different intensities directly in the prism, by adjusting the ISO sensitivity in the sensors, or by some combination of the two.
  • At one extreme, the system could split the light intensity in the prism 100 into equal amounts of roughly 33% each and then adjust the ISO of the sensors 101, 102 and 103 respectively to achieve different EV output intervals. At another extreme, the system could split the light intensity within the prism 100 into the desired EV intervals for the light 106, 107 and 108. Thus, even while leaving the ISO of the sensors the same, the desired different EV output intervals are achieved for the recorded images. Any combination between these two extremes may be more or less desirable for various applications.
  • FIG. 2 illustrates some additional components of a system 10. In FIG. 2 is seen the deployment of a tone mapping processor 110 and an HDRI processor 120 that are used for combining the images. One processing chip is used to combine the three images in real time into an HDRI image, and another chip is used to complete the tone mapping. These functions can also be combined into a single processing chip.
  • Systems for controlling the action of the lens and associated hardware, including light responsive software controllers, are well known to the art.
  • In addition, the individual sensors could benefit from some tuning for their respective exposure levels to reduce noise and other artifacts associated with under and over exposure, in ways known to the art.
  • A high quality standard camera lens 140 can be used with the system 10 to gather and focus light from the light aperture.
  • The system 10 also will typically include an eyepiece and/or monitor 150 for aligning the images for capture from the lens onto the sensors.
  • Additional features of the system typically would include mass storage for either the 8-bit tone-mapped data 160 or the raw 32-bit HDRI data 170. Other HDRI formats are known, for instance 16-bit and 14-bit formats, though the standard is evolving toward the higher 32-bit format.
  • The ISO is a function of how sensitive the sensor/film is to light. The exposure generated by a particular aperture, shutter speed, and sensitivity combination can be represented by its exposure value “EV”. Zero EV is defined by the combination of an aperture of f/1 and a shutter speed of 1s at ISO 100.
  • The term “exposure value” is used to represent shutter speed and aperture combinations only. An exposure value which takes into account the ISO sensitivity is called “Light Value” or LV and represents the luminance of the scene. For the sake of simplicity, as is the case in this patent, Light Value is often referred to as “exposure value”, grouping aperture, shutter speed and sensitivity in one familiar variable. This is because in a digital camera it is as easy to change sensitivity as it is to change aperture and shutter speed.
  • Each time the amount of light collected by the sensor is halved (e.g., by doubling the shutter speed or by halving the aperture area), the EV will increase by 1. For instance, 6 EV represents half the amount of light of 5 EV.
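  • As a worked illustration of this definition (the function name is assumed for illustration only):

    # Sketch: EV from aperture and shutter time; EV 0 is f/1 at 1 s (ISO 100).
    import math

    def exposure_value(f_number, shutter_s):
        return math.log2(f_number ** 2 / shutter_s)

    print(exposure_value(1.0, 1.0))    # 0.0  -- the EV 0 reference point
    print(exposure_value(1.0, 0.5))    # 1.0  -- halving the light raises EV by one stop
    print(exposure_value(2.8, 1/60))   # ~8.88 -- f/2.8 at 1/60 s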
  • Table 2 shows the additional variations possible for adjusting output intervals on top of the prismatic split, for +/−3EV, +/−2EV and +/−1EV.
  • TABLE 2
    output interval   sensor 1 (+1 EV in)   sensor 2 (standard EV in)   sensor 3 (−1 EV in)
    +/−3 EV           25 ISO                100 ISO                     400 ISO
    +/−2 EV           50 ISO                100 ISO                     200 ISO
    +/−1 EV           100 ISO               100 ISO                     100 ISO
  • The various exposure intervals can be modified or enhanced by using different combinations of prism splits with sensor sensitivity settings. This is accomplished by using differential sensor sensitivity (ISO) settings to amplify, or compress, the EV differences created by the prismatic split at the level of the sensors.
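  • A short sketch (names assumed for illustration) reproduces the ISO settings of Tables 2, 4 and 6: each sensor's ISO is scaled by one factor of two per stop of difference between its prism-induced EV offset and the desired output offset, using the convention that a positive offset denotes an underexposed image and that halving the ISO darkens the recorded image by one further stop.

    # Sketch: per-sensor ISO needed to map a prism-induced EV offset to a target output offset.
    def iso_for_offset(input_ev_offset, output_ev_offset, base_iso=100):
        return round(base_iso * 2 ** (input_ev_offset - output_ev_offset))

    # Reproduce Table 2 (+/-1 EV prism split) for output intervals of +/-3, +/-2 and +/-1 EV.
    for interval in (3, 2, 1):
        isos = [iso_for_offset(offset, sign * interval)
                for offset, sign in ((+1, +1), (0, 0), (-1, -1))]
        print(f"+/-{interval} EV output:", isos)   # [25, 100, 400], [50, 100, 200], [100, 100, 100]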
  • Table 3 shows the results for a system produced according to the invention as shown in FIGS. 1 and 2, but deploying a prism with two splits of light 104 and 105 corresponding to 76.1905% (16/21) of the incoming light followed by 20.00% (1/5) of the remaining light. This splits a captured image into temporally aligned exposures 106, 107 and 108 at levels of 76.1905%, 19.0476% and 4.7619%, respectively.
  • TABLE 3
    sensor         percent light   ratio light
    +2 EV          4.7619%         1/21
    standard EV    19.0476%        4/21
    −2 EV          76.1905%        16/21

    neutral film percent   neutral film ratio
    76.1905%               16/21
    20.0000%               1/5
  • Table 4 shows how variations to the exposure intervals can be achieved by combining the prism split of Table 3 with different sensor sensitivity settings, for output intervals of +/−3 EV, +/−2 EV and +/−1 EV. It lists the ISO setting used at each sensor to produce the alternative EV output intervals; these settings apply to the +/−2 EV input values of Table 3 only (the corresponding settings for +/−1 EV input values are given in Table 2).
  • TABLE 4
    output interval   sensor 1 (+2 EV in)   sensor 2 (standard EV in)   sensor 3 (−2 EV in)
    +/−3 EV           50 ISO                100 ISO                     200 ISO
    +/−2 EV           100 ISO               100 ISO                     100 ISO
    +/−1 EV           200 ISO               100 ISO                     50 ISO
  • Table 5 gives the results for a system produced according to the invention as depicted in FIGS. 1 and 2, but with a prism whose two splits of light 104 and 105 correspond to 87.6712% (64/73) of the incoming light followed by 11.11% (1/9) of the remaining light. This splits a captured image into temporally aligned exposures 106, 107 and 108 at levels of 87.6712%, 10.9589% and 1.3699%, respectively.
  • TABLE 5
    sensor         percent light   ratio light
    +3 EV          1.3699%         1/73
    standard EV    10.9589%        8/73
    −3 EV          87.6712%        64/73

    neutral film percent   neutral film ratio
    87.6712%               64/73
    11.1111%               1/9
  • Table 6 gives the settings for a system configured with the Table 5 percentages, showing how output intervals of +/−3 EV, +/−2 EV and +/−1 EV can be achieved by combining that prism split with different sensor sensitivity settings.
  • TABLE 6
    output interval   sensor 1 (+3 EV in)   sensor 2 (standard EV in)   sensor 3 (−3 EV in)
    +/−3 EV           100 ISO               100 ISO                     100 ISO
    +/−2 EV           200 ISO               100 ISO                     50 ISO
    +/−1 EV           400 ISO               100 ISO                     25 ISO
  • The system depicted in FIGS. 1 and 2 and in Tables 1 through 6 exemplifies the wide range of exposure levels that can be achieved, but these examples are not exhaustive by any means. Even more possibilities exist, including narrower or wider exposure ranges and other configurations of prism splits and sensor sensitivity settings.
  • Further, while the use of a three-way prism is demonstrated in FIGS. 1 and 2, other neutral prism configurations could be utilized. FIGS. 3 through 6 demonstrate configurations for two-way, three-way, four-way and five-way neutral prism configurations, respectively.
  • Use of different prism splits will be desirable for different applications. In a very minimal configuration a 2-way configuration could work (FIG. 3), although not as well for some applications. However, a two-way neutral prism likely represents the least expensive implementation of the device, and may likely be used in consumer versions of many products produced for the cost savings.
  • On the other hand, in some scientific or professional applications, the greater control from more elaborate splits possible from the four-way and five-way neutral prism splits shown in FIGS. 5 and 6 may be desired.
  • While this invention has been described in conjunction with the specific embodiments outlined above, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, the preferred embodiments of the invention, as set forth above, are intended to be illustrative, not limiting. Various changes may be made without departing from the spirit and scope of this invention.

Claims (28)

  1. An optical imaging system for temporally aligning bracketed exposures of a single image, said system comprising a light aperture, a prism and an image capturing device,
    wherein said prism is capable of splitting an incoming image from said light aperture into at least two color neutral, temporally aligned images,
    whereby said image capturing device captures said temporally aligned images at different levels of exposure.
  2. The system of claim 1 wherein said prism splits the intensity of said incoming image to achieve a desired EV output interval between said temporally aligned images.
  3. The system of claim 1 wherein said image capturing device further comprises image detection sensors for said temporally aligned images.
  4. The system of claim 1 wherein the ISO of said sensors is adjusted to achieve a desired EV output interval between said images.
  5. The system of claim 1 wherein said prism further comprises at least one neutral film coating.
  6. The system of claim 1 further comprising an image processing device connected to said image capturing device.
  7. The system of claim 1 wherein said image processing device comprises a computer processor.
  8. The system of claim 7 further comprising a tone-mapping processor.
  9. The system of claim 8 wherein the image processing device and tone-mapping processor are contained on a single integrated circuit.
  10. The system of claim 1 further comprising a digital readout monitor.
  11. The system of claim 1 further comprising a lens associated with said aperture.
  12. The system of claim 1 further comprising an eyepiece for viewing said incoming image.
  13. The system of claim 1 wherein said prism is capable of splitting said image into at least three temporally aligned images having different levels of exposure.
  14. The system of claim 13 wherein said three levels of exposure are about 14%, about 29% and about 57%, respectively, of the intensity of said incoming image.
  15. The system of claim 13 wherein said three levels of exposure are about 5%, about 19% and about 76%, respectively, of the intensity of said incoming image.
  16. The system of claim 13 wherein said three levels of exposure are about 1%, about 11% and about 88%, respectively, of the intensity of said incoming image.
  17. The system of claim 1 wherein said prism is capable of splitting said image into at least four temporally aligned images having different levels of exposure.
  18. The system of claim 1 wherein said prism is capable of splitting said image into at least five temporally aligned images having different levels of exposure.
  19. A method for temporally aligning bracketed exposures of a single image, said method comprising the steps of
    a) using a prism to split an incoming image from a light aperture into at least two temporally aligned images, and
    b) using an image capturing device to capture said temporally aligned images at different levels of exposure,
    wherein said prism produces a color neutral split of said temporally aligned images.
  20. The method of claim 19 wherein said prism splits the intensity of said incoming image to achieve a desired EV output interval between said temporally aligned images.
  21. The method of claim 19 wherein said image capturing device further comprises image detection sensors for said temporally aligned images.
  22. The method of claim 19 wherein the ISO of said sensors is adjusted to achieve a desired EV output interval between said images.
  23. The method of claim 19 wherein said prism is capable of splitting said image into at least three temporally aligned images having different levels of exposure.
  24. The method of claim 23 wherein said three levels of exposure are about 14%, about 29% and about 57%, respectively, of the intensity of said incoming image.
  25. The method of claim 23 wherein said three levels of exposure are about 5%, about 19% and about 76%, respectively, of the intensity of said incoming image.
  26. The method of claim 23 wherein said three levels of exposure are about 1%, about 11% and about 88%, respectively, of the intensity of said incoming image.
  27. The method of claim 19 wherein said prism is capable of splitting said image into at least four temporally aligned images having different levels of exposure.
  28. The method of claim 19 wherein said prism is capable of splitting said image into at least five temporally aligned images having different levels of exposure.
US12717765 2009-03-04 2010-03-04 Temporally Aligned Exposure Bracketing for High Dynamic Range Imaging Abandoned US20100225783A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15749409 true 2009-03-04 2009-03-04
US12717765 US20100225783A1 (en) 2009-03-04 2010-03-04 Temporally Aligned Exposure Bracketing for High Dynamic Range Imaging

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12717765 US20100225783A1 (en) 2009-03-04 2010-03-04 Temporally Aligned Exposure Bracketing for High Dynamic Range Imaging
US14514077 US20150029361A1 (en) 2009-03-04 2014-10-14 Temporally aligned exposure bracketing for high dynamic range imaging

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14514077 Continuation US20150029361A1 (en) 2009-03-04 2014-10-14 Temporally aligned exposure bracketing for high dynamic range imaging

Publications (1)

Publication Number Publication Date
US20100225783A1 true true US20100225783A1 (en) 2010-09-09

Family

ID=42677914

Family Applications (2)

Application Number Title Priority Date Filing Date
US12717765 Abandoned US20100225783A1 (en) 2009-03-04 2010-03-04 Temporally Aligned Exposure Bracketing for High Dynamic Range Imaging
US14514077 Pending US20150029361A1 (en) 2009-03-04 2014-10-14 Temporally aligned exposure bracketing for high dynamic range imaging

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14514077 Pending US20150029361A1 (en) 2009-03-04 2014-10-14 Temporally aligned exposure bracketing for high dynamic range imaging

Country Status (4)

Country Link
US (2) US20100225783A1 (en)
EP (1) EP2404209A4 (en)
KR (1) KR20120073159A (en)
WO (1) WO2010102135A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9979906B2 (en) 2016-08-03 2018-05-22 Waymo Llc Beam split extended dynamic range image capture system

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1198418A (en) * 1997-09-24 1999-04-09 Toyota Central Res & Dev Lab Inc Image pickup device
US6720997B1 (en) * 1997-12-26 2004-04-13 Minolta Co., Ltd. Image generating apparatus
US6529640B1 (en) * 1998-06-09 2003-03-04 Nikon Corporation Image processing apparatus
US7057659B1 (en) * 1999-07-08 2006-06-06 Olympus Corporation Image pickup device and image pickup optical system
JP2001157109A (en) * 1999-11-24 2001-06-08 Nikon Corp Electronic camera and recording medium for image data processing
EP1271935A1 (en) * 2001-06-29 2003-01-02 Kappa opto-electronics GmbH Apparatus for taking digital images with two simultaneously controlled image sensors
JP4687492B2 (en) * 2006-02-14 2011-05-25 株式会社ニコン Camera, imaging methods, the exposure calculation unit and program
US7949182B2 (en) * 2007-01-25 2011-05-24 Hewlett-Packard Development Company, L.P. Combining differently exposed images of the same object
US7978239B2 (en) * 2007-03-01 2011-07-12 Eastman Kodak Company Digital camera using multiple image sensors to provide improved temporal sampling
DE102007026337B4 (en) * 2007-06-06 2016-11-03 Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg digital camera
JP4960907B2 (en) * 2008-03-11 2012-06-27 富士フイルム株式会社 Imaging apparatus and imaging method

Patent Citations (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4672196A (en) * 1984-02-02 1987-06-09 Canino Lawrence S Method and apparatus for measuring properties of thin materials using polarized light
US5315384A (en) * 1990-10-30 1994-05-24 Simco/Ramic Corporation Color line scan video camera for inspection system
US5309243A (en) * 1992-06-10 1994-05-03 Eastman Kodak Company Method and apparatus for extending the dynamic range of an electronic imaging system
US6204881B1 (en) * 1993-10-10 2001-03-20 Canon Kabushiki Kaisha Image data processing apparatus which can combine a plurality of images at different exposures into an image with a wider dynamic range
US5801773A (en) * 1993-10-29 1998-09-01 Canon Kabushiki Kaisha Image data processing apparatus for processing combined image signals in order to extend dynamic range
US6864916B1 (en) * 1999-06-04 2005-03-08 The Trustees Of Columbia University In The City Of New York Apparatus and method for high dynamic range imaging using spatially varying exposures
US7084905B1 (en) * 2000-02-23 2006-08-01 The Trustees Of Columbia University In The City Of New York Method and apparatus for obtaining high dynamic range images
US20050041113A1 (en) * 2001-04-13 2005-02-24 Nayar Shree K. Method and apparatus for recording a sequence of images using a moving optical element
US20030174413A1 (en) * 2002-03-13 2003-09-18 Satoshi Yahagi Autofocus system
US20050275747A1 (en) * 2002-03-27 2005-12-15 Nayar Shree K Imaging method and system
US7158687B2 (en) * 2002-04-23 2007-01-02 Olympus Corporation Image combination device
US20080158400A1 (en) * 2002-06-26 2008-07-03 Pixim, Incorporated Digital Image Capture Having an Ultra-High Dynamic Range
US20040008267A1 (en) * 2002-07-11 2004-01-15 Eastman Kodak Company Method and apparatus for generating images used in extended range image composition
US20070126920A1 (en) * 2003-01-03 2007-06-07 Chulhee Lee Cameras capable of focus adjusting
US20070126919A1 (en) * 2003-01-03 2007-06-07 Chulhee Lee Cameras capable of providing multiple focus levels
US20070126918A1 (en) * 2003-01-03 2007-06-07 Chulhee Lee Cameras with multiple sensors
US20040130649A1 (en) * 2003-01-03 2004-07-08 Chulhee Lee Cameras
US7010174B2 (en) * 2003-04-29 2006-03-07 Microsoft Corporation System and process for generating high dynamic range video
US20060158462A1 (en) * 2003-11-14 2006-07-20 Microsoft Corporation High dynamic range image viewing on low dynamic range displays
US20050104900A1 (en) * 2003-11-14 2005-05-19 Microsoft Corporation High dynamic range image viewing on low dynamic range displays
US20060055894A1 (en) * 2004-09-08 2006-03-16 Seiko Epson Corporation Projector
US20060082692A1 (en) * 2004-10-15 2006-04-20 Seiko Epson Corporation Image display device and projector
US20060221209A1 (en) * 2005-03-29 2006-10-05 Mcguire Morgan Apparatus and method for acquiring and combining images of a scene with multiple optical characteristics at multiple resolutions
US20070229766A1 (en) * 2006-03-29 2007-10-04 Seiko Epson Corporation Modulation Apparatus and Projector
US20070177004A1 (en) * 2006-06-08 2007-08-02 Timo Kolehmainen Image creating method and imaging device
US20090213466A1 (en) * 2006-09-14 2009-08-27 3M Innovative Properties Company Beam splitter apparatus and system
US20080149812A1 (en) * 2006-12-12 2008-06-26 Brightside Technologies Inc. Hdr camera with multiple sensors
US20080158245A1 (en) * 2006-12-29 2008-07-03 Texas Instruments Incorporated High dynamic range display systems
US20080291289A1 (en) * 2007-02-20 2008-11-27 Seiko Epson Corporation Image pickup device, image pickup system, image pickup method, and image processing device

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9077910B2 (en) 2011-04-06 2015-07-07 Dolby Laboratories Licensing Corporation Multi-field CCD capture for HDR imaging
US9549123B2 (en) 2011-04-06 2017-01-17 Dolby Laboratories Licensing Corporation Multi-field CCD capture for HDR imaging
DE102012009151B4 (en) * 2012-05-08 2018-01-18 Steinbichler Optotechnik Gmbh Method and apparatus for detecting an intensity-modulated optical radiation field
DE102012009151A1 (en) * 2012-05-08 2013-11-14 Steinbichler Optotechnik Gmbh Method for detecting intensity-modulated optical radiation field by device, involves irradiating surface of object with optical radiation and generating intensity-modulated optical radiation field
US9615040B2 (en) 2012-06-15 2017-04-04 Microsoft Technology Licensing, Llc Determining a maximum inscribed size of a rectangle
US9245348B2 (en) 2012-06-15 2016-01-26 Microsoft Technology Licensing, Llc Determining a maximum inscribed size of a rectangle
WO2015196178A3 (en) * 2014-06-20 2016-02-25 Trackingpoint, Inc. Optical device having a light separation element
US20150369565A1 (en) * 2014-06-20 2015-12-24 Matthew Flint Kepler Optical Device Having a Light Separation Element
US20160205291A1 (en) * 2015-01-09 2016-07-14 PathPartner Technology Consulting Pvt. Ltd. System and Method for Minimizing Motion Artifacts During the Fusion of an Image Bracket Based On Preview Frame Analysis
US9948829B2 (en) * 2016-02-12 2018-04-17 Contrast, Inc. Color matching across multiple sensors in an optical system
US20170237890A1 (en) * 2016-02-12 2017-08-17 Contrast Optical Design & Engineering, Inc. Devices and Methods for High Dynamic Range Video
US20170237879A1 (en) * 2016-02-12 2017-08-17 Contrast Optical Design & Engineering, Inc. Color matching across multiple sensors in an optical system
CN106060351A (en) * 2016-06-29 2016-10-26 联想(北京)有限公司 Image processing device and image processing method

Also Published As

Publication number Publication date Type
US20150029361A1 (en) 2015-01-29 application
EP2404209A1 (en) 2012-01-11 application
EP2404209A4 (en) 2012-10-17 application
KR20120073159A (en) 2012-07-04 application
WO2010102135A1 (en) 2010-09-10 application
