GB2499668A - Exposure Controller - Google Patents

Exposure Controller

Info

Publication number
GB2499668A
GB2499668A
Authority
GB
United Kingdom
Prior art keywords
exposure
image
factor
pixel
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1203354.4A
Other versions
GB2499668B (en)
GB201203354D0 (en)
Inventor
Ilya Romanenko
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apical Ltd
Original Assignee
Apical Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apical Ltd
Priority to GB1203354.4A (granted as GB2499668B)
Publication of GB201203354D0
Priority to PCT/EP2013/053929 (published as WO2013127849A1)
Publication of GB2499668A
Priority to US14/469,479 (granted as US9584732B2)
Application granted
Publication of GB2499668B
Legal status: Expired - Fee Related

Classifications

    • H04N 23/743: Bracketing, i.e. taking a series of images with varying exposure conditions
    • H04N 23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N 23/71: Circuitry for evaluating the brightness variation
    • H04N 23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N 25/533: Control of the integration time by using differing integration times for different sensor regions

Abstract

An image sensor provides a first image with a first exposure and a second image with a second exposure. An exposure controller sets the first exposure in dependence on pixel intensities of at least one of the images. It sets the second exposure in dependence on a factor and the first exposure. The factor is determined in dependence on pixel intensities of at least one of the images.

Description

Exposure controller
Technical Field
The present invention relates to an exposure controller for an image sensor, a system including such a controller, a method for use in the controller and a computer program implementing the method.
Background
An exposure controller is disclosed in patent US 7,430,011 for controlling the exposure of a plurality of dark pixels and of a plurality of bright pixels in an image sensor. The controller sets the exposure of the dark and bright pixels independently by determining the number of dark pixels that have a signal level below a 'low' threshold and the number of bright pixels that have a signal level above a 'high' threshold, respectively. The two partial images obtained by the dark and bright pixels are combined by adding signal levels of the corresponding dark and bright pixels to one merged pixel in a merged image with a large dynamic range.
If the known setting of the exposures is applied to a high dynamic range capture system involving multiple images captured sequentially in time, or involving alternation of exposure line by line as the sensor is scanned, the merged image may have a low quality.
Summary
In accordance with the present invention, there is provided an exposure controller for controlling an exposure of a plurality of pixels of an image sensor, the image sensor providing a first image having a first exposure and a second image having a second exposure, the exposure controller being arranged to set the first exposure in dependence on pixel intensities of at least one of the images, to determine a factor in dependence on pixel intensities of at least one of the images, and to set the second exposure in dependence on the factor and the first exposure.

The inventor has discovered that the low quality of the merged image in the prior art is at least in part due to artefacts caused by the process of merging the two images, which artefacts increase with increasing difference between the first and
second exposure. In image sensors where the time periods over which the first and second image are captured are different, motion may cause artefacts. This problem of artefacts can be mitigated by a smooth evolution of the two exposures as the dynamic range of the scene to be imaged increases. The invention provides this smooth evolution by making the first and second exposure interdependent instead of setting them independently as in the prior art. The invention makes the second exposure dependent on the first exposure and a factor determined by the first and/or second image. An appropriate choice of the relation between the two exposures and the factor allows the smooth evolution, thereby reducing the artefacts.

The invention also relates to a system for capturing images including an image sensor and an exposure controller according to the invention.
A further aspect of the invention relates to a method of controlling an exposure of a plurality of pixels of an image sensor, the image sensor providing a first image having a first exposure and a second image having a second, different exposure, the method including the steps of setting the first exposure in dependence on pixel intensities of at least one of the images, determining a factor in dependence on pixel intensities of at least one of the images, and setting the second exposure in dependence on the factor and the first exposure.
The invention further relates to a computer program for controlling an exposure of a plurality of pixels of an image sensor adapted to perform the method according to the invention and to a data carrier including such a computer program.
Further features and advantages of the invention will become apparent from the following description of preferred embodiments of the invention, given by way of example only, which is made with reference to the accompanying drawings.
Brief Description of the Drawings
Figure 1 shows a system for capturing images;
Figure 2 shows schematically a procedure for setting exposures;
Figures 3a and 3b show time sequences of captured images; and
Figure 4 shows an example of a pixel intensity histogram and a target histogram.
Detailed Description
Figure 1 shows a system 1 for capturing images, e.g. a camera, including an electronic image sensor 2, an image processor 3 and an exposure controller 4. The image processor and the exposure controller may be integrated in one processor. An image of a scene is captured by the image sensor. The sensitivity of an image sensor to incident light is determined by the exposure, which sets the time period over which light is captured by each pixel of the image sensor and/or the gain applied by an amplifier to the electrical signal generated by the pixel. The captured image is transferred to the image processor as an array of pixel intensities. The exposure controller obtains data from the image processor, usually pixel intensities, derives an exposure from this data and transmits the exposure to the sensor for automatically setting the exposure of the sensor at a value optimum for the scene imaged onto the image sensor.
The dynamic range of an image refers to the ratio in pixel intensities between the brightest and darkest pixels within that image. Conventional image sensors capture images with dynamic ranges up to approximately 80 dB. Image sensor-based systems capable of capturing images with dynamic ranges significantly in excess of this value are typically referred to as high dynamic range.
The system in Figure 1 is suitable for high dynamic range image capture. It captures two or more images that may have different exposures, which are merged into a single image. The dynamic range of the captured images is usually smaller than the dynamic range of the merged image. The different exposures of the two or more images may be realised by different time periods over which light is captured by each pixel and/or the gain applied by an amplifier to the signal output by a pixel. Hence, a low/high exposure has a short/long time period and/or a low/high amplification. The different time periods for capturing the images may be successive in time or partially or completely overlapping.
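Purely as an illustration (not part of the original disclosure), the relationship between an exposure value, integration time and amplifier gain can be sketched as follows; the function name, the simple multiplicative model and the maximum-integration-time split are assumptions:

```python
def split_exposure(exposure, max_integration_time):
    """Illustrative split of an exposure value into integration time and gain.

    The exposure is modelled as integration_time * gain: as much of it as
    possible is realised as integration time, and the remainder is applied
    as amplifier gain.  (Sketch only; real sensors quantise both values and
    may prefer a different split.)
    """
    integration_time = min(exposure, max_integration_time)
    gain = exposure / integration_time if integration_time > 0 else 1.0
    return integration_time, gain

# Example: an exposure of 40 units with at most 10 units of integration time
# becomes 10 units of integration time and a gain of 4.
print(split_exposure(40.0, 10.0))  # (10.0, 4.0)
```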
In the embodiment described below a first image and a second image are captured sequentially in time. Alternatively, the first and second image may be captured within a single frame by means of alternating the exposure between each line or pair of lines to produce two images each of half the vertical resolution. The first image has a good contrast in the highlights of the scene and the second image has a
good contrast in the shadows of the scene. The two captured images are merged into one image, using a known merging method, as for example disclosed in "Being 'undigital' with digital cameras: Extending dynamic range by combining differently exposed pictures" by S. Mann and R. Picard in Proceedings of IS&T 46th annual conference, May 1995, pages 422-428. The merged images may form a video stream by combining the captured first and second image in each subsequent pair of frames into a series of merged images.
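As an illustration of the general idea of merging two differently exposed images (not the specific merging method referenced above), a minimal sketch might look as follows; the exposure-ratio normalisation and the clipping-based blending weight are assumptions:

```python
import numpy as np

def merge_exposures(low_img, high_img, h_rat, max_value=255.0):
    """Minimal sketch of merging a low-exposure and a high-exposure image.

    Pixel values of the high-exposure image are divided by the exposure
    ratio h_rat so both images are expressed in the radiometric scale of
    the low-exposure image; a per-pixel weight then favours the
    high-exposure image in the shadows and the low-exposure image in the
    highlights, where the high-exposure image clips.  This only illustrates
    the general idea, not the merging method of the patent.
    """
    low = low_img.astype(np.float64)
    high = high_img.astype(np.float64)
    # Weight falls towards 0 as the high-exposure pixel approaches clipping.
    w = np.clip(1.0 - high / max_value, 0.0, 1.0)
    merged = w * (high / h_rat) + (1.0 - w) * low
    return merged

# Example with two 2x2 test images and an exposure ratio of 4.
low = np.array([[10, 200], [50, 240]], dtype=np.float64)
high = np.array([[40, 255], [200, 255]], dtype=np.float64)
print(merge_exposures(low, high, h_rat=4.0))
```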
Figure 2 shows schematically a method for setting the exposure for the capture of the two images. The exposure of the first image is set using a first image captured by the image sensor 2 in step 10 of Figure 2. In step 11 the pixel data of the first image is transferred to the image processor 3. The exposure controller 4 generates pixel statistics of the pixel data in step 12. The exposure controller separates the pixel data into shadows, midtones and highlights in step 13. It uses this data to determine an exposure shift in step 14, which is used in step 15 to set the first exposure, i.e. the exposure for the first image. The exposure controller uses the data obtained in step 13 also to calculate an exposure factor in step 16. This factor is used in step 17 to set a second exposure for the second image in dependence on the first exposure. After capture of the second image in step 18, the pixel data are also transferred to the image processor 3 in step 19. The first and second image are combined by the image processor into a merged image in step 20.
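The control flow of Figure 2 can be summarised in a short sketch; all helper callables below are hypothetical placeholders for steps 10 to 20 (they are not defined by the patent), the multiplicative application of the exposure shift is an assumption, and the ordering loosely follows the timing of Figure 3b:

```python
def control_step(capture, zone_statistics, exposure_shift, exposure_factor,
                 merge, e_low):
    """One illustrative iteration of the exposure-control procedure of Figure 2."""
    first = capture(e_low)                       # step 10: capture first image
    stats = zone_statistics(first)               # steps 11-13: zone statistics
    e_low = e_low * exposure_shift(stats)        # steps 14-15: set first exposure
    h_rat = exposure_factor(stats)               # step 16: exposure factor
    e_high = e_low * h_rat                       # step 17: set second exposure
    second = capture(e_high)                     # step 18: capture second image
    return merge(first, second), e_low, e_high   # steps 19-20: merge

# Example with trivial stand-ins: a 'sensor' that returns the exposure value
# itself and statistics/shift/factor functions that are constants.
merged, e_low, e_high = control_step(
    capture=lambda e: e,
    zone_statistics=lambda img: img,
    exposure_shift=lambda s: 1.0,
    exposure_factor=lambda s: 4.0,
    merge=lambda a, b: (a + b) / 2,
    e_low=1.0,
)
print(e_low, e_high)  # 1.0 4.0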
Figure 3a shows a time sequence of first images 30 and 32 and second images 31 and 33, captured by the image sensor. Pixel data of the first image 30 is used to determine the exposure of the next first image 32 and the second image 33 directly following this first image 32, as indicated by the line 34. Figure 3b shows the same time sequence of images, but with a different exposure control as shown by line 35. The pixel data of the first image 30 is used to determine the exposure of the next second image 31 and the directly following first image 32.
The first exposure may be determined from the pixel data in a known manner, using a prior art automatic exposure control. However, preferably use is made of the division of the pixel data in step 13 into zones, such as shadows, midtones and highlights, which division is also used for determining the second exposure. The first exposure is based on a weighted average of a distribution of the pixel intensities.
To this end an intensity histogram having a number of zones is defined, i.e. a distribution of pixel intensities over intensity zones. Each zone has an associated weighting factor for weighting the pixel population in the zone. The division of the intensity histogram into a number of zones, each with a zone weighting factor, allows weighting the intensity distribution, for example towards shadows, midtones or highlights.
In step 14 the following balance brightness H_bal of the image is determined using:

H_bal = C_C + ( Σ_{i=1}^{N} (C_i − C_C) Z_i T_i ) / ( Σ_{i=1}^{N} Z_i T_i )

wherein N is the number of zones along a pixel intensity scale, C_C is the central intensity, C_i is the centre value of each zone, Z_i is the number of pixels in each zone and T_i is a weighting factor, which can also be considered as the target zone population.
A value of N equal to 3 has been shown to provide a stable method for setting the exposures. In this case the three zones are called 'shadows', 'midtones' and 'highlights'. The number of zones is preferably larger than 3, even substantially larger. In this case, the weighting factors can be divided into multiple regions, e.g. three regions, which can be called 'shadows', 'midtones' and 'highlights'; the weighting factor T_i can be constant for each zone i within a region. Although the described embodiments of the invention have zones and/or regions that are contiguous in intensity, they may cover non-contiguous intensity intervals.
The top part of Figure 4 shows an example of a pixel intensity histogram showing the population P as a function of pixel intensity and having 14 zones divided into three regions S (shadows), M (midtones) and H (highlights). The lower part of Figure 4 shows an example of a target zone population T as a function of pixel intensity, with N = 14 and T_S = 0.5 P, T_M = 0.45 P, T_H = 0.05 P, C_C = 7.5, C_1 = 2.5, C_2 = 7.5 and C_3 = 12.5. If all pixels are symmetrically distributed in the middle region, H_bal = 7.5. In the target population half of the pixels have intensities that fall in the S region, i.e. in the shadows, and only 5 % of the pixels fall in the highlights. The weighting reduces the number of pixels in highlights relative to the total number of
pixels. Hence, the target population relates to a low exposure, giving a good contrast in the highlights of the scene. It preserves highlights in the case of high dynamic range scenes but for low dynamic range scenes it sets a balanced exposure wherein the majority of the pixels are close to the target exposure.
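A minimal sketch of the balance-brightness computation, assuming the weighted-average reading of the formula above and the 14-zone, three-region target populations of the Figure 4 example; the exact zone-to-region split and the 0-255 intensity scale are assumptions:

```python
import numpy as np

def balance_brightness(pixels, n_zones=14, t_s=0.5, t_m=0.45, t_h=0.05,
                       max_intensity=255.0):
    """Sketch of the balance brightness H_bal as a weighted average.

    The intensity range is split into n_zones equal zones, grouped into
    shadows / midtones / highlights regions with the target populations
    T_S, T_M, T_H of the Figure 4 example; each zone i gets the constant
    weight of its region.  H_bal is then the T_i*Z_i-weighted average of
    the zone centres C_i (the weighted-average reading of the formula in
    the text, which is an assumption, not a verbatim reproduction).
    """
    zone_edges = np.linspace(0.0, max_intensity, n_zones + 1)
    centres = 0.5 * (zone_edges[:-1] + zone_edges[1:])   # C_i
    z, _ = np.histogram(pixels, bins=zone_edges)         # Z_i
    # Roughly the first third of zones as shadows, the last third as highlights.
    n_s = n_h = n_zones // 3
    t = np.empty(n_zones)
    t[:n_s] = t_s
    t[n_s:n_zones - n_h] = t_m
    t[n_zones - n_h:] = t_h                               # T_i
    w = z * t
    return float((centres * w).sum() / max(w.sum(), 1e-12))

# Example: pixels concentrated in the midtones give H_bal near the centre of
# the intensity range.
rng = np.random.default_rng(0)
mid_pixels = rng.normal(128, 10, size=10000).clip(0, 255)
print(balance_brightness(mid_pixels))  # close to 128
```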
To obtain the first, low exposure, the difference brightness H_Δ is determined from

H_Δ = H_bal − H_target

where H_target is the target brightness that is input to the auto-exposure algorithm for determining the first exposure. H_target is commonly set around the middle of the intensity range. H_target = 7 in the example of Figure 4; hence, when H_bal = 7, the first exposure will not change. The difference brightness is used to determine the direction of change of the first exposure. If H_Δ < 0, the first exposure E_L should be increased by an amount proportional to the magnitude of H_Δ. If H_Δ > 0, the first exposure should be decreased by an amount proportional to its magnitude. The change in exposure, or exposure shift, is determined in step 14 of Figure 2. The auto-exposure converges to H_target when H_Δ is equal to or close to zero.
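A minimal sketch of the resulting first-exposure update; the proportional gain, the exponential form of the step and the clamping limits are assumptions, only the sign behaviour follows the text:

```python
def update_first_exposure(e_low, h_bal, h_target, k=0.05,
                          e_min=1e-4, e_max=1e3):
    """Sketch of the first-exposure update driven by H_delta = H_bal - H_target.

    If H_delta < 0 the image balance is too dark and the exposure is raised;
    if H_delta > 0 it is lowered; the step grows with |H_delta|.  The gain k,
    the exponential update form and the clamping limits are illustrative
    assumptions, not values from the patent.
    """
    h_delta = h_bal - h_target
    e_new = e_low * (2.0 ** (-k * h_delta))  # smooth, sign-correct update
    return min(max(e_new, e_min), e_max)

# Example: a too-dark balance (H_bal = 5 vs. H_target = 7) raises the exposure.
print(update_first_exposure(1.0, h_bal=5.0, h_target=7.0))  # > 1.0
```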
The first exposure may alternatively be determined by controlling the number of pixels clipped at their maximum intensity. The number of clipped pixels may be controlled to be lower than a pre-determined threshold. This can be implemented by setting H_Δ > 0 if this number exceeds a threshold value and H_Δ < 0 otherwise. The stability of convergence of the exposure is better for the above method where H_Δ is based on the balance brightness.
The second exposure is set in dependence on the first exposure and a factor depending on the pixel intensities of the first image and/or second image. The factor may be an additive factor, such that the second exposure is equal to the first exposure plus the factor. Preferably, the factor is a multiplicative factor and the second exposure is proportional to the first exposure and the factor. In a preferred embodiment the second exposure is a high exposure E_H that is related to the first, low exposure E_L:
E_H = E_L · H_rat
where the factor H_rat is a brightness ratio that may be derived from the intensity histogram of the first image. When the factor H_rat increases with increasing dynamic range of the scene, the second exposure increases smoothly with respect to the first exposure. The factor preferably has a lower limit of one.
The factor preferably depends on a pixel population of shadows and highlights relative to a pixel population of midtones of at least one of the images. In this case the factor is a measure of the population of shadows and highlights, and therewith a measure of the dynamic range of the scene imaged. The relation between the factor and the dynamic range of the scene permits a control of the first and second exposure in dependence on the dynamic range of the scene.
In a special embodiment, the factor or brightness ratio is based on a sum of a fixed constant and a weighted average of the pixel populations. The value of H_rat changes smoothly from a low dynamic range scene with the intensity population concentrated in the midtones to a high dynamic range scene having shadows and highlights. This provides a smooth evolution of the exposures from a single exposure to two different exposures when the dynamic range of the scene increases. In a particular embodiment H_rat is given by
H_rat = ( Σ_{i ∈ S,H} Z_i T_i + R ) / ( Σ_{i ∈ M} Z_i T_i + R )
where R is a constant parameter, which controls the sensitivity of the brightness ratio to differences in zone populations, i.e. to the dynamic range of the scene. If R is large, H_rat becomes small for any image. The value of R is sensor dependent and is determined empirically; a typical value is 1/8. The target zone populations are preferably the same as used in the determination of the first exposure. The minimum value of H_rat is preferably set to unity, such that H_rat is larger than or equal to unity.
For a low dynamic range scene, the weighted populations in zones in shadows and highlights will be small compared to the weighted population in midtones, such that H_rat is small and E_H ≈ E_L. However, for a high dynamic range scene, where the weighted populations of zones in shadow and highlights regions significantly exceed that of midtones, H_rat is large and E_H > E_L. The second image will show a good contrast in the shadows.
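A minimal sketch of the factor H_rat computed from zone populations, following the form given above with the floor at unity; the grouping of zones into regions is described only qualitatively in the text, so the index sets supplied by the caller are an assumption:

```python
import numpy as np

def brightness_ratio(z, t, shadow_zones, midtone_zones, highlight_zones, r=0.125):
    """Sketch of the factor H_rat from zone populations Z_i and weights T_i.

    The T_i*Z_i-weighted population of the shadow and highlight zones plus R
    is divided by the weighted population of the midtone zones plus R, and
    the result is floored at unity.
    """
    z = np.asarray(z, dtype=float)
    t = np.asarray(t, dtype=float)
    w = z * t
    num = w[list(shadow_zones) + list(highlight_zones)].sum() + r
    den = w[list(midtone_zones)].sum() + r
    return float(max(num / den, 1.0))

# Example: a scene with all pixels in the midtones gives H_rat = 1, while a
# bimodal (high dynamic range) scene gives a ratio well above 1.
t = np.array([0.5] * 5 + [0.45] * 5 + [0.05] * 4)     # Figure 4 target weights
flat = np.array([0] * 5 + [2000] * 5 + [0] * 4)       # Z_i, midtones only
bimodal = np.array([3000] * 5 + [100] * 5 + [500] * 4)
s, m, h = range(0, 5), range(5, 10), range(10, 14)
print(brightness_ratio(flat, t, s, m, h))     # 1.0
print(brightness_ratio(bimodal, t, s, m, h))  # about 34
```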
In an alternative embodiment, the above equation for H_rat is changed by taking the sum in the numerator over all zones instead of only over the shadows and highlights zones. The form given above, with the numerator restricted to the shadows and highlights zones, provides better stability near H_rat = 1.
In a preferred embodiment of the method a threshold H_max is defined, such that H'_rat = min(H_rat, H_max). If the brightness ratio between the first and second image is larger than an image-sensor-dependent value, the merging of the first and second image will result in an image having a degraded quality. A typical value for H_max is 8 or 16.
When the brightness ratio becomes relatively large, e.g. larger than 8 or 16, the quality of the merged image will be improved when the merged image is a combination of three or more images instead of only the first and second image. The exposure of the one or more intermediate images lies between the first and second exposure. In this method the first and second exposure are determined as in the above embodiments. If H_rat is less than the threshold H_max, no additional images are captured. If H_rat is larger than H_max, a middle exposure is determined, e.g. as
E_M = E_L + ( (H_rat − H_max) / (2 H_rat) ) · (E_H − E_L)
The middle exposure will lie midway between the first and second exposure when the brightness ratio is large and lie closer to the low exposure as the brightness ratio becomes smaller.
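A minimal sketch of deriving the full exposure set from E_L, H_rat and H_max; note that the middle-exposure interpolation reproduces the reconstructed formula above and should therefore be treated as an assumption:

```python
def exposure_set(e_low, h_rat, h_max=16.0):
    """Sketch of choosing two or three exposures from E_L, H_rat and H_max.

    The high exposure uses the clipped ratio H'_rat = min(H_rat, H_max).
    If H_rat exceeds H_max a middle exposure is added, interpolated so that
    it approaches the midpoint of E_L and E_H for very large H_rat and
    approaches E_L as H_rat falls back towards H_max (an assumed reading of
    the formula in the text).
    """
    h_rat_clipped = min(h_rat, h_max)
    e_high = e_low * h_rat_clipped
    if h_rat <= h_max:
        return [e_low, e_high]
    e_mid = e_low + (h_rat - h_max) / (2.0 * h_rat) * (e_high - e_low)
    return [e_low, e_mid, e_high]

# Examples: a moderate ratio gives two exposures; a very large ratio adds a
# middle exposure tending towards the midpoint of E_L and E_H.
print(exposure_set(1.0, h_rat=8.0))   # [1.0, 8.0]
print(exposure_set(1.0, h_rat=64.0))  # [1.0, ~6.6, 16.0]
```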
In the above described embodiments, the pixel data for determining the first and second exposure is taken from the first image, i.e. the low exposure image. Alternatively, the pixel data may be taken from only the high exposure image. The pixel data may also be taken from a combination of the first and second image or from the merged image, which does not have detrimental effects caused by any clipping at the maximum intensity of the high exposure image. However, use of the low exposure image only is computationally easier than the alternatives.
The embodiments of the method can be implemented in a computer program. The computer program may be stored in a memory of the exposure controller 4 in Figure 1.
The above embodiments are to be understood as illustrative examples of the invention. Further embodiments of the invention are envisaged. It is to be understood
that any feature described in relation to any one embodiment may be used alone, or in combination with other features described, and may also be used in combination with one or more features of any other of the embodiments, or any combination of any other of the embodiments. Furthermore, equivalents and modifications not described above may also be employed without departing from the scope of the invention, which is defined in the accompanying claims.

Claims

1. An exposure controller for controlling an exposure of a plurality of pixels of an image sensor, the image sensor providing a first image having a first exposure and a second image having a second exposure, the exposure controller being arranged to set the first exposure in dependence on pixel intensities of at least one of the images, to determine a factor in dependence on pixel intensities of at least one of the images, and to set the second exposure in dependence on the factor and the first exposure.

2. An exposure controller according to claim 1, wherein the factor depends on a pixel population of shadows and highlights relative to a pixel population of midtones.

3. An exposure controller according to claim 1 or 2, wherein the factor is multiplicative and the second exposure is proportional to the first exposure and the factor.

4. An exposure controller according to claim 1, 2 or 3, wherein the factor and/or the first exposure is derived from an intensity histogram having a number of zones, each zone having a weighting factor for weighting the pixel population in the zone.

5. An exposure controller according to claim 4, wherein the weighting reduces the number of pixels in highlights relative to the total number of pixels.

6. An exposure controller according to claim 4 or 5, wherein the factor is based on a sum of a fixed constant and a weighted average of the pixel populations.

7. An exposure controller according to any one of claims 1 to 6, wherein the first exposure is based on a weighted average of a distribution of the pixel intensities.

8. An exposure controller according to any one of claims 1 to 7, wherein the first exposure is a low exposure having a number of pixels clipped at a maximum intensity, the number being lower than a pre-determined threshold.

9. An exposure controller according to any one of claims 1 to 8, wherein the image sensor is arranged to provide a third image having a third exposure, the third exposure being between the first exposure and the second exposure.

10. A system for capturing images including an image sensor and an exposure controller according to any one of claims 1 to 9.

11. A method of controlling an exposure of a plurality of pixels of an image sensor, the image sensor providing a first image having a first exposure and a second image having a second, different exposure, the method including the step of setting the first exposure in dependence on pixel intensities of at least one of the images, determining a factor in dependence on pixel intensities of at least one of the images, and setting the second exposure in dependence on the factor and the first exposure.

12. A method according to claim 11, wherein the factor depends on a pixel population of shadows and highlights relative to a pixel population of midtones.

13. A method according to claim 11 or 12, wherein the factor is a ratio and the second exposure is proportional to the first exposure and the ratio.

14. A method according to claim 11, 12 or 13, including the step of deriving the factor and/or the first exposure from an intensity histogram having a number of zones, each zone having a weighting factor for weighting the pixel population in the zone.

15. A computer program for controlling an exposure of a plurality of pixels of an image sensor adapted to perform the method of any of claims 10 to 14.
GB1203354.4A 2012-02-27 2012-02-27 Exposure controller Expired - Fee Related GB2499668B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB1203354.4A GB2499668B (en) 2012-02-27 2012-02-27 Exposure controller
PCT/EP2013/053929 WO2013127849A1 (en) 2012-02-27 2013-02-27 Exposure controller
US14/469,479 US9584732B2 (en) 2012-02-27 2014-08-26 Exposure controller

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1203354.4A GB2499668B (en) 2012-02-27 2012-02-27 Exposure controller

Publications (3)

Publication Number Publication Date
GB201203354D0 GB201203354D0 (en) 2012-04-11
GB2499668A true GB2499668A (en) 2013-08-28
GB2499668B GB2499668B (en) 2019-03-06

Family

ID=45991779

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1203354.4A Expired - Fee Related GB2499668B (en) 2012-02-27 2012-02-27 Exposure controller

Country Status (3)

Country Link
US (1) US9584732B2 (en)
GB (1) GB2499668B (en)
WO (1) WO2013127849A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9436870B1 (en) 2014-06-06 2016-09-06 Amazon Technologies, Inc. Automatic camera selection for head tracking using exposure control
US9131150B1 (en) * 2014-06-06 2015-09-08 Amazon Technologies, Inc. Automatic exposure control and illumination for head tracking
US10311599B2 (en) * 2016-11-03 2019-06-04 Caterpillar Inc. System and method for diagnosis of lighting system
GB2569593B (en) * 2017-12-20 2021-05-26 Apical Ltd Exposure ratio control
EP3964035A1 (en) * 2019-04-30 2022-03-09 Signify Holding B.V. Camera-based lighting control
CN115550556B (en) * 2021-06-25 2023-10-24 荣耀终端有限公司 Exposure intensity adjusting method and related device
US11595589B2 (en) 2021-07-22 2023-02-28 Arthrex, Inc. Surgical camera system with high dynamic range

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1255410A2 (en) * 2001-05-02 2002-11-06 Agilent Technologies, Inc. (a Delaware corporation) System and method for capturing color images that extends the dynamic range of an image sensor
US20070285526A1 (en) * 2006-05-31 2007-12-13 Ess Technology, Inc. CMOS imager system with interleaved readout for providing an image with increased dynamic range
WO2009029810A1 (en) * 2007-08-31 2009-03-05 Historx, Inc. Automatic exposure time selection for imaging tissue
GB2464574A (en) * 2008-10-27 2010-04-28 Huawei Tech Co Ltd Combining multiple images to enhance dynamic range

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6879731B2 (en) * 2003-04-29 2005-04-12 Microsoft Corporation System and process for generating high dynamic range video
US8446480B2 (en) * 2006-12-20 2013-05-21 Nokia Corporation Exposure control based on image sensor cost function
JP4127411B1 (en) * 2007-04-13 2008-07-30 キヤノン株式会社 Image processing apparatus and method
BRPI0814365A2 (en) * 2007-07-25 2018-07-31 Candela Microsystems S Pte Ltd exposure control of an imaging system
US8339475B2 (en) * 2008-12-19 2012-12-25 Qualcomm Incorporated High dynamic range image combining
JP2011091584A (en) * 2009-10-21 2011-05-06 Seiko Epson Corp Imaging device, imaging method and electronic equipment
KR101633460B1 (en) * 2009-10-21 2016-06-24 삼성전자주식회사 Method and Apparatus for controlling multi-exposure


Also Published As

Publication number Publication date
WO2013127849A1 (en) 2013-09-06
GB2499668B (en) 2019-03-06
US20140362282A1 (en) 2014-12-11
GB201203354D0 (en) 2012-04-11
US9584732B2 (en) 2017-02-28

Similar Documents

Publication Publication Date Title
US9584732B2 (en) Exposure controller
US9124811B2 (en) Apparatus and method for processing image by wide dynamic range process
US10063826B2 (en) Image processing apparatus and image processing method thereof
JP4622629B2 (en) Imaging device
KR101247646B1 (en) Image combining apparatus, image combining method and recording medium
US9350905B2 (en) Image signal processing apparatus, image signal processing method, and image capturing apparatus
US20100231748A1 (en) Imaging device
US11838649B2 (en) Image capturing device and control method thereof and medium
JP2008005081A (en) Authentication apparatus
JP3478452B2 (en) Backlight detection method, backlight detection device, and imaging camera
US10108878B2 (en) Image processing apparatus, image processing method, and storage medium for tone control of each object region in an image
JP5149055B2 (en) Imaging device
US9191573B2 (en) Image capturing apparatus for determining an exposure condition by calculating aphotmetric value for each object region and method of controlling the same
US20220021800A1 (en) Image capturing apparatus, method of controlling image capturing apparatus, and storage medium
KR20110125154A (en) Apparatus and method for auto adjusting brightness of image taking device
KR20160030350A (en) Apparatus for processing image and method for processing image
JP4629002B2 (en) Imaging device
JP5257487B2 (en) Imaging apparatus, imaging method, and program
JP5520863B2 (en) Image signal processing device
JP2015037222A (en) Image processing apparatus, imaging apparatus, control method, and program
US11012630B1 (en) Image processor and image processing method
JP2010271507A (en) Imaging device, exposure adjustment method, and program
JP4811494B2 (en) Imaging device
JP2009177669A (en) Imaging device
JP2004120203A (en) Imaging unit, and automatic exposure processing method

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20220929 AND 20221005

PCNP Patent ceased through non-payment of renewal fee

Effective date: 20230227