WO2010002379A1 - Digital camera control system - Google Patents

Digital camera control system

Info

Publication number
WO2010002379A1
WO2010002379A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
digital image
plate
dynamic range
image
Prior art date
Application number
PCT/US2008/068719
Other languages
French (fr)
Inventor
James F. Alves
Original Assignee
Alves James F
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alves James F filed Critical Alves James F
Priority to PCT/US2008/068719 priority Critical patent/WO2010002379A1/en
Publication of WO2010002379A1 publication Critical patent/WO2010002379A1/en

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/003 Apparatus for photographing CRT-screens
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Definitions

  • the present invention relates to a control system to optimize imaging parameters of a digital video camera.
  • High-speed digital cameras have become commonly used in toll collection and traffic law enforcement applications. These cameras must acquire images of fast-moving objects with an image quality sufficient to identify the object(s) of interest.
  • the objective is often to be able to identify the vehicle by clearly imaging the vehicle's license plate.
  • both the plate's alpha-numerics and the state name must be legible in the image.
  • the larger alpha-numerics are often, but not always, block print in a uniform color on a uniform or nearly uniform background.
  • the state names, however, are often in much smaller font sizes and printed in different colors in various scripts.
  • license plates are designed more for aesthetics than to maximize legibility, especially on those plates sporting multi-colored and/or scenic backgrounds.
  • In order to recognize characters, a camera system must provide sufficient contrast between the lettering and the background.
  • White license plates illuminated by direct sunlight (or strong reflections of nighttime lighting) generally yield sufficient video image contrast to easily recognize characters; however, the illumination onto, and therefore the radiance from, such plates may easily be high enough to saturate the camera's sensor. In these cases the gain of the camera needs to be low enough to prevent image saturation. Portions of license plates that are in shadow (or under low nighttime illumination conditions), however, often lack enough video image contrast to support character recognition. In these cases the gain of the camera needs to be increased to create the required contrast for plate legibility. Imaging license plates therefore requires a means to allocate the dynamic range of the imaging sensor to simultaneously address both low light levels requiring increased gain and high light levels requiring decreased gain.
  • The camera system requires a field of view (or multiple cameras) observing at least one lane width, sufficient grey-scale/color contrast resolution to separate characters and state names from their background colors, and sufficient dynamic range to prevent saturating the image whether the plate is in direct sunlight, in shadow, or under artificial illumination.
  • For vehicles traveling at freeway speeds, the camera has only a fraction of a second to capture an image. License plates on vehicles may pass through the camera's field of view in ¼ second, so there is little time for adjusting camera gain while the vehicle's license plate is in the camera's field of view.
  • the first portion of the vehicle to appear in the camera's field of view may not be indicative of the irradiance levels seen at the end of the vehicle, so initial adjustments may not be optimal.
  • Ideally, the camera should be adjusted to take a good image of the vehicle and its license plate prior to either appearing in the camera's field of view.
  • Prior art gain control systems have relied upon using an external sensor to measure plate lighting conditions and define exposure settings.
  • Pixel-value-based control systems typically take several images in rapid succession at various gain settings to try to ensure that at least one results in a good image of the license plate. This may be difficult or impossible to achieve if the vehicle is moving at high speed, and it adds to the storage requirements of the system since multiple images must be saved until a determination is made as to which image is best for plate legibility.
  • a camera control system is described that meets the demanding needs of high speed traffic imaging applications.
  • the system relies upon environmental, geographic, illumination and electronic sensor response models that provide optimized settings for a given geographic location, camera direction, time of day, time of year and wide range of weather conditions.
  • the model uses environmental parameters coupled with detailed illumination models to preset the camera for the conditions at the particular time, location and orientation at which imaging occurs.
  • the controller is able to set optimized image sensor parameters a priori. There is no need to continuously take images to sense the possible range of vehicle and plate radiance levels or read radiance levels from an external light sensor.
  • the camera is always ready for the particular camera location, direction, environmental factors and particular lighting conditions.
  • the control system includes a sun direction calculation to determine the azimuth and elevation of the sun with respect to the object being imaged for any time of day and any time of year.
  • a solar irradiance model determines the light hitting the object directly. Lighting of the object during daytime comes not just from direct solar light but also from reflected light. Solar light is scattered by the atmosphere, reflected off the earth and back again from the atmosphere thereby increasing the total illumination. Additionally light is reflected off road surfaces.
  • the models further include shadowing of the object by nearby stationary objects as well as potential shadowing by the vehicles' structure itself. The sum of the models provides an estimate of the total irradiance potentially available to the object.
  • the models further include an accounting of the change in object irradiance caused by the tilt of the object with respect to the sources of illumination.
  • the plate is assumed to be a nearly vertical surface oriented along the direction of traffic flow.
  • the reflective response is modeled.
  • An optical imaging model then is used to calculate the light irradiance that would be imaged upon a sensor at the image plane of the camera. The relative geometric arrangement of the optics, sensor, object and various sources of illumination are accounted for to provide a calculated peak possible and lowest expected irradiance onto the sensor plane.
  • the optical model for irradiance onto the sensor is then mated with a response model for the camera image generation process.
  • the response of each pixel in the image can be adequately modeled as an offset plus a proportionality constant times the product of the magnitude of the irradiance illuminating it, an integration time, and an electronics gain.
  • the offset and proportionality constant are camera specific values that are measured during a calibration step prior to the camera being installed and utilized by the control algorithm to predict image values based on the predicted irradiance values onto the sensor.
  • the control algorithm uses the predicted sensor irradiance values to establish the image acquisition settings for the sensor to prevent saturation of the pixels that could correspond to license plate objects and to create an acceptable level of plate information contrast to ensure legibility.
  • the settings are changed during the day to account for changing lighting conditions as the sun advances across the sky.
  • the settings include integration times, pixel charge accumulation reset levels, and the particular times at which these reset levels should occur during the image acquisition integration cycle.
  • This invention describes a single reset time and level, which results in a dual-slope pixel response to incoming irradiance levels, but multiple reset times and levels could also be employed by a simple extension of the technique described herein.
  • a dual-slope implementation results in a higher gain for plate objects acquired under low lighting conditions and a lower gain for plate objects acquired above some light level set by the selection of the pixel reset level and reset time.
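The dual-slope behavior just described can be sketched in a few lines of Python. This is an illustrative reconstruction, not the patent's circuitry: the gain constant k and saturation level v_sat are assumed values, and the conditional reset is modeled as a simple clip at the reset level Vd-s applied at time Td-s.

```python
def dual_slope_response(irradiance, T, T_ds, V_ds, k=1.0, v_sat=1.0):
    """Sketch of a dual-slope pixel response.

    A pixel integrates voltage at a rate proportional to irradiance.  At
    the double-slope reset time T_ds, any accumulated voltage above the
    reset level V_ds is clipped back to V_ds; integration then continues
    for the remaining T - T_ds.  Bright pixels therefore see a reduced
    effective gain, while dim pixels are unaffected.  k and v_sat are
    illustrative constants, not values from the patent.
    """
    v = min(k * irradiance * T_ds, v_sat)            # first-slope segment
    v = min(v, V_ds)                                 # conditional reset at T_ds
    v = min(v + k * irradiance * (T - T_ds), v_sat)  # second-slope segment
    return v
```

A dim pixel (one that never exceeds Vd-s by the reset time) integrates exactly as it would without the reset, while a bright pixel is clipped back and finishes on the lower-gain second slope, which is the dynamic-range extension the text describes.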
  • the invention results in an imaging system that anticipates and corrects for changing lighting conditions without the need for external sensors and/or taking multiple images at different gain settings per license plate passage.
  • the resulting imaging system responds faster, costs less, lowers image storage requirements, and is more reliable than the prior art, in that it anticipates illumination settings prior to the actual image acquisition step and it eliminates extraneous components such as external image sensors.
  • Figure 1 is a block diagram of an image acquisition system in which the invention may be practiced.
  • Figures 2 and 3 are block diagram charts of various embodiments of the invention.
  • Figure 4 is a diagram of the coordinate system used for the illumination model embodiments of the invention.
  • Figure 5 depicts the irradiance model for lighting of a license plate.
  • Figure 6 depicts a road surface shadowing model embodiment accounting for the road shadowing from the vehicle carrying the plate, adjacent vehicles and nearby structures.
  • Figure 7 depicts a road shadowing model embodiment accounting for the road shadowing from a canopy.
  • Figure 8 depicts a plate radiance model embodiment for a beam illumination.
  • Figure 9 depicts a plate radiance model embodiment for diffuse illumination.
  • Figure 10 depicts a plate radiance mode embodiment for various sources of illumination.
  • Figure 11 depicts a plate radiance model embodiment further showing camera placement geometry.
  • Figure 12 depicts the geometric arrangement related to an image formation model embodiment of the invention.
  • Figure 13 depicts a camera response model embodiment of the invention.
  • Figure 14 shows the camera transfer function.
  • Figure 15 shows the pixel voltage output versus integration time.
  • Figure 16 shows the double-slope extended dynamic range embodiment for pixel output.
  • Figure 17 shows the pixel output voltage versus luminance and a contrast control algorithm embodiment of the invention.
  • Figure 18 depicts an image histogram used in a contrast control embodiment.
  • Figure 19 shows sensor output in 8 bit counts versus luminance and the contrast control algorithm embodiment factors.
  • Figure 20 shows a graph of sensor output voltage and luminance and a second graph of sensor voltage output versus integration time and a mapping between the two of factors related to a contrast control algorithm embodiment.
  • Figure 21 shows an exemplary acquired image and factors affecting contrast control and imaging requirements.
  • Figure 22 is a graph showing sensor voltage output versus luminance and factors related to worst case contrast requirements for imaging.
  • the present invention is a set of models and algorithms for estimating irradiance of an object to be imaged as a function of environmental factors, the resulting radiance of the object and irradiance upon an imaging sensor and control algorithms for the sensor based upon the environmental factors and particular imaging requirements.
  • the requirements are contrast sufficient to recognize characters upon a vehicle license plate tag.
  • the license plate features of interest are the background of the license plate (white being the brightest possible plate background), the numeric characters of the plate (typically larger and black), and the state or country designation lettering.
  • the imaging requirements are to have sufficient contrast between the license plate background and both the numeric characters and the state designation characters.
  • Figure 1 depicts a camera typical of those with which the invention may be practiced.
  • the camera consists of a filter 101 passing near-UV through near-IR wavelengths, upon which the light radiating from the intended object impinges.
  • the light will be composed of the license plate background irradiance, the plate character irradiance and the state name irradiance.
  • Light is focused upon an image sensor 102 using lens optical elements known in the art and not shown.
  • the signal from the sensor is a sum of the sensor response plus noise 103 from the sensor electronics. Control parameters for the sensor include the integration time and the reference black voltage level.
  • a dynamic range enhancement double-slope embodiment of the invention includes a double-slope reset voltage (Vd-s) and double-slope reset time (Td-s).
  • the double-slope reset time occurs at a time shorter than the total integration time (T) for the image acquisition, and the reset voltage is an appropriately selected intermediate value between the black reference voltage and the saturation level voltage for the sensor.
  • the signal is then fed into an amplifier 104, through a low pass filter 105 and then into analog to digital converter (ADC) 107.
  • Noise 106 is added to the signal by the amplifier and filter circuitry.
  • In some embodiments, portions of the sensor are connected to two or more ADCs. Although this aids in high-speed image acquisition and processing, it can also lead to a channel mismatch between the multiple ADCs.
  • Embodiments of the invention provide for matching the output of multiple ADCs.
  • A feature of an exemplary ADC is that the top voltage is fixed but the bottom voltage, or reset level, may be variable or adjusted.
  • Output from the analog-to-digital converter is a 10-bit digital video signal with a range of counts from min(i) to max(i), which is fed into the logic circuitry 108 and simultaneously on to image buffer circuitry 109 and mapping logic circuitry 110 for mapping the 10-bit video signal into an 8-bit output.
  • the mapping circuitry logic includes embodiments of the invention that enable removal of artifacts due to non- functioning or malfunctioning pixels as well as accounting for channel mismatch in systems with multiple ADC channels.
  • the resultant output 111 is an 8 bit still image of the object of interest, in this case a vehicle and its license plate driving on a roadway.
  • FIGS 2 and 3 depict block diagrams of various embodiments of the invention. Details of each of the various embodiments are further discussed under the designated headings and in conjunction with the later Figures.
  • Environmental variables 201 of date, time, latitude and longitude for the camera (and object) placement are fed to an estimator embodiment 202 that calculates the effect of atmospheric and earth reflectance on the illumination of the object.
  • the same data is also fed into a calculator 203 that determines the sun direction, azimuth and elevation, with respect to the object of interest. Both of these sets of calculations enable an estimate of the solar irradiance 205.
  • the irradiance model embodiment includes effects of direct beam illumination as well as the effects of both clear sky and partly cloudy sky diffuse irradiance.
  • License plate orientation and road reflectance parameters 204 are combined with the output of the solar irradiance model 205 and fed to a road surface reflectance and shadowing model embodiment 206.
  • the output of the road surface reflectance and shadowing embodiment is the diffuse reflected irradiance on the license plate from the road under the lighting conditions fed in at the first step 201 and the geometric and reflectance parameters 204.
  • the output of the solar irradiance model and the license plate orientation parameters and road reflectance parameters 204 are also the input to a plate location shadowing model embodiment of the invention 207. In this model the direct beam and diffuse sky irradiance that impinge on a license plate are calculated.
  • the plate may be tilted at any angle from vertical.
  • the output of the road surface reflectance and shadowing model 206 and the output of the plate location shadowing model 207 are combined as input to the plate irradiance model 208 embodiment of the invention.
  • Output of this model is the total irradiance for illumination onto the license plate.
  • This irradiance model takes into account the time of day, season of the year, and direction of travel, as well as shadowing effects both from the vehicle itself and neighboring vehicles as well as roadway structures.
  • the model provides values for the peak and shaded diffuse radiance of the plate in the direction of the camera. Additional parameters related to the specific geometric relationships between the camera optics and the transmission properties of the optics 303 are fed along with the plate radiance into an optical imaging model 305. This model calculates the peak and shaded irradiance from the license plate that impinges onto the sensor.
  • a camera response model 306 uses the irradiance onto the sensor with the parameters that describe the camera and sensor 304.
  • the camera response model along with the camera transfer gains 307 obtained from calibration of the camera are fed into the camera control algorithm 310. Note that the calibration of the camera is a measure of the camera sensor response done offline and prior to the controls of the described system.
  • the algorithm further requires input of the desired plate pixel values under the extremes of clear-sky direct sun and for a plate in a shadow 308, filtered through the camera electronics model 309, to output control values for the imaging integration time (T), the double-slope reset voltage (Vd-s) and the double-slope reset time (Td-s). These parameters are used to control the acquisition of a still image of the license plate from a rapidly moving vehicle under varying environmental conditions.
  • Embodiments of the invention enable setting of the parameters T, Td-s and Vd-s a priori. The camera is thereby ready to acquire an image without the need to read and calibrate in real time as is typically done in the prior art. Acquisition of the image is done through an external triggering mechanism that is not part of the instant invention.
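As a rough sketch of how predicted peak and shaded irradiance levels could be turned into a-priori settings, consider the rules below. These are illustrative assumptions only, not the patent's control algorithm: the choice to place the peak level at saturation, the fixed 0.8·T reset time, and the contrast_min margin are all hypothetical.

```python
def choose_settings(E_peak, E_shade, k0, gain, v_sat, v_black, contrast_min):
    """Illustrative a-priori setting selection (assumed rules).

    Pick the total integration time T so that the predicted sunlit white
    plate just reaches saturation, then place the double-slope reset above
    the predicted shaded plate level by a contrast margin so shaded plates
    keep the high-gain first slope.
    """
    T = (v_sat - v_black) / (k0 * E_peak * gain)    # peak pixel just saturates
    v_shade = v_black + k0 * E_shade * gain * T     # predicted shaded level
    T_ds = 0.8 * T                                  # assumed reset time
    V_ds = max(v_shade + contrast_min, v_black)     # reset above shaded band
    return T, T_ds, V_ds
```

The point of the sketch is only that T, Td-s and Vd-s fall out of the predicted irradiance extremes before any image is taken, which is the a-priori property the text emphasizes.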
  • Figure 4 depicts the sun direction calculator embodiment of the invention and introduces the coordinate system used for this and the other model embodiments.
  • the origin of the coordinate system 401 is set at a point in the roadway that represents the intersection of the nominal trigger point for beginning image acquisition 403 and the center point 412 of the license plate 402.
  • the x axis is in the roadway parallel to the license plate's long axis and perpendicular to the direction of travel of the vehicle.
  • the y axis 404 is located in the roadway.
  • the Zenith 405 completes an orthogonal axis system. Based upon the known latitude and longitude of the origin, and the known direction of travel of the vehicle, a true north direction 406 is determined.
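The sun direction calculation can be approximated with the standard declination/hour-angle formulation. This is a simplified stand-in for the patent's calculator: the 23.45° declination approximation and the use of local solar time are assumptions.

```python
import math

def sun_position(day_of_year, solar_hour, lat_deg):
    """Approximate solar elevation and azimuth (degrees).

    Uses the common sinusoidal declination approximation and the
    hour-angle formulation; solar_hour is local solar time in hours,
    lat_deg the site latitude.  Azimuth is measured clockwise from north.
    """
    decl = math.radians(23.45) * math.sin(2 * math.pi * (284 + day_of_year) / 365)
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    lat = math.radians(lat_deg)
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    el = math.asin(sin_el)
    cos_az = ((math.sin(decl) - math.sin(el) * math.sin(lat))
              / (math.cos(el) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))      # clamp for roundoff
    if hour_angle > 0:                               # afternoon: west of south
        az = 2 * math.pi - az
    return math.degrees(el), math.degrees(az)
```

Given the known direction of travel, the difference between this azimuth and the plate azimuth yields the Δψ parameter used by the shadow models below.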
  • the parameters of the sun elevation (sun el) and the parameter Δψ are used in the subsequent irradiance and shadow models discussed below. The parameters are recalculated periodically during the day and fed into the remaining models to adjust for the time of day and date.

Clear-Sky and earth reflectance parameters 202
  • Figure 5 is a diagram of the factors affecting the irradiance of the license plate 501.
  • Light from the sun 502 may reach the plate through a direct beam 505 as long as the plate is not shadowed.
  • the direct beam intensity is reduced by scattering and absorption in the stratosphere 503 and further reduced by scattering, absorption and reflectance off clouds in the troposphere 504.
  • the direct beam light from the sun is also reflected off the earth surface 506 and re-reflected 507 back to the earth primarily from clouds.
  • In the vicinity of the plate there is also a reflection off the road 508 that further illuminates the plate.
  • the plate is therefore illuminated by the direct beam from the sun, by re-reflected light from the sky and by light reflected off the road surface.
  • An embodiment of the invention provides an estimate of the total irradiance upon the plate that takes into account all of these sources of illumination.
  • the direct beam irradiance for a cloudless sky is estimated from models equivalent to that presented in
  • the total solar irradiance is given by three factors: the normal beam irradiance Pn, the diffuse sky irradiance Peds, and the diffuse road irradiance Prs. If the nominal plate location is within a shadow, then only diffuse irradiance from the sky and the road will illuminate the plate. If the plate is not within a shadow, then all three sources apply.
  • the normal sky irradiance Pn is calculated using models such as that described immediately above.
  • the diffuse sky irradiance model assumes a reflectance parameter for the earth's surface to estimate the direct beam radiation (Pn) that is reflected skyward, and an additional cloud factor for the re-reflection of this radiation back to earth.
  • the re-reflected radiation (Peds) is diffusely scattered off the clouds.
  • the radiation reflected off the road surface requires an estimate of the reflectivity of the road surfaces; the model uses this parameter to calculate the amount of direct solar radiation (Pn) that is diffusely reflected from the road (Prs).
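The combination of the three irradiance terms can be sketched as follows. The multiplicative scaling of Pn by a ground albedo, cloud factor and road reflectance is an illustrative assumption standing in for the patent's calibrated models.

```python
def plate_irradiance(P_n, ground_albedo, cloud_factor, road_reflectance, in_shadow):
    """Sketch of total plate irradiance from the three sources above.

    P_n          : direct normal beam irradiance
    P_eds (here) : beam reflected off the earth, re-reflected by clouds
    P_rs  (here) : beam diffusely reflected off the local road surface
    A shadowed plate loses only the direct beam term.
    """
    P_eds = P_n * ground_albedo * cloud_factor   # re-reflected sky component
    P_rs = P_n * road_reflectance                # road-reflected component
    direct = 0.0 if in_shadow else P_n           # shadow removes the beam term
    return direct + P_eds + P_rs
```

With, say, 800 W/m² direct beam, a 0.2 albedo, 0.3 cloud factor and 0.1 road reflectance, a sunlit plate sees 928 W/m² while the same plate in shadow sees only the 128 W/m² diffuse remainder, which is the dynamic-range gap the sensor settings must span.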
  • FIG. 6 is a diagram of the model for shadows arising from the vehicle carrying the plate and other nearby ground-level objects.
  • Figure 7 is a model for the shadow created by the overhead structure of the canopy of a typical toll booth. Referring to Figure 6, the coordinate system is the same as already introduced.
  • the nominal plate location 601 is located above the origin of the coordinate system.
  • the vehicle carrying the plate, adjacent vehicles in adjacent lanes, and toll booths and their support structures are modeled as an infinitely wide vertical plane 602 located along the x-axis nominal trigger line with a fixed vertical height of heff 603.
  • heff is estimated based upon the actual structures within the particular tollbooth setting. Typical values for heff are 3 to 10 feet, depending upon the actual structures in the area to be imaged.
  • Yshdw is the y-coordinate of the shadow edge 604
  • sun el is the solar elevation and Δψ is the difference between the solar azimuth and the plate azimuth defined earlier.
  • the sun el is greater than 0 implying that the time is between sunrise and sunset.
  • the value of Yshdw is greater than zero for shadows in the foreground and Yshdw is less than zero for shadows in the background as shown in Figure 6.
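The ground-shadow geometry can be sketched as below. The specific formula (shadow length heff/tan(sun el), projected onto the y-axis by the cosine of the azimuth difference Δψ) is a geometric reconstruction assumed by this sketch, not the patent's exact equation.

```python
import math

def shadow_edge_y(h_eff, sun_el_deg, delta_psi_deg):
    """Assumed reconstruction of the shadow-edge coordinate Yshdw.

    A vertical plane of height h_eff along the x-axis casts a shadow
    whose y-extent grows as the sun drops; the azimuth difference
    delta_psi projects that extent onto the y-axis.
    """
    el = math.radians(sun_el_deg)
    if el <= 0:
        raise ValueError("sun below horizon: model applies only in daytime")
    return h_eff * math.cos(math.radians(delta_psi_deg)) / math.tan(el)
```

Consistent with the text, the result changes sign with the sun's azimuth relative to the plate, putting the shadow in the foreground (Yshdw > 0) or background (Yshdw < 0).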
  • Shadows in imaging the license plate that arise from the canopy of the toll booth are also estimated. Referring to Figure 7, the nominal plate position 701 is again in the now-familiar coordinate system.
  • the canopy 702 is modeled as an infinitely wide (in the x-direction) horizontal plane 703 located above and parallel to the x axis with a fixed length in the y-direction and with vertical front 704 and rear 705 facades.
  • the front edge of the canopy may be displaced from the x-axis nominal trigger line, and its coordinate is given by Yfront.
  • the rear edge of the canopy is located at Yrear and the width of the canopy horizontal plane is the difference between these coordinates.
  • the dimensions of the front and rear facades are given by their z-coordinates as indicated in the Figure.
  • the shadow cast by the canopy 706 is an infinitely wide stripe parallel to the x-axis trigger line with coordinate locations for the front and rear edges given by the following formulae:
  • the sun el is greater than zero and less than 180 degrees.

License Plate Reflectance Model 302
  • Reflectance off the license plate or the radiance of the plate must be calculated as a function of viewing angle.
  • the ultimate goal is to combine the lighting and shadowing models to provide an estimate of the plate light that reaches the camera sensor.
  • the license plate material is assumed in all the following cases to behave as a diffuse Lambertian reflector. In another embodiment specular components are included as well. Radiant power per unit solid angle per unit projected area is constant in the direction of view. The emitted radiant power falls off as the cosine of the viewing angle.
  • Figure 8 depicts the radiance resulting from the direct beam irradiance of the license plate.
  • the solar beam 802 is incident upon the plate 801 at an angle θ 803 from the plate normal 804.
  • the plate material is assumed to be a white retro-reflective license plate sheeting such as that manufactured by the 3M® company, since this is the white material commonly employed for license plate objects.
  • a portion of the beam will be absorbed by the plate material, a portion will be retro-reflected along the incident beam 802 and the remainder will be diffusely reflected in angles as depicted by the rays 805.
  • the absorption coefficient and the coefficient of retro-reflectivity are functions of the material properties of the license plate sheeting material.
  • the absorption is 5%.
  • the background of the plate is assumed to be white. All wavelengths of light within the camera's spectral response range are therefore assumed to be reflected equally by the plate.
  • the coefficient of retro-reflectance of direct beam irradiance is given by a simple approximation:
  • Pen is the direct beam irradiance. It is a rare event that the sun, plate, and camera are aligned such that the extremely narrow retro-reflected beam is actually imaged by the camera.
  • the diffuse radiance from the plate due to direct beam illumination will however always be imaged by the camera as long as direct beam illumination is hitting the plate (see shadow models above).
  • the diffuse reflected radiance due to direct beam irradiance is given by:
  • Bb is the diffuse beam radiance and ρp is the license plate material's diffuse reflectance.
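The standard Lambertian relation Bb = ρp·Pen·cos(θ)/π is assumed here to correspond to the elided equation, and can be sketched as:

```python
import math

def lambertian_beam_radiance(P_en, rho_p, theta_deg):
    """Diffuse plate radiance under direct-beam irradiance.

    For a Lambertian surface, radiance is the absorbed-and-reflected
    irradiance spread over the projected hemisphere:
        B_b = rho_p * P_en * cos(theta) / pi
    theta_deg is the beam's angle from the plate normal.
    """
    return rho_p * P_en * math.cos(math.radians(theta_deg)) / math.pi
```

The cos(θ) factor is why the plate tilt accounting described earlier matters: the same beam delivers less radiance as it strikes the plate more obliquely.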
  • the license plate is also illuminated by light diffuse reflected from the roadway and sky and the radiance from the plate will have a component for this diffuse irradiance.
  • the diffuse irradiance arises both from light reflected off the road and from light reflected off the earth surface and then reflected back to earth from mainly clouds.
  • Figure 9 depicts the plate radiance due to this diffuse irradiance component.
  • Diffuse irradiance 902 impinges onto the plate surface 901. An observer at any point 905 in the vicinity of the plate will observe the same radiance regardless of the viewing angle θ 904.
  • the reflected diffuse radiance is given by:
  • FIG. 10 depicts the total radiance from the plate 1001 due to diffuse irradiance.
  • the total diffuse irradiance arises from the sky (Peds) 1002 and from the road surface (Prs) 1003.
  • the diffuse sky irradiance arises from the direct sunlight that is reflected off the earth and then re-reflected off clouds and particulate back to earth.
  • the diffuse road irradiance 1003 arises from light that is reflected off the local road surface.
  • Bs is the radiance from the plate due to diffuse sky irradiance Peds, and Br is the radiance off the plate due to diffuse road irradiance Prs.
  • ρp is the license plate surface material's diffuse reflectance.
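Under diffuse illumination a Lambertian surface has radiance B = ρp·E/π, applied separately to the sky and road components and summed. This relation is assumed to match the elided equations:

```python
import math

def lambertian_diffuse_radiance(P_eds, P_rs, rho_p):
    """Total diffuse plate radiance from the sky and road components.

    B_s = rho_p * P_eds / pi   (diffuse sky irradiance component)
    B_r = rho_p * P_rs  / pi   (diffuse road irradiance component)
    The sum is viewing-angle independent for a Lambertian surface.
    """
    B_s = rho_p * P_eds / math.pi
    B_r = rho_p * P_rs / math.pi
    return B_s + B_r
```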
  • FIG. 11 depicts the geometric relationships between the license plate 1101 and camera 1106.
  • the license plate 1101 is located at the nominal plate position 1111 as defined in previous discussions of the coordinate system used in the invention.
  • the angular direction of the plate is accounted for by calculating irradiance and radiance in terms of the plate surface normal vector
  • Illumination and reflection from the plate is defined in terms of the surface normal vector.
  • the direct sun beam 1103 is incident on the plate at an angle θ 1104 from the plate normal vector.
  • the optical axis 1107 of the camera makes an angle φ 1109 with a line 1110 drawn from the nominal plate position center to the camera's optical focal point.
  • the total diffuse plate radiance 1105 is the sum of the diffuse beam radiance plus the direct beam diffuse radiance discussed above.
  • the extremely narrow retro-reflected beam is rarely imaged by the camera.
  • Figure 12 depicts a more detailed view of the camera optics in the coordinate system.
  • the plate is at the nominal plate position 1201.
  • An area of the plate dA produces an image dA' at the image plane 1207 of the camera sensor.
  • the optics of the camera system 1203 have an optical axis 1204 that makes an angle φ 1205 with the line 1209 drawn from the center of the optics 1208 to the nominal plate position.
  • the object plane 1202 is along the optical axis 1204 and at a distance s from the center of the optics 1208.
  • the image of the plate is formed at the image plane 1207, which is at a distance s' from the center of the optics 1208.
  • the radius of effective aperture of the lens is shown as r 1210.
  • 1/fL = 1/s + 1/s' (12)
  • With fL the focal length of the lens, this implies that for s » fL, s' ≈ fL.
  • the off-axis irradiance L received on the image plane at plate pixel area dA' is:
  • B = B' for a Lambertian emitter and lossless optics, where B is the object radiance and B' is the image radiance.
  • the solid angle formed by the off-axis projection of the exit pupil and the position of the nominal plate in the image is Ω':
  • L is the irradiance on the sensor due to the light reflected from the license plate.
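For a Lambertian object and lossless optics, the image-plane irradiance is commonly written L = π·B·(r/s')²·cos⁴(φ), the standard off-axis (cos⁴) falloff. This form is assumed here to correspond to the elided equation:

```python
import math

def sensor_irradiance(B, r, s_img, phi_deg):
    """Off-axis image-plane irradiance for a Lambertian object.

    B       : plate radiance (B = B' for lossless optics)
    r       : effective aperture radius of the lens
    s_img   : image distance s' (~ focal length for distant plates)
    phi_deg : angle between the optical axis and the plate direction
    """
    return math.pi * B * (r / s_img) ** 2 * math.cos(math.radians(phi_deg)) ** 4
```

Combining this with the plate radiance models above yields the calculated peak-possible and lowest-expected sensor irradiance values that drive the camera control algorithm.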
  • the contrast control embodiment of the invention sets camera parameters such that the sensor saturates at the value for radiance of the license plates' white background.
  • the light irradiating the sensor can now be combined with the response parameters for the camera to give an estimate of the voltages generated by the sensor in response to the light.
  • the nominal spectral response of the sensor 1301 is scaled by the spectral transmittance of the optics 1302 including the window that encloses the camera, the lens, and filters to provide a nominal spectral response 1303 for the sensor.
  • the terrestrial global solar radiation reference spectrum 1304 is scaled by the integral of the intensity over the wavelength region of interest 1305 to provide a photosynthetically active radiance (PAR) normalized solar spectrum 1306.
  • PAR: photosynthetically active radiance
  • the product 1308 of these spectra produces the PAR normalized spectral response R(λ) 1309.
  • the voltage generated by the sensor may now be calculated as follows.
  • the integral of the PAR normalized spectral response R(λ) over the region of interest defines Rfctr:
  • the plate pixel voltage change is given by:
  • the constants k, and therefore k0, represent material properties of the sensor and are determined empirically through calibration of the camera, which typically takes place prior to site installation.
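The construction of the PAR-normalized spectral response and its integral Rfctr can be sketched numerically. The sensor, optics and solar curves below are illustrative placeholders, not the patent's measured spectra:

```python
import numpy as np

def trapezoid(y, x):
    """Simple trapezoidal integration (avoids NumPy version differences)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Wavelength region of interest (here 400-700 nm, in meters).
wavelength = np.linspace(400e-9, 700e-9, 301)

# Illustrative placeholder curves -- NOT the patent's measured data.
sensor_resp = np.exp(-(((wavelength - 550e-9) / 80e-9) ** 2))  # nominal sensor response (1301)
optics_trans = np.full_like(wavelength, 0.9)                   # window/lens/filter transmittance (1302)
solar = np.ones_like(wavelength)                               # solar reference spectrum (1304)

# PAR normalization: scale the solar spectrum by its integral over the region (1305).
solar_par = solar / trapezoid(solar, wavelength)               # PAR normalized solar spectrum (1306)

# Point-by-point product of the spectra gives R(lambda) (1309); its integral is Rfctr.
R = sensor_resp * optics_trans * solar_par
R_fctr = trapezoid(R, wavelength)
```

With these placeholder curves Rfctr comes out as a dimensionless factor between 0 and the peak optics transmittance, as expected from the normalization.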
  • the camera sensor response is depicted in Figure 14.
  • the positive x-axis 1401 represents light input and the positive y-axis 1402 is the voltage output for a given light input.
  • the sensor response curve includes a linear region 1403 and a saturation region 1404.
  • the output of the analog to digital converter is a 10 bit number ranging from 0 to 1023 at saturation shown along the negative x-axis 1405. In another embodiment, explained below, this 10-bit output is mapped to an 8-bit output 1407 shown along the negative y-axis 1408.
  • the response of the sensor is also a function of the integration time as shown in Figure 15.
  • the total integration time (T) 1507, shown along the x-axis 1502, begins at the time of a reset pulse 1505 and ends at the time at which the output is read 1506.
  • the output voltage of the sensor ranges from the voltage reset level 1503 to as high as the saturation level 1504.
  • the outputs for three different light levels are shown.
  • the slopes of the curves are proportional to the light levels.
  • for light levels within the linear range, the output voltage will be proportional to the input light intensity.
  • for higher light levels, the response will reach saturation prior to the end of the integration time.
  • exposure parameters are selected such that the effective gain for low light levels is increased to boost contrast under low light conditions, and the effective gain for high light levels is decreased to provide usable output short of saturation.
  • the contrast control embodiments adjust total and intermediate integration times and intermediate reset voltages to allocate the dynamic range for both low and high level lighting situations and make full use of the entire dynamic range regardless of lighting conditions.
  • Camera Control Algorithm 310 The predictive model embodiments of the invention discussed above essentially provide a virtual light sensor that predicts the maximum daylight plate irradiance for a plate that is located in the shade and the maximum daylight plate irradiance for a plate that is located in bright sunshine. The predictions are based upon sun position and shading caused by the vehicle itself and nearby cars and structures. The camera imaging model and camera response model embodiments predict the camera image pixel values of the plate object for these varying irradiation predictions. These predicted values are then used to set control parameters for the camera.
  • the control algorithm makes use of a double slope integration control to provide optimum imaging for the broad range of predicted lighting conditions.
  • Figure 16 provides a diagrammatic description of a double slope control algorithm.
  • the x-axis 1602 represents elapsed time during the imaging process and the y-axis 1601 represents the sensor output.
  • the sensor output ranges from the reset level 1603 to the saturation level 1604 and the total integration time (T) starts at the time of the reset pulse 1605 and ends at the readout time 1606.
  • T total integration time
  • the measurement begins with a reset pulse setting the voltage to the reset level and beginning the integration cycle.
  • a double slope reset pulse resets the voltage for some of the pixels to a double slope voltage level 1613. Only the pixels with a voltage greater than the double slope level 1613 at the time of reset 1612 are reset. Those below the reset voltage at the time of the reset pulse 1612 are not affected.
  • the integration continues after the reset pulse until the readout time 1606.
  • the time from the reset pulse to the readout is the double slope reset time (Td-s).
  • Curves 1608 and 1609 would represent responses for relatively low lighting conditions. At the time of the double-slope reset pulse 1612 both of these curves are below the double-slope level 1613 and are therefore not reset. Note that curve 1608 never reaches the double-slope level and curve 1609 exceeds the double slope level prior to the end of the integration time. Both response curves represent situations that would not be reset.
  • the response curves 1610 and 1611 represent higher intensity lighting conditions that would result in reset at the time of the double slope reset pulse.
  • Curve 1610 represents an intermediate lighting and response level; these pixel locations would be reset to the double slope level 1613 by the reset pulse at the time of the double-slope reset pulse 1612. Curve 1611 had reached saturation 1604 prior to the time of the double slope reset pulse and is likewise reset to the double slope level by the reset pulse.
  • the curve is shown offset from 1610 for visualization purposes only; the sensor would be reset at the time of the reset pulse and not before. Integration continues for all curves until the readout time 1606.
  • the double slope procedure extends the dynamic range since previously saturated pixels now will not saturate.
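The double-slope integration just described can be sketched as a simple piecewise response model. The units and constants here are illustrative (k stands for the sensor's light-to-voltage slope constant):

```python
def double_slope_response(light, T, T_ds, V_ds, k=1.0, V_sat=1.0):
    """Sketch of the double-slope integration of Figure 16.

    Integration runs for a total time T. At time T - T_ds a double-slope
    reset pulse clamps any pixel above V_ds back down to V_ds; pixels
    below V_ds are unaffected. Integration then continues until readout."""
    t_before = T - T_ds
    v = min(k * light * t_before, V_sat)     # first-slope integration (may saturate)
    if v > V_ds:
        v = V_ds                             # double-slope reset pulse
    return min(v + k * light * T_ds, V_sat)  # second-slope integration to readout
```

With T = 1.0, T_ds = 0.2 and V_ds = 0.6, a light level of 1.5 that would saturate a single-slope exposure (1.5 × 1.0 > V_sat) instead reads out at 0.9, below saturation, which is how the procedure extends the dynamic range.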
  • the camera parameters of integration time (T), double slope time (Td-s) and the double slope reset voltage (Vd-s) are selected to provide sufficient contrast for the image of the plate to enable character recognition for both the alphanumeric plate number as well as the state identification.
  • the camera parameters are selected to provide a sufficient contrast-to-noise ratio, defined as:
  • contrast/noise ratio = (pixel count of character − pixel count of background) / noise
  • the pixel count of character is the digital output of the camera for regions of the acquired license plate image occupied by characters, either the alphanumeric characters of the plate number or the script characters of the state identification.
  • Pixel count of the background is the digital output of the camera for the background regions of the license plate.
  • Noise is the total noise arising from the sensor and electronics already discussed above.
  • the image contrast-to-noise ratio is adjusted through control of the three parameters T, Td-s and Vd-s. Using only external environmental factors to select values for these parameters is one of the novel features of the invention.
  • the camera further has the ability to adjust the gain of the amplifier 104 (Figure 1). In this embodiment the gain is raised just enough to enable a sufficiently short integration time T; T must be short enough to provide a blur-free image for character recognition. Raising the amplifier gain also raises the noise, so the gain is set only as high as needed to achieve the required fast integration time.
  • the sensor response and plate luminance are shown along the y-axis 1701 and x-axis 1702 respectively.
  • the sensor response, ranging from the reset reference voltage Vref 1703 to the saturation voltage Vsat 1704, is divided into two regions: one for plates in the shade 1711 and a second for plates located in the sun 1710.
  • the luminance shown along the x-axis 1702 defines boundaries for these regions at Lknee 1708 and at Lsat 1709.
  • Lknee is the maximum possible white plate diffuse irradiance for a plate illuminated solely by diffuse irradiance from the road and diffuse irradiance from the sky, i.e. shaded.
  • Lsat is the peak white plate irradiance expected for a plate illuminated by the direct sun beam.
  • the control system results in an output voltage ranging from Vref 1703, the reset value for an imaging interval, to Vsat 1704, the voltage for a saturated sensor.
  • the slope for the plates in the shade 1705 is greater than the slope for the region where plates would be illuminated in the sun 1706. This results in enhanced contrast and therefore improved ability to recognize characters on plates in the more difficult shaded situations. It sacrifices some contrast for the plates in the sun; however, a sunlit plate typically has more than adequate contrast for readability.
  • the sensor parameters are selected such that ½ of the dynamic range of the sensor output is allocated to the radiance of plates in the shade 1711 and ½ of the dynamic range of the sensor is allocated to the radiance of plates in the sun 1710.
  • the allocation of the dynamic range is dependent upon the object to be imaged. The inventor has found that for license plates the allocation as provided enables character recognition across a wide range of environmental lighting conditions.
  • the primary target of the image may be consistently shadowed and the dynamic range is allocated to provide more data on the low luminance portion of the image.
  • the allocation of dynamic range may be weighted to the high luminance portion of the image.
  • the output of the A to D converter is shown in the negative x-axis region of Figure 17.
  • the 10 bit output of the A to D 1713 ranges from 0 to 1023. This is mapped by an auto-stretch embodiment of the invention, discussed below in conjunction with Figure 18, to an 8 bit output 1714. Mapping of the 10 bit output of the A to D to the ultimate 8 bit output of the camera allows correction for camera and sensor discrepancies such as dead pixels and A/D channel imbalance.
  • Figure 18 depicts a histogram of the 10 bit output of the A/D.
  • the mapping algorithm applies a filter algorithm to adjust for errors, defects and noise in the 10 bit image.
  • the histograms of real images vary continuously. Counts of pixels at low A to D output 1802 correspond to dark regions of the license plate image and counts of pixels at high A to D output 1803 correspond to the white background areas of the plate. In another embodiment an observed collapse of these two regions 1802, 1803 into a single histogram feature (not shown) is indicative of the need to turn on external lighting. Discontinuities within the histogram are an indicator of non- valid data points arising from defective pixels or noise.
  • the algorithm searches from the low end of the histogram until it encounters a smoothly changing portion of the histogram.
  • Isolated discontinuities or non-valid data points such as 1801 are ignored.
  • discontinuities at the high end of the histogram are also an indicator of non-valid data points arising from sensor defects or noise.
  • the discontinuity or non-valid data point at the high end of the A to D output 1804 would also be ignored or effectively filtered out.
  • the mapping algorithm then scales the digital image by mapping the point 1805 to 0 and the point 1806 to 255. Thereby the regions of the data containing real image data are retained and scaled to cover the full dynamic range of the output, while regions containing non-valid data points arising from noise or errors are rejected. This ensures all the useful data of the image and the full 8-bit dynamic range is retained while rejecting errors and noise.
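A minimal sketch of this auto-stretch mapping, assuming a simple minimum-count threshold as a stand-in for the patent's "smoothly changing" histogram test:

```python
import numpy as np

def auto_stretch(img10, min_count=4):
    """Map a 10-bit image to 8 bits, ignoring isolated histogram outliers.

    Scans the 1024-bin histogram from both ends for bins populated by at
    least `min_count` pixels (a simplified stand-in for the patent's
    smoothness criterion), then linearly maps that span onto 0..255.
    Values outside the span (defective pixels, noise) are clipped."""
    hist = np.bincount(img10.ravel(), minlength=1024)
    valid = np.nonzero(hist >= min_count)[0]   # assumes at least one valid bin
    lo, hi = int(valid[0]), int(valid[-1])
    out = (img10.astype(np.float64) - lo) * 255.0 / max(hi - lo, 1)
    return np.clip(np.round(out), 0, 255).astype(np.uint8)
```

For an image whose real data spans A/D counts 100 to 900 with one dead-pixel outlier at 5 and one noise spike at 1020, the span 100..900 is stretched over the full 8-bit range and the two outliers are clipped to 0 and 255.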
  • the outputs of multiple ADCs are separately filtered and mapped to the 8 bit output.
  • the mapped data may then be combined.
  • the scaling of the endpoints will scale the output of each ADC to the same range. Mismatched ADCs or channel imbalance is thereby accounted for and corrected.
  • the sensor output may not reach saturation.
  • the total integration time (T) is selected such that the white plate background is just at saturation.
  • the integration time (T) is also limited by the high-speed nature of the image.
  • the mapping of the histogram points will ensure that the output covers the full 8-bit dynamic range even when the lighting is too low to reach saturation. The mapping is not restricted to a map of 10 bit to 8 bit.
  • mapping may be from a 10 bit ADC output to a 10 bit scaled output.
  • Intermediate points may be calculated using, for example, nearest-neighbor, linear, bilinear, cubic, bicubic, or similar algorithms as are known in the art. Algebraically, the output of the sensor is given as follows:
  • Vwht & Vblk: A to D converter settings
  • k2 is the slope of the count versus luminance curve after the reset. k0, and therefore k1 and k2, are calibration factors for the sensor.
  • the factor of 512 reflects the fact that ½ of the dynamic range of the 10 bit A to D is allocated to the region of luminance below the knee, i.e. shaded license plates, and ½ of the dynamic range is allocated to the region above the knee, i.e. radiance indicative of a non-shaded, brightly lit license plate.
  • the control algorithm embodiment of the invention is further exemplified in Figure 19.
  • the 8 bit output of the sensor is shown on the y-axis 1901 and the luminous exitance of the plate is shown on the x-axis 1902.
  • the factors k1 and k2 are sensor-specific calibration factors. During daytime hours, estimates of the sun, sky and road irradiance, coupled with properties of the plate, produce estimates of the plate radiance. These estimates are used to select Lknee and Lsat. For radiance less than Lknee 1908 the 8 bit output is given by:
  • I = k2*Td-s*L2 + I(Vd-s) (29), where Td-s is the double slope integration time, L2 is the radiance of the plate in this range of luminance, and I(Vd-s) is the selected 8 bit output at the double slope reset voltage.
  • the luminance 1910 corresponds to the maximum 8 bit output of 255. The highest irradiance on the plate occurs under a partly cloudy sky, where direct beam, re-reflected diffuse sky irradiance and road irradiance all illuminate the plate.
  • the contrast control algorithm sets Lknee to the minimum expected white plate exitance caused by clear sky diffuse irradiance, reduced by a factor to account for the nearby structures blocking the plate's view of the sky, and road reflection of the clear sky irradiance with an accounting for shadows cast onto the road by the vehicle and nearby structures.
  • the contrast control algorithm sets L sat to the maximum expected white plate exitance caused by direct sun-beam and partly cloudy sky diffuse irradiance, again reduced by a factor to account for nearby structures blocking the plate's view of the sky and the road reflection of a partly cloudy sky diffuse irradiance, with accounting for shadows cast onto the road surface by nearby structures.
  • Figure 20 depicts the relation of the camera transfer function to the double slope parameters.
  • the camera transfer function 2001 variables of light level at saturation (Lsat) 2003, light level at which the transfer function changes gain (Lknee) 2004 and voltage level of the transfer function at Lknee (Vknee) 2005 map to the three double slope parameters of integration time (T) 2006, double slope reset time (Td-s) 2107 and the effective double slope reset voltage (Vd-s) 2108, as shown in the graphical relation of voltage versus measurement time 2102.
  • T, Td-s and Vd-s of equations 22, 23 and 24 respectively apply.
  • below Lknee, the slope of the voltage versus luminance curve is k1*T.
  • above Lknee, the slope of the curve is k2*Td-s.
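Equations 22–24 are not reproduced in this excerpt, so the sketch below reconstructs the parameter selection purely from the two slope relations just stated (slope k1*T below Lknee, slope k2*Td-s above it); treat the formulas as an assumption, not the patent's exact equations:

```python
def double_slope_params(L_knee, L_sat, V_knee, V_sat, k1, k2):
    """Solve for (T, T_ds, V_ds) given the desired transfer function.

    Reconstructed from the stated slope relations; a sketch only."""
    T = V_knee / (k1 * L_knee)                         # k1*T * L_knee = V_knee
    T_ds = (V_sat - V_knee) / (k2 * (L_sat - L_knee))  # second slope spans knee..saturation
    V_ds = V_knee - k2 * T_ds * L_knee                 # second-slope line passes through (L_knee, V_knee)
    return T, T_ds, V_ds
```

The effective reset voltage V_ds falls out as the intercept of the second (shallower) segment, so the two segments meet exactly at the knee point (Lknee, Vknee).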
  • the contrast control embodiment is based upon an integration time, an intermediate reset time and an intermediate reset voltage.
  • the algorithm must account for the timing requirements of data sampling and integration in the context of the camera processor.
  • the calculated values for T and Td-s are adjusted for processing delays in the camera electronics.
  • Figure 21 depicts an image of a typical license plate.
  • the section of the plate image 2101 includes the alphanumeric text 2102 of a license plate number and the script text 2103 of the state identification.
  • a 14 foot field of view is required.
  • the more demanding imaging problem is the state identification.
  • the state-stroke width 2104 is only 1/16" wide. For the exemplar field of view and camera resolution there is an 85% probability of the state-stroke filling 50% or more of a pixel.
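The stroke-width arithmetic can be checked directly. The 1280-pixel sensor width below is a hypothetical value for illustration; the patent's exemplar camera resolution is not given in this excerpt:

```python
FOV_IN = 14 * 12      # 14 foot field of view, in inches
SENSOR_PX = 1280      # hypothetical sensor width in pixels (assumed, not from the patent)
STROKE_IN = 1.0 / 16  # state-identification stroke width, in inches

px_footprint = FOV_IN / SENSOR_PX  # inches of plate covered by one pixel
fill = STROKE_IN / px_footprint    # fraction of a pixel width the stroke spans
```

Under these assumed numbers each pixel covers about 0.13 inches of plate, so a 1/16-inch stroke spans just under half a pixel, which is why the state identification is the more demanding imaging problem.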
  • the histogram of Figure 18 is used to trigger the need for external lighting.
  • when the point 1805 approaches the point 1806, the lighting is not sufficient to provide the required contrast and an external lighting circuit is activated.
  • an external sensor is used to indicate a nighttime or other inadequate lighting condition.
  • the control parameters for the camera in a nighttime setting are dependent upon the lighting used and the arrangement of the camera with respect to the lighting and the license plate.
  • the parameters T, Td-s and Vd-s are empirically determined.
  • the integration time (T) is selected based upon the high speed imaging requirements for a non-blurred image. Td-s and Vd-s are calculated as described above to allocate the dynamic range of the camera.
  • the dynamic range is allocated equally to a high luminance region and a low luminance region, and T, Td-s and Vd-s are not varied while artificial illumination is in use.
  • the gain of the camera is adjusted to enable a sufficiently short integration time (T).

Summary
  • a digital camera control system that requires no light sensors is described.
  • the control system relies on modeled external environmental geophysical solar parameters, geometric relationships between the object to be imaged and surrounding, potentially shadowing objects, and material properties of the object to be imaged, such as reflectivity. These are combined to produce the estimated irradiance on a camera sensor for the particular time of day, date, and geometric relationship between the object and the sun.
  • the system enables optimized camera settings with no external or internal light sensors.
  • the system therefore provides a method to rapidly determine optimum camera settings for any time of day and ensures the camera is always ready to capture a high contrast image of a fast moving transient object.
  • the system is demonstrated for use in a license plate imaging application.

Abstract

A digital camera control system that requires no light sensors is described. The control system relies on modeled external environmental geophysical solar parameters, geometric relationships between the object to be imaged and surrounding, potentially shadowing objects, and material properties of the object to be imaged, such as reflectivity. These are combined to produce the estimated irradiance on a camera sensor for the particular time of day, date, and geometric relationship between the object and the sun. The system enables optimized camera settings with no external or internal light sensors. The system therefore provides a method to rapidly determine optimum camera settings for any time of day and ensures the camera is always ready to capture a high contrast image of a fast moving transient object. The system is demonstrated for use in a license plate imaging application.

Description

DIGITAL CAMERA CONTROL SYSTEM
BACKGROUND OF THE INVENTION
TECHNICAL FIELD
The present invention relates to a control system to optimize imaging parameters of a digital video camera.
RELATED BACKGROUND ART
High-speed digital cameras have become commonly used in toll collection and traffic law enforcement applications. These cameras must acquire images of fast moving objects with an image quality sufficient to identify the object(s) of interest. In the case of vehicles on a motorway, the objective is often to be able to identify the vehicle by clearly imaging the vehicle's license plate. Typically both the plate's alpha-numerics and the state name must be legible in the image. The larger alpha-numerics are often, but not always, block print in a uniform color on a uniform or nearly uniform background. The state names, however, are often in much smaller font sizes and printed in different colors in various scripts. Generally, modern license plates are designed more for aesthetics than for maximum legibility, especially those plates sporting multi-colored and/or scenic backgrounds. In order to recognize characters, a camera system must provide sufficient contrast between the lettering and the background. White license plates illuminated by direct sunlight (or strong reflections of nighttime lighting) generally yield sufficient video image contrast to easily recognize characters; however, the illumination onto and therefore the radiance from such plates may easily be so high as to saturate the camera's sensor. In these cases the gain of the camera needs to be low enough to prevent image saturation. However, portions of license plates that are in shadows (or during low nighttime illumination conditions) often lack enough video image contrast to support character recognition. In these cases the gain of the camera needs to be increased to create the required contrast for plate legibility. Imaging license plates therefore requires a means to allocate the dynamic range of the imaging sensor to simultaneously address both low light levels requiring increased gain and high light levels requiring decreased gain.
To capture and make the video image of license plate characters and State name information legible requires high spatial resolution, a field of view (or multiple cameras) to observe at least one lane width, sufficient grey-scale/color contrast resolution to separate characters and State names from their background colors and sufficient dynamic range to prevent saturating the image whether the plate is in direct sunlight, in shadow, or under artificial illumination.
Recent advances in video camera technology have resulted in high spatial resolution sensors where pixels do not suffer from blooming or smearing problems (both of which can reduce or eliminate alpha-numeric contrast), have low noise so that small differences in alpha-numeric to local background contrasts can be resolved, and have gains that can vary across the array depending on the amount of light received (this feature supports imaging wide dynamic ranges without saturating). In order to provide sufficient contrast between characters and background of a license plate, a control system must set gain/exposure settings to make full use of the range of capabilities provided by these new sensors. The requirements to produce legible images are different in a high light situation versus a low light setting. For example, if a portion of a license plate is in shadow a high gain is needed to ensure good alpha-numeric to background contrast, whereas for a portion of a license plate in direct sunlight there is a naturally high contrast so that the gain can be lowered to prevent saturation of the image sensor and analog to digital electronics. A control system must be employed to make these tradeoffs. The conditions under which the picture must be taken are also constantly changing. On short time scales the lighting during the daylight hours can change due to cloud cover and shadowing caused by other vehicles or even the vehicle that is being imaged. For typical traffic imaging applications the cameras are typically operational night and day and through all seasons of the year. The lighting conditions change as the angle of the sun changes relative to the plate surface during the course of a day and more slowly over the course of seasons. At night there is often artificial illumination provided to image the vehicle and its plate. Fixed exposure settings will not provide the image quality and brightness and contrast required to read the plate information.
For vehicles traveling at freeway speeds, the camera has only a fraction of a second to capture an image. License plates on vehicles may pass through the camera's field of view in ½ second, so there is little time for making camera gain adjustments during the time the vehicle's license plate is in the camera's field of view. In addition, the first portion of the vehicle to appear in the camera's field of view may not be indicative of the irradiance levels seen at the end of the vehicle, so first adjustments may not be best. Ideally the camera should be adjusted to take a good image of the vehicle and its license plate prior to both of them appearing in the camera's field of view. Prior art gain control systems have relied upon using an external sensor to measure plate lighting conditions and define exposure settings. Other systems monitor the pixel values inside a portion of the camera's field of view to determine how to control camera gain. Both of these systems have drawbacks. External sensors add cost and complexity to the system and are often difficult to install in the optimum location to perform their light measurement function. Systems that use pixel value measurements for camera control suffer from time lags between when the light measurements are made versus when the camera gain control changes take effect, and from the relatively slow sampling rates of the pixel data relative to rapidly moving vehicles. In addition, the uncertainty of which pixel values correspond to actual license plate light levels (rather than the myriad of other radiance levels that can appear in the scene) means that the control input has a high uncertainty. Because of these issues, pixel value based control systems typically take several images in rapid succession at various gain settings to try to ensure that at least one results in a good image of the license plate.
This may be difficult or impossible to achieve if the vehicle is moving at high speed, and it adds to the storage requirements of the system since multiple images must be saved until a determination is made as to which image is best for plate legibility.
There is a need for a control system for digital cameras that does not require light sensors (either internally using camera pixel values or externally using an auxiliary sensor) and yet the camera exposure controls are continuously adapted to ensure that a recognizable image of a vehicle and its license plate can be captured at the precise moment that the top and bottom of the vehicle and its license plate appear well framed within the camera's field of view. There is a need for a control system that does not suffer from any appreciable delays in setting proper camera controls or result in any significant delays in initiating a full frame image capture relative to the fastest expected rate of license plate movement through the camera's field of view. There is a need for a control system that ensures readability of both the numbers on the plates and the lettering that identifies the state of registration of the plate. There is a need for a system that can account for changing lighting conditions both on a short time scale (such as sudden cloud coverage of direct sunlight) and on long, even seasonal, time scales. There is a need for a system that can optimally take advantage of the full response range of the sensor. There is a need for a system that will provide sufficient contrast to read numbers, letters and otherwise identify an object in all conditions from low light levels where the object is shadowed to high light levels.
DISCLOSURE OF THE INVENTION A camera control system is described that meets the demanding needs of high speed traffic imaging applications. The system relies upon environmental, geographic, illumination and electronic sensor response models that provide optimized settings for a given geographic location, camera direction, time of day, time of year and wide range of weather conditions. The model uses environmental parameters coupled with a detailed illumination models to preset the camera for the conditions at the particular time, location and orientation that the imaging occurs. Through the use of models and known environmental factors the controller is able to set optimized image sensor parameters a priori. There is no need to continuously take images to sense the possible range of vehicle and plate radiance levels or read radiance levels from an external light sensor. The camera is always ready for the particular camera location, direction, environmental factors and particular lighting conditions.
The control system includes a sun direction calculation to determine the azimuth and elevation of the sun with respect to the object being imaged for any time of day and any time of year. Once the direction of the sun with respect to the object and the camera system is known, a solar irradiance model determines the light hitting the object directly. Lighting of the object during daytime comes not just from direct solar light but also from reflected light. Solar light is scattered by the atmosphere, reflected off the earth and back again from the atmosphere, thereby increasing the total illumination. Additionally, light is reflected off road surfaces. The models further include shadowing of the object by nearby stationary objects as well as potential shadowing by the vehicle's structure itself. The sum of the models provides an estimate of the total irradiance potentially available to the object. The models further include an accounting of the change in object irradiance caused by the tilt of the object with respect to the sources of illumination. In the case of a license plate, the tilt is assumed to be a nearly vertical surface oriented along the direction of traffic flow. Once the lighting of the object is known, the reflective response is modeled. Based upon material and geometric properties of the object, the amount of light reflected from background and feature regions of the plate can be estimated. An optical imaging model is then used to calculate the light irradiance that would be imaged upon a sensor at the image plane of the camera. The relative geometric arrangement of the optics, sensor, object and various sources of illumination are accounted for to provide a calculated peak possible and lowest expected irradiance onto the sensor plane. Again, this calculation is completed for the particular current time of day and season of the year.
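The sun direction calculation can be illustrated with a standard declination/hour-angle approximation. This is a textbook formula standing in for the patent's sun position model, which is not reproduced in this excerpt:

```python
import math

def solar_elevation(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation angle in degrees, from the standard
    declination and hour-angle formulas (an illustrative stand-in for the
    patent's sun direction calculation)."""
    decl = math.radians(23.45) * math.sin(2 * math.pi * (284 + day_of_year) / 365)
    hour_angle = math.radians(15 * (solar_hour - 12))  # 15 degrees per hour from solar noon
    lat = math.radians(lat_deg)
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    sin_el = max(-1.0, min(1.0, sin_el))  # guard against rounding past the asin domain
    return math.degrees(math.asin(sin_el))
```

As sanity checks, the sun stands overhead at solar noon on the equator at the equinox, and is below the horizon at solar midnight.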
Since it takes a minute or more for any apparent solar movement in the sky, appropriate irradiance estimates can be developed well in advance of the appearance of any vehicle inside the camera's field of view. Thus new camera gain parameters can be established well in advance and applied quickly to the real time imaging problem. The optical model for irradiance onto the sensor is then mated with a response model for the camera image generation process. The response of each pixel in the image can be adequately modeled as an offset plus a proportionality constant times the product of the magnitude of the irradiance illuminating it, an integration time, and an electronics gain. The offset and proportionality constant are camera specific values that are measured during a calibration step prior to the camera being installed and utilized by the control algorithm to predict image values based on the predicted irradiance values onto the sensor.
The control algorithm uses the predicted sensor irradiance values to establish the image acquisition settings for the sensor to prevent saturation of the pixels that could correspond to license plate objects and to create an acceptable level of plate information contrast to ensure legibility. The settings are changed during the day to account for changing lighting conditions as the sun advances across the sky. The settings include integration times, pixel charge accumulation reset levels, and the particular times at which these reset levels should occur during the image acquisition integration cycle. This invention describes a single reset time and level, which results in a dual-slope pixel response to incoming irradiance levels, but multiple reset times and levels could also be employed by a simple extension of the technique described herein. A dual-slope implementation results in a higher gain for plate objects acquired under low lighting conditions and a lower gain for plate objects acquired above some light level set by the selection of the pixel reset level and reset time. The invention results in an imaging system that anticipates and corrects for changing lighting conditions without the need for external sensors and/or taking multiple images at different gain settings per license plate passage. The resulting imaging system is faster responding, lower in cost, lowers image storage requirements, and is more reliable than the prior art in that it anticipates illumination settings prior to the actual image acquisition step and it eliminates extraneous components such as external image sensors.
In the following discussion the imaging of vehicle license plate tags is used for exemplary purposes to illustrate the invention. Applications to other image acquisition problems would be apparent to those skilled in the art and are intended to be included within the scope of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a block diagram of an image acquisition system in which the invention may be practiced. Figures 2 and 3 are block diagram charts of various embodiments of the invention. Figure 4 is a diagram of the coordinate system used for the illumination model embodiments of the invention.
Figure 5 depicts the irradiance model for lighting of a license plate.
Figure 6 depicts a road surface shadowing model embodiment accounting for the road shadowing from the vehicle carrying the plate, adjacent vehicles and nearby structures.
Figure 7 depicts a road shadowing model embodiment accounting for the road shadowing from a canopy.
Figure 8 depicts a plate radiance model embodiment for beam illumination. Figure 9 depicts a plate radiance model embodiment for diffuse illumination. Figure 10 depicts a plate radiance model embodiment for various sources of illumination.
Figure 11 depicts a plate radiance model embodiment further showing camera placement geometry.
Figure 12 depicts the geometric arrangement related to an image formation model embodiment of the invention.
Figure 13 depicts a camera response model embodiment of the invention. Figure 14 shows the camera transfer function.
Figure 15 shows the pixel voltage output versus integration time.
Figure 16 shows the double-slope extended dynamic range embodiment for pixel output.
Figure 17 shows the pixel output voltage versus luminance and a contrast control algorithm embodiment of the invention. Figure 18 depicts an image histogram used in a contrast control embodiment.
Figure 19 shows sensor output in 8 bit counts versus luminance and the contrast control algorithm embodiment factors.
Figure 20 shows a graph of sensor output voltage and luminance and a second graph of sensor voltage output versus integration time and a mapping between the two of factors related to a contrast control algorithm embodiment. Figure 21 shows an exemplary acquired image and factors affecting contrast control and imaging requirements.
Figure 22 is a graph showing sensor voltage output versus luminance and factors related to worst case contrast requirements for imaging.
DETAILED DESCRIPTION
The present invention is a set of models and algorithms for estimating irradiance of an object to be imaged as a function of environmental factors, the resulting radiance of the object and irradiance upon an imaging sensor, and control algorithms for the sensor based upon the environmental factors and particular imaging requirements. In the exemplary case the requirement is contrast sufficient to recognize characters upon a vehicle license plate tag. The license plate features of interest are the background of the license plate (white being the brightest possible plate background), the numeric characters of the plate (typically larger and black) and the state or country designation lettering. The imaging requirements are to have sufficient contrast between the license plate background and the numeric characters and the state designation characters. Figure 1 depicts a camera typical of those in which the invention may be practiced. The camera consists of a near UV through near IR wavelengths pass filter 101 upon which the light irradiating from the intended object impinges. In the case of focusing upon a license plate the light will be composed of the license plate background irradiance, the plate character irradiance and the state name irradiance. Light is focused upon an image sensor 102 using lens optical elements known in the art and not shown. The signal from the sensor is a sum of the sensor response plus noise 103 from the sensor electronics. Control parameters for the sensor include the integration time and the reference black voltage level. A dynamic range enhancement double slope embodiment of the invention includes a double-slope reset voltage (Vd-s) and double-slope reset time (Td-s). The double slope reset time occurs at a time shorter than the total integration time (T) for the image acquisition and the reset voltage is an appropriately selected intermediate value between the black reference voltage and the saturation level voltage for the sensor.
The signal is then fed into an amplifier 104, through a low pass filter 105 and then into an analog to digital converter (ADC) 107. There is another element of noise 106 added to the signal from the amplifier and filter circuitry. In another embodiment portions of the sensor are connected to two or more ADCs. Although this aids in high-speed image acquisition and processing, it can also lead to a channel mismatch between the multiple ADCs. Embodiments of the invention provide for matching of the output of multiple ADCs. A feature of an exemplary ADC is that the top voltage is fixed but the bottom voltage or reset level may be variable or adjusted. Output from the analog to digital converter is a 10 bit digital video signal with a range of counts from max(i) to min(i) which is fed into the logic circuitry 108 and simultaneously on to image buffer circuitry 109 and mapping circuitry logic 110 for mapping the 10 bit video signal into an 8 bit output. The mapping circuitry logic includes embodiments of the invention that enable removal of artifacts due to non-functioning or malfunctioning pixels as well as accounting for channel mismatch in systems with multiple ADC channels. The resultant output 111 is an 8 bit still image of the object of interest, in this case a vehicle and its license plate driving on a roadway. Figures 2 and 3 depict block diagrams of various embodiments of the invention. Details of each of the various embodiments are further discussed under the designated headings and in conjunction with the later Figures. Environmental variables 201 of date, time, latitude and longitude for the camera (and object) placement are fed to an estimator embodiment 202 that calculates the effect of atmospheric and earth reflectance on the illumination of the object. The same data is also fed into a calculator 203 that determines the sun direction, azimuth and elevation, with respect to the object of interest.
Both of these sets of calculations enable an estimate of the solar irradiance 205. The irradiance model embodiment includes effects of direct beam illumination as well as the effects of both clear sky and partly cloudy sky diffuse irradiance. License plate orientation and road reflectance parameters 204 are combined with the output of the solar irradiance model 205 and fed to a road surface reflectance and shadowing model embodiment 206. The output of the road surface reflectance and shadowing embodiment is the diffuse reflected irradiance on the license plate from the road under the lighting conditions fed in at the first step 201 and the geometric and reflectance parameters 204. The output of the solar irradiance model and the license plate orientation parameters and road reflectance parameters 204 are also the input to a plate location shadowing model embodiment of the invention 207. In this model the direct beam and diffuse sky irradiance that impinge on a license plate are calculated. The plate may be tilted at any angle from vertical. The output of the road surface reflectance and shadowing model 206 and the output of the plate location shadowing model 207 are combined as input to the plate irradiance model 208 embodiment of the invention. Output of this model is the total irradiance for illumination onto the license plate. This irradiance model takes into account the time of day, season of the year, and direction of travel, as well as shadowing effects from the vehicle itself, neighboring vehicles and roadway structures. Once the total irradiance onto the plate is known, the plate reflectance parameters along with the geometric parameters 301 describing the relative positioning and orientation of the license plate and the camera are combined with this total irradiance information into a license plate reflectance model 302.
The model provides values for the radiance of the peak and shaded diffuse reflectance of the plate in the direction of the camera. Additional parameters related to the specific geometric relationships between the camera optics and the transmission properties of the optics 303 are fed along with the plate radiance into an optical imaging model 305. This model calculates the peak and shaded irradiance from the license plate that impinges onto the sensor. A camera response model 306 uses the irradiance onto the sensor with the parameters that describe the camera and sensor 304. The camera response model along with the camera transfer gains 307 obtained from calibration of the camera are fed into the camera control algorithm 310. Note that the calibration of the camera is a measure of the camera sensor response done offline and prior to the controls of the described system. The algorithm further requires input of the desired plate pixel values under the extremes of clear sky direct sun and for a plate in a shadow 308, filtered through the camera electronics model 309, to output control values for the imaging integration time (T), the double slope reset voltage (Vd-s) and the double slope reset time (Td-s). These parameters are used to control the acquisition of a still image of the license plate from a rapidly moving vehicle under varying environmental conditions. Embodiments of the invention enable setting of the parameters T, Td-s and Vd-s a priori. The camera is thereby ready to acquire an image without the need to read and calibrate in real time as is typically done in the prior art. Acquisition of the image is done through an external triggering mechanism that is not part of the instant invention. The image will be shown to have sufficient contrast for identification and recognition of both the numerals and the state or region of registration identified on the license plate. The following discussion provides details for each of the major embodiments discussed above.
The sub-heading titles and corresponding numbers in the following sections refer to the embodiments and corresponding numbers of Figures 2 and 3. Sun Direction Calculator 203
Figure 4 depicts the sun direction calculator embodiment of the invention and introduces the coordinate system used for this and the other model embodiments. The origin of the coordinate system 401 is set at a point in the roadway that represents the intersection of the nominal trigger point for beginning image acquisition 403 and the center point 412 of the license plate 402. The x axis is in the roadway parallel to the license plate's long axis and perpendicular to the direction of travel of the vehicle. The y axis 404 is located in the roadway. The Zenith 405 completes an orthogonal axis system. Based upon the known latitude and longitude of the origin, and the known direction of travel of the vehicle, a true north direction 406 is determined. The parameters of date and time of day are then used to provide the elevation 408 and azimuth 409 of the sun for the time and location of the origin. Such azimuth and elevation data may be calculated using programs described in the publication: Ibrahim Reda and Afshin Andreas, Solar Position Algorithm for Solar Radiation
Applications, National Renewable Energy Lab, TP-560-34302, November 2005, which is incorporated by reference. Tabular data are available from sources such as the United States Naval Observatory.
The azimuth of the license plate 410 is also calculated in the same coordinate system. The parameter Ψ is calculated as the difference between the azimuth of the sun and the plate:
Ψ = sun_az - plate_az (1)
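Equation (1) can be sketched as follows. This is an illustrative sketch only; the azimuths are assumed to be in degrees, and the wrapping of the result into (-180, 180] is an assumption for convenience, not something the text specifies.

```python
# Sketch of equation (1): psi is the difference between the solar azimuth
# and the license plate azimuth. Wrapping to (-180, 180] degrees is an
# assumed convention for downstream trigonometry.

def psi_deg(sun_az_deg, plate_az_deg):
    psi = sun_az_deg - plate_az_deg
    return (psi + 180.0) % 360.0 - 180.0
```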
The parameters of the sun elevation (sun el) and the parameter Ψ are used in the subsequent irradiance and shadow models discussed below. The parameters are recalculated periodically during the day and fed into the remaining models to adjust for the time of day and date. Clear-Sky and earth reflectance parameters 202
Figure 5 is a diagram of the factors affecting the irradiance of the license plate 501. Light from the sun 502 may reach the plate through a direct beam 505 as long as the plate is not shadowed. The direct beam intensity is reduced by scattering and absorption in the stratosphere 503 and further reduced by scattering, absorption and reflectance off clouds in the troposphere 504. The direct beam light from the sun is also reflected off the earth's surface 506 and re-reflected 507 back to the earth primarily from clouds. In the vicinity of the plate there is also a reflection off the road 508 that further illuminates the plate. The plate is therefore illuminated by the direct beam from the sun, by re-reflected light from the sky and by light reflected off the road surface. An embodiment of the invention provides an estimate of the total irradiance upon the plate that takes into account all of these sources of illumination. The direct beam irradiance for a cloudless sky is estimated from models equivalent to that presented in
Gueymard, C.A., REST2: High-performance solar radiation model for cloudless-sky irradiance, illuminance, and photosynthetically active radiation - Validation with a benchmark dataset, Sol. Energy, Elsevier (2007), which is incorporated by reference. Photosynthetically active radiation (PAR) is coincidentally the wavelength response band for the imaging sensors used in the exemplary camera system. Solar Irradiance Model 205
The total solar irradiance is given by three factors: the normal beam irradiance, Pn, the diffuse sky irradiance, Peds, and the diffuse road irradiance, Prs. If the nominal plate location is within a shadow then only diffuse irradiance from the sky and the road will illuminate the plate. If the plate is not within a shadow then all three sources apply. The normal sky irradiance Pn is calculated using models such as that described immediately above. The diffuse sky irradiance calculation assumes a reflectance parameter for the earth's surface to estimate the direct beam radiation (Pn) that is reflected skyward, and then an additional cloud factor for the re-reflection of this radiation back to earth. The re-reflected radiation (Peds) is diffusely scattered off the clouds. Similarly, the radiation reflected off the road surface requires an estimate of the reflectivity of the road surfaces and uses this parameter to calculate the amount of direct solar radiation (Pn) that is diffusely reflected from the road (Prs).
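The combination of the three irradiance factors can be sketched as follows. This is an illustrative sketch only; the simple sum is an assumption, since the full model also weights each component by the plate's tilt and orientation as described elsewhere in the text.

```python
# Sketch of the total plate illumination logic: a shadowed plate receives
# only the diffuse sky (Peds) and diffuse road (Prs) components; an
# unshadowed plate additionally receives the normal beam irradiance (Pn).

def total_plate_irradiance(Pn, Peds, Prs, in_shadow):
    diffuse = Peds + Prs
    return diffuse if in_shadow else Pn + diffuse
```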
Shadowing Models 206, 207
Total irradiance on the plate must take into account not just the sources of light but also the shadowing of the plate. Embodiments of the invention account for two primary sources of shadows. Figure 6 is a diagram of the model for shadows arising from the vehicle carrying the plate and other nearby ground level objects. Figure 7 is a model for the shadow created by the overhead structure of the canopy of a typical toll booth structure. Referring to Figure 6, the coordinate system is the same as that already introduced. The nominal plate location 601 is located above the origin of the coordinate system. The vehicle carrying the plate, adjacent vehicles in adjacent lanes, and toll booths and their support structures are modeled as an infinitely wide vertical plane 602 located along the x-axis nominal trigger line with a fixed vertical height of heff 603. heff is estimated based upon the actual structures within the particular tollbooth setting. Typical values for heff are 3 to 10 feet depending upon actual structures in the area to be imaged. The resultant shadow 604 is then an infinitely wide strip with one edge located on the x-axis trigger line and the other edge located in the y-direction given by:
Yshdw = -heff*cos(sun_el)*cos(Ψ)/sin(sun_el) (2)
Where Yshdw is the y-coordinate of the shadow edge 604, sun_el is the solar elevation and Ψ is the difference between the solar azimuth and the plate azimuth defined earlier. The sun_el is greater than 0, implying that the time is between sunrise and sunset. The value of Yshdw is greater than zero for shadows in the foreground and Yshdw is less than zero for shadows in the background as shown in Figure 6. In another embodiment, shadows in imaging the license plate that arise from the canopy of the toll booth are estimated. Referring to Figure 7, again the nominal plate position 701 is in the same now familiar coordinate system. The canopy 702 is modeled as an infinitely wide (in the x-direction) horizontal plane 703 located above and parallel to the x axis with a fixed length in the y-direction and with vertical front 704 and rear 705 facades. The front edge of the canopy may be displaced from the x-axis nominal trigger line and its coordinate is given by Yfrnt. The rear edge of the canopy is located at Yrear and the width of the canopy horizontal plane is the difference between these coordinates. The dimensions of the front and rear facades are given by their z-coordinates as indicated in the Figure. The shadow cast by the canopy 706 is an infinitely wide stripe parallel to the x-axis trigger line with coordinate locations for the front and rear edges given by the following formulae:
projection_slope_fctr = cos(sun_el)*cos(Ψ)/sin(sun_el) (3)
Yfrntshdw = max[(Yfrnt - zfhigh*projection_slope_fctr), (Yfrnt - zflow*projection_slope_fctr)] (4)
Yrearshdw = min[(Yrear - zrhigh*projection_slope_fctr), (Yrear - zrlow*projection_slope_fctr)] (5)
The sun_el is greater than zero and less than 180 degrees. License Plate Reflectance Model 302
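The shadow-edge formulae of equations (2) through (5) above can be sketched as follows. This is an illustrative sketch only; angles are assumed to be in degrees, and the function and parameter names mirror the text (heff, Yfrnt, Yrear, and the facade z-extents), with units being whatever the site survey uses (feet in the examples above).

```python
import math

# Sketch of the shadow geometry of equations (2)-(5).

def projection_slope_fctr(sun_el_deg, psi_deg):
    # Equation (3): projection factor from sun elevation and azimuth difference.
    se, p = math.radians(sun_el_deg), math.radians(psi_deg)
    return math.cos(se) * math.cos(p) / math.sin(se)

def vehicle_shadow_edge(heff, sun_el_deg, psi_deg):
    # Equation (2): y-coordinate of the far edge of the vehicle/structure shadow.
    return -heff * projection_slope_fctr(sun_el_deg, psi_deg)

def canopy_shadow_edges(y_frnt, y_rear, z_fhigh, z_flow, z_rhigh, z_rlow,
                        sun_el_deg, psi_deg):
    # Equations (4) and (5): front and rear edges of the canopy shadow stripe.
    f = projection_slope_fctr(sun_el_deg, psi_deg)
    y_frnt_shdw = max(y_frnt - z_fhigh * f, y_frnt - z_flow * f)
    y_rear_shdw = min(y_rear - z_rhigh * f, y_rear - z_rlow * f)
    return y_frnt_shdw, y_rear_shdw
```

Given these edges, the nominal plate position (the coordinate origin) is shadowed whenever it falls inside the computed stripe.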
Reflectance off the license plate, or the radiance of the plate, must be calculated as a function of viewing angle. The ultimate goal is to combine the lighting and shadowing models to provide an estimate of the plate light that reaches the camera sensor. The license plate material is assumed in all the following cases to behave as a diffuse Lambertian reflector. In another embodiment specular components are included as well. Radiant power per unit solid angle per unit projected area is constant in the direction of view. The emitted radiant power falls off as the cosine of the viewing angle. Figure 8 depicts the radiance resulting from the direct beam irradiance of the license plate. The solar beam 802 is incident upon the plate 801 at an angle γ 803 from the plate normal 804. The plate material is assumed to be a white retro-reflective license plate sheeting such as that manufactured by the 3M ® company since this is a commonly employed white material for license plate objects. A portion of the beam will be absorbed by the plate material, a portion will be retro-reflected along the incident beam 802 and the remainder will be diffusely reflected at angles as depicted by the rays 805. The absorption coefficient and the coefficient of retro-reflectivity are functions of the material properties of the license plate sheeting material. For the exemplar material (such as 3M ® 4750E and 4780E retro-reflective sheeting), the absorption is 5%. The background of the plate is assumed to be white. All wavelengths of light within the camera's spectral response range are therefore assumed to be reflected equally by the plate. The coefficient of retro-reflectance of direct beam irradiance is given by a simple approximation:
RA(α) = -0.0158*α + 0.7787 with RA(α) > 0 (6)
Again, more complex models can be employed if required. The retro-reflected beam is given by:
(Equation (7), giving the retro-reflected beam in terms of the direct beam irradiance Pen and the retro-reflectance coefficient RA(α), is reproduced only as an image in the original document.)
Where Pen is the direct beam irradiance. It is a rare event that the sun, plate, and the camera are aligned such that the extremely narrow retro-reflected beam is actually imaged by the camera. The diffuse radiance from the plate due to direct beam illumination will however always be imaged by the camera as long as direct beam illumination is hitting the plate (see shadow models above). The diffuse reflected radiance due to direct beam irradiance is given by:
Bb = Pen*(1-RA(α))*pp*cos(γ)/π (8)
Where Bb is the diffuse beam radiance and pp is the license plate material diffuse reflectance. The license plate is also illuminated by light diffusely reflected from the roadway and sky, and the radiance from the plate will have a component for this diffuse irradiance. The diffuse irradiance arises both from light reflected off the road and from light reflected off the earth's surface and then reflected back to earth, mainly from clouds. Figure 9 depicts the plate radiance due to this diffuse irradiance component. Diffuse irradiance 902 impinges onto the plate surface 901. An observer at any point 905 in the vicinity of the plate will observe the same radiance regardless of the viewing angle θ 904. The reflected diffuse radiance is given by:
B = pP/π (9)
where B is the radiance from the plate due to diffuse irradiance, p is the plate diffuse reflectance and P is the incident irradiance. Figure 10 depicts the total radiance from the plate 1001 due to diffuse irradiance. The total diffuse irradiance arises from the sky (Peds) 1002 and from the road surface (Prs) 1003. The diffuse sky irradiance arises from the direct sunlight that is reflected off the earth and then re-reflected off clouds and particulate back to earth. The diffuse road irradiance 1003 arises from light that is reflected off the local road surface. In both cases equation 9 applies for the radiance from the plate: Bs = pPeds/π (10)
Br = pPrs/π (11)
where Bs is the radiance from the plate due to diffuse sky irradiance Peds and Br is the radiance off the plate due to diffuse road irradiance Prs; p is the license plate surface material diffuse reflectance. Optical Imaging Model 305
Further embodiments of the invention include the total radiance from the plate that impinges upon the camera's sensor. Figure 11 depicts the geometric relationships between the license plate 1101 and camera 1106. The license plate 1101 is located at the nominal plate position 1111 as defined in previous discussions of the coordinate system used in the invention. The angular direction of the plate is accounted for by calculating irradiance and radiance in terms of the plate surface normal vector
1102. Illumination and reflection from the plate is defined in terms of the surface normal vector. The direct sun beam 1103 is incident on the plate at an angle γ 1104 from the plate normal vector. The optical axis 1107 of the camera makes an angle θ 1109 with a line 1110 drawn from the nominal plate position center to the camera's optical focal point. The total diffuse plate radiance 1105 is the sum of the diffuse beam radiance plus the direct beam diffuse radiance discussed above. The extremely narrow retro-reflected beam is rarely imaged by the camera. Figure 12 depicts a more detailed view of the camera optics in the coordinate system. The plate is at the nominal plate position 1201. An area of the plate dA produces an image dA' at the image plane 1207 of the camera sensor. The optics of the camera system 1203 have an optical axis 1204 that makes an angle θ 1205 with the line 1209 drawn from the center of the optics 1208 to the nominal plate position. The object plane 1202 is along the optical axis 1204 and at a distance s from the center of the optics 1208. The image of the plate is formed at the image plane 1207, which is at a distance s' from the center of the optics 1208. The radius of the effective aperture of the lens is shown as r 1210. Using the lens equation:
1/fL = 1/s + 1/s' (12)
where fL is the focal length of the lens, which implies that for s » fL, s' ≈ fL. The f-number of the optics is defined as:
f# = fL/(2r) (13)
and substituting gives:
r/s' = 1/(2f#)
The off axis irradiance L received on the image plane at plate pixel area dA' is :
L=B'Ω'θcos(θ) (14)
B = B' for a Lambertian emitter and lossless optics, where B is the object radiance and B' is the image radiance. The solid angle formed by the off-axis projection of the exit pupil and the position of the nominal plate in the image is Ω'θ:
Ω'θ = [πr²cos(θ)]/[s'/cos(θ)]² = π(r/s')²cos³(θ) (15)
Substituting gives:
L = (π/4)Bcos⁴(θ)/(f#)² (16)
L is the irradiance on the sensor due to the light reflected from the license plate. The contrast control embodiment of the invention, discussed below, sets camera parameters such that the sensor saturates at the value for the radiance of the license plate's white background.
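The chain from plate radiance to sensor-plane irradiance, equations (8) through (11) and equation (16) above, can be sketched as follows. This is an illustrative sketch only; names mirror the text (Pen, RA, γ, Peds, Prs, f#, θ), rho_p stands for the plate diffuse reflectance p, and the simple sum of the diffuse radiance terms is an assumption.

```python
import math

# Sketch of the plate radiance equations (8), (10), (11) and the optical
# imaging equation (16).

def plate_radiance(Pen, RA, rho_p, gamma_deg, Peds, Prs):
    """Total diffuse radiance toward the camera from a Lambertian plate."""
    Bb = Pen * (1.0 - RA) * rho_p * math.cos(math.radians(gamma_deg)) / math.pi  # eq (8)
    Bs = rho_p * Peds / math.pi  # eq (10): diffuse sky component
    Br = rho_p * Prs / math.pi   # eq (11): diffuse road component
    return Bb + Bs + Br

def sensor_irradiance(B, f_number, theta_deg):
    # Equation (16): L = (pi/4) * B * cos^4(theta) / f#^2
    return (math.pi / 4.0) * B * math.cos(math.radians(theta_deg)) ** 4 / f_number ** 2
```

Evaluating these functions at the predicted peak (sunlit) and lowest expected (shaded) plate irradiances yields the pair of sensor-plane irradiance values used by the control algorithm.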
Camera Response Model 306
The light irradiating the sensor can now be combined with the response parameters for the camera to give an estimate of the voltages generated by the sensor in response to the light. Referring to Figure 13, the nominal spectral response of the sensor 1301 is scaled by the spectral transmittance of the optics 1302, including the window that encloses the camera, the lens, and filters, to provide a nominal spectral response 1303 for the sensor. Similarly the terrestrial global solar radiation reference spectrum 1304 is scaled by the integral of the intensity over the wavelength region of interest 1305 to provide a photosynthetically active radiance (PAR) normalized solar spectrum 1306. The cross product 1308 of these spectra produces the PAR normalized spectral response R(λ) 1309. The voltage generated by the sensor may now be calculated as follows. The integral of the PAR normalized spectral response R(λ) over the region of interest defines Rfctr:
Rfctr = ∫R(λ)dλ (17)
The plate pixel voltage change is given by:
ΔVpixel = dA'*fillfctr*Rfctr*L*T*k = ko*L*T (18)
Where:
ΔVpixel = the plate pixel voltage change
dA' = physical size of the pixel cell
fillfctr = fraction of dA' that responds to light
L = PAR irradiance incident on the pixel
T = sensor integration time
k = sensor conversion gain
The constant k, and therefore ko, represents material properties of the sensor and is determined empirically through calibration of the camera, which typically takes place prior to site installation.
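Equation (18) collapses the camera-specific factors into a single calibrated constant, which can be sketched as follows. This is an illustrative sketch only; ko is the assumed name for the product dA'·fillfctr·Rfctr·k measured during calibration.

```python
# Sketch of equation (18): the plate pixel voltage change is the calibrated
# constant ko times sensor-plane irradiance L and integration time T.

def delta_v_pixel(L, T, ko):
    """Predicted pixel voltage change; ko = dA' * fillfctr * Rfctr * k."""
    return ko * L * T
```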
Camera Electronics Model 309
The camera sensor response is depicted in Figure 14. The positive x-axis 1401 represents light input and the positive y-axis 1402 is the voltage output for a given light input. The sensor response curve includes a linear region 1403 and a saturation region 1404. The output of the analog to digital converter is a 10 bit number ranging from 0 to 1023 at saturation shown along the negative x-axis 1405. In another embodiment, explained below, this 10-bit output is mapped to an 8-bit output 1407 shown along the negative y-axis 1408. The response of the sensor is also a function of the integration time as shown in Figure 15. The total integration time (T) 1507, shown along the x-axis 1502, begins at the time of a reset pulse 1505 and ends at the time at which output is read 1506. Likewise the output voltage of the sensor, shown on the y-axis 1501, ranges from the voltage reset level 1503 to as high as the saturation level 1504. The outputs for three different light levels are shown. The slopes of the curves are proportional to the light levels. At low light levels 1508 and intermediate light levels 1509 the voltage out will be proportional to the input light intensity. However at high light levels the response will reach saturation prior to the end of the integration time. In the contrast control embodiment of the invention exposure parameters are selected such that the effective gain for the low light level is increased to increase contrast under low light conditions and the effective gain for the high light level conditions is decreased to provide usable output short of saturation. The contrast control embodiments adjust total and intermediate integration times and intermediate reset voltages to allocate the dynamic range for both low and high level lighting situations and make full use of the entire dynamic range regardless of lighting conditions.
Camera Control Algorithm 310
The predictive model embodiments of the invention discussed above essentially provide a virtual light sensor that predicts the maximum daylight plate irradiance for a plate that is located in the shade and the maximum daylight plate irradiance for a plate that is located in bright sunshine. The predictions are based upon sun position and shading caused by the vehicle itself and nearby cars and structures. The camera imaging model and camera response model embodiments predict the camera image pixel values of the plate object for these varying irradiation predictions. These predicted values are then used to set control parameters for the camera. The control algorithm makes use of a double slope integration control to provide optimum imaging for the broad range of predicted lighting conditions. Figure 16 provides a diagrammatic description of a double slope control algorithm. The x-axis 1602 represents elapsed time during the imaging process and the y-axis 1601 represents the sensor output. The sensor output ranges from the reset level 1603 to the saturation level 1604 and the total integration time (T) starts at the time of the reset pulse 1605 and ends at the readout time 1606. The measurement begins with a reset pulse setting the voltage to the reset level and beginning the integration cycle. At some intermediate time 1612 a double slope reset pulse resets the voltage for some of the pixels to a double slope voltage level 1613. Only the pixels with a voltage greater than the double slope level 1613 at the time of reset 1612 are reset. Those below the reset voltage at the time of the reset pulse 1612 are not affected. The integration continues after the reset pulse until the readout time 1606. The time from the double slope reset pulse to the readout is the double slope reset time (Td-s). Four different exemplary curves are shown in Figure 16. Curves 1608 and 1609 represent responses for relatively low lighting conditions.
At the time of the double-slope reset pulse 1612 both of these curves are below the double-slope level 1613 and are therefore not reset. Note that curve 1608 never reaches the double-slope level and curve 1609 exceeds the double slope level prior to the end of the integration time. Both response curves represent situations that would not be reset. The response curves 1610 and 1611 represent higher intensity lighting conditions that would result in a reset at the time of the double slope reset pulse. Curve 1610 represents an intermediate lighting and response level and these pixel locations would be reset to the double slope level 1613 by the reset pulse at the time of the double-slope reset pulse 1612. Curve 1611 had reached saturation 1604 prior to the time of the double slope reset pulse and is likewise reset to the double slope level by the reset pulse. The curve is shown offset from 1610 for visualization purposes only; the sensor would be reset at the time of the reset pulse and not before. Integration continues for all curves until the readout time 1606. The double slope procedure extends the dynamic range since previously saturated pixels now will not saturate. In a contrast control algorithm embodiment of the invention the camera parameters of integration time (T), double slope time (Td-s) and the double slope reset voltage (Vd-s) are selected to provide sufficient contrast for the image of the plate to enable character recognition for both the alphanumeric plate number and the state identification. The camera parameters are selected to provide a sufficient contrast to noise ratio as defined by:
Contrast/noise ratio = |Pixel count of character - Pixel count of background|/Noise (19)
Where the pixel count of character is the digital output of the camera for regions of the acquired license plate image occupied by characters, either the alphanumeric characters of the plate number or the script characters of the state identification. The pixel count of background is the digital output of the camera for the background regions of the license plate. Noise is the total noise arising from the sensor and electronics, as discussed above. In a first embodiment the image contrast to noise ratio is adjusted through control of the three parameters T, Td-s and Vd-s; using only external environmental factors to select values for these parameters is one of the novel features of the invention. In another embodiment the camera further has the ability to adjust the gain of the amplifier 104 (Figure 1). In this embodiment the gain is raised just sufficiently to enable a sufficiently short integration time T; T must be short enough to provide a blur-free image for character recognition. Raising the amplifier gain also raises the noise, and therefore the gain is set just high enough to enable the required fast integration time.
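The contrast-to-noise criterion of equation (19) can be sketched as follows, assuming the character and background regions have already been segmented; the masks and function name are illustrative, not from the patent:

```python
import numpy as np

def contrast_to_noise(image, char_mask, bg_mask, noise):
    """Equation (19): |character count - background count| / noise.

    image     -- 2-D array of camera digital counts
    char_mask -- boolean mask for character regions (assumed known,
                 e.g. from a prior segmentation step)
    bg_mask   -- boolean mask for plate-background regions
    noise     -- total sensor-plus-electronics noise, in counts
    """
    delta = abs(image[char_mask].mean() - image[bg_mask].mean())
    return delta / noise
```

For instance, characters averaging 10 counts against a 40-count background with 1.5 counts of noise give a ratio of 20, comfortably above recognition thresholds.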
Referring now to Figure 17, the double-slope concept is applied to the license plate imaging. The sensor response and plate luminance are shown along the y-axis 1701 and x-axis 1702 respectively. The sensor response, ranging from the reset reference voltage Vref 1703 to the saturation voltage Vsat 1704, is divided into two regions: one for plates in the shade 1711 and a second region for plates located in the sun 1710. The luminance shown along the x-axis 1702 defines boundaries for these regions at Lknee 1708 and at Lsat 1709. Lknee is the maximum possible white plate diffuse irradiance for a plate illuminated solely by diffuse irradiance from the road and diffuse irradiance from the sky, i.e. shaded. Lsat is the peak expected white plate irradiance for a plate illuminated by direct sunlight in addition to the diffuse sky and road irradiance. The control system results in an output voltage ranging from Vref 1703, the reset value for an imaging interval, to Vsat 1704, the voltage for a saturated sensor. The slope for the plates in the shade 1705 is greater than the slope for the region where plates would be illuminated in the sun 1706. This results in enhanced contrast, and therefore an enhanced ability to recognize characters on plates, in the more difficult shaded situations. This somewhat sacrifices the contrast for the plates in the sun; however, a sunlit plate typically has more than adequate contrast for readability. In one embodiment the sensor parameters are selected such that one half of the dynamic range of the sensor output is allocated to the radiance of plates in the shade 1711 and one half of the dynamic range of the sensor is allocated to the radiance of plates in the sun 1710. The allocation of the dynamic range is dependent upon the object to be imaged. The inventor has found that for license plates the allocation as provided enables character recognition across a wide range of environmental lighting conditions. In another embodiment the primary target of the image may be consistently shadowed and the dynamic range is allocated to provide more data on the low luminance portion of the image.
In another embodiment the allocation of dynamic range may be weighted to the high luminance portion of the image.
The output of the A to D converter is shown in the negative x-axis region of Figure 17. The 10 bit output of the A to D 1713 ranges from 0 to 1023. This is mapped by an auto-stretch embodiment of the invention, discussed below in conjunction with Figure 18, to an 8 bit output 1714. Mapping of the 10 bit output of the A to D to the ultimate 8 bit output of the camera allows correction for camera and sensor discrepancies such as dead pixels and A/D channel imbalance.
Figure 18 depicts a histogram of the 10 bit output of the A/D. The mapping algorithm applies a filter algorithm to adjust for errors, defects and noise in the 10 bit image. The histograms of real images vary continuously. Counts of pixels at low A to D output 1802 correspond to dark regions of the license plate image and counts of pixels at high A to D output 1803 correspond to the white background areas of the plate. In another embodiment an observed collapse of these two regions 1802, 1803 into a single histogram feature (not shown) is indicative of the need to turn on external lighting. Discontinuities within the histogram are an indicator of non-valid data points arising from defective pixels or noise. The algorithm searches from the low end of the histogram until it encounters a smoothly changing portion of the histogram; isolated discontinuities or non-valid data points, such as 1801, are ignored. Similarly, discontinuities at the high end of the histogram are also an indicator of non-valid data points arising from sensor defects or noise. In the exemplary histogram of Figure 18 the discontinuity or non-valid data point at the high end of the A to D output 1804 would also be ignored, or effectively filtered out. The mapping algorithm then scales the digital image by mapping the point 1805 to 0 and the point 1806 to 255. Thereby the regions of the data containing real image data are retained and scaled to cover the full dynamic range of the output, while regions containing non-valid data points arising from noise or errors are rejected. This ensures all the useful data of the image and the full 8-bit dynamic range are retained while errors and noise are rejected. In another embodiment the outputs of multiple ADCs are separately filtered and mapped to the 8 bit output. The mapped data may then be combined. The scaling of the endpoints will scale the output of each ADC to the same range; mismatched ADCs or channel imbalance are thereby accounted for and corrected.
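One possible implementation of the histogram-based auto-stretch is sketched below. The min_run parameter is an assumed heuristic for "smoothly changing"; the patent does not specify the exact test, and the function name is illustrative.

```python
import numpy as np

def auto_stretch(raw, min_run=3):
    """Map a 10-bit image to 8 bits per the Figure 18 scheme: scan the
    histogram inward from both ends, skip isolated populated bins
    (assumed to be defective pixels or noise), and stretch the span
    between the first smooth regions found to 0..255."""
    hist = np.bincount(raw.ravel(), minlength=1024)
    occupied = np.flatnonzero(hist > 0)

    # Lowest bin that begins a run of min_run consecutive populated bins
    # (point 1805); lone spikes such as dead-pixel bins are skipped.
    lo = next(i for i in occupied if np.all(hist[i:i + min_run] > 0))
    # Highest bin that ends such a run (point 1806).
    hi = next(i for i in occupied[::-1]
              if np.all(hist[max(i - min_run + 1, 0):i + 1] > 0))

    # Stretch lo..hi to the full 8-bit range; non-valid outliers clip.
    scaled = (raw.astype(float) - lo) * 255.0 / (hi - lo)
    return np.clip(np.round(scaled), 0, 255).astype(np.uint8)
```

Because each ADC channel is stretched to the same 0..255 span, applying this per channel before combining also equalizes mismatched ADCs, as the text describes.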
In another embodiment, as lighting decreases the sensor output may not reach saturation. Typically the total integration time (T) is selected such that the white plate background is just at saturation; however, the integration time (T) is also limited by the high-speed nature of the image. The mapping of the histogram points will ensure that the output covers the full 8-bit dynamic range even when the lighting is too low to reach saturation. The mapping is not restricted to a map of 10 bit to 8 bit. In another embodiment the mapping may be from a 10 bit ADC output to a 10 bit scaled output. Intermediate points may be calculated using, for example, nearest neighbor, linear, bilinear, cubic, bicubic or similar algorithms as are known in the art. Algebraically, the output of the sensor is given as follows:
I = min[(ΔVpixel + Vref - Vblk), Vsat] * 1023 / (Vwht - Vblk) (19)

= k1*L*T for 0 < L < Lknee, where Lknee = 512 / (k1*T) (20)

= min[k2*(L - Lknee)*Td-s + 512, 1023] for Lknee < L < Lsat, where Lsat = 512 / (k2*Td-s) + Lknee (21)

where:

ΔVpixel = license plate voltage change = k0*L*T
Vref = black level
Vwht and Vblk = A to D converter settings
Vsat = sensor saturation voltage
k1 = k0 * 1023 / (Vwht - Vblk) (for a 10 bit A to D output)
k2 = slope of the count versus luminance curve after the double slope reset

k0, and therefore k1 and k2, are calibration factors for the sensor. Ideally k1 = k2; however, in practice it has been found that the response does not necessarily follow this ideal, and therefore both parameters k1 and k2 are required for some camera systems.
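The piecewise count-versus-luminance response of equations (20) and (21) can be sketched as follows; the knee at 512 counts reflects the half-range allocation, and the function name is illustrative:

```python
def sensor_counts(L, k1, k2, T, Td_s):
    """10-bit digital output versus plate radiance L.

    Below the knee the slope is k1*T (equation 20); above it the slope
    is k2*Td_s, offset by the 512-count knee and clipped at full scale
    (equation 21).  L_knee = 512/(k1*T) marks the changeover."""
    L_knee = 512.0 / (k1 * T)
    if L < L_knee:
        return k1 * L * T
    return min(k2 * (L - L_knee) * Td_s + 512.0, 1023.0)
```

With k1 = 512, T = 1 (so L_knee = 1) and k2 = 100, a radiance of 0.5 yields 256 counts and a radiance of 1.5 yields 562 counts, showing the reduced slope above the knee.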
The control solution is calculated as follows:

T = (Vknee - Vref) / (k1*Lknee) (22)

Td-s = (Vsat - Vknee) / (k2*(Lsat - Lknee)) (23)

Vd-s = Vknee - (k2/k1)*(Td-s/T)*(Vknee - Vref) (24)

Or in terms of the A to D output:

T = 512 / (k1*Lknee) (25)

Td-s = 512*k1*T / (k2*(k1*Lsat*T - 512)) (26)

Vd-s = 512*(1 - (k2/k1)*(Td-s/T)) (27)
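The count-domain control solution of equations (26) and (27), with half of the 10-bit range allocated below the knee, might be computed as in this sketch; the function and variable names are illustrative:

```python
def control_parameters(L_knee, L_sat, k1, k2):
    """Solve for the double slope control parameters in A-to-D counts.

    L_knee, L_sat -- predicted shaded and sunlit white-plate radiances
    k1, k2        -- sensor calibration factors
    Returns (T, Td_s, V_ds): integration time, double slope reset time
    and the double slope reset level expressed in counts."""
    T = 512.0 / (k1 * L_knee)                                # eq. (25)
    Td_s = 512.0 * k1 * T / (k2 * (k1 * L_sat * T - 512.0))  # eq. (26)
    V_ds = 512.0 * (1.0 - (k2 / k1) * (Td_s / T))            # eq. (27)
    return T, Td_s, V_ds
```

As a consistency check, with k1 = k2 = 1, L_knee = 512 and L_sat = 1536 this gives T = 1, Td_s = 0.5 and V_ds = 256, so the sunlit limit L_sat lands exactly at full scale.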
The factor of 512 reflects the fact that one half of the dynamic range of the 10 bit A to D is allocated to the region of luminance below the knee, i.e. shaded license plates, and one half of the dynamic range is allocated to the region above the knee, i.e. radiance indicative of a non-shaded, brightly lit license plate. The allocation of the dynamic range is dependent upon the object to be imaged. The inventor has found that for license plates the allocation as provided enables character recognition across a wide range of environmental lighting conditions. In another embodiment the primary target of the image may be consistently shadowed and the dynamic range is allocated to provide more data on the low luminance portion of the image. In another embodiment the allocation of dynamic range may be weighted to the high luminance portion of the image. The control algorithm embodiment of the invention is further exemplified in Figure 19. The 8 bit output of the sensor is shown on the y-axis 1901 and the luminous exitance of the plate is shown on the x-axis 1902. The factors k1 and k2 are sensor specific calibration factors. During daytime hours estimates of the sun, sky and road irradiance, coupled with properties of the plate, produce estimates of the plate radiance. These estimates are used to select Lknee and Lsat. For radiance less than Lknee 1908 the 8 bit output is given by:
I = k1*T*L1 (28)
Where T is the integration time and L1 is the radiance of the plate in this range. For radiance above Lknee yet below saturation 1909, the 8 bit output is given by:
I = k2*Td-s*L2 + I(Vd-s) (29)

where Td-s is the double slope integration time, L2 is the radiance of the plate in this range of luminance, and I(Vd-s) is the selected 8 bit output at the double slope reset voltage. At saturation the luminance 1910 corresponds to a maximum 8 bit output of 255. The highest irradiance on the plate occurs under a partly cloudy sky where direct beam, re-reflected diffuse sky irradiance and road irradiance all illuminate the plate. The contrast control algorithm sets Lknee to the minimum expected white plate exitance caused by clear sky diffuse irradiance, reduced by a factor to account for nearby structures blocking the plate's view of the sky, and road reflection of the clear sky irradiance, with an accounting for shadows cast onto the road by the vehicle and nearby structures. The contrast control algorithm sets Lsat to the maximum expected white plate exitance caused by direct sun-beam and partly cloudy sky diffuse irradiance, again reduced by a factor to account for nearby structures blocking the plate's view of the sky, and the road reflection of the partly cloudy sky diffuse irradiance, with an accounting for shadows cast onto the road surface by nearby structures.
Figure 20 depicts the relation of the camera transfer function to the double slope parameters. The camera transfer function 2001 variables of light level at saturation (Lsat) 2003, light level at which the transfer function changes gain (Lknee) 2004 and voltage level of the transfer function at Lknee (Vknee) 2005 map to the three double slope parameters of integration time (T) 2006, double slope reset time (Td-s) 2007 and the effective double slope reset voltage (Vd-s) 2008, as shown in the graphical relation of voltage versus measurement time 2002. The formulas for T, Td-s and Vd-s of equations 22, 23 and 24 respectively apply. In the region below Lknee 2009 the slope of the voltage versus luminance curve is k1*T. In the region above Lknee 2010, the slope of the curve is k2*Td-s.
Because the contrast control embodiment is based upon an integration time, an intermediate reset time and an intermediate reset voltage, the algorithm must account for the timing requirements of data sampling and integration in the context of the camera processor. In one embodiment the calculated values for T and Td-s are adjusted for processing delays in the camera electronics.
Example
Figure 21 depicts an image of a typical license plate. The section of the plate image 2101 includes the alphanumeric text 2102 of a license plate number and the script text 2103 of the state identification. In order to capture the image of the plate a 14 foot field of view is required. For a sensor with 2048 pixels per line there will be 146 pixels per foot, so 3 pixels will cover the typical 1/4 inch stroke width of the alphanumeric plate number. The more demanding imaging problem is the state identification. Typically the state-stroke width 2104 is only 1/16" wide. For the exemplary field of view and camera resolution there is an 85% probability of the state-stroke filling 50% or more of a pixel. An additional challenge is that the difference in radiance between the state-stroke and the plate background is limited because of the non-black colors typically used for the state identification. In the example shown, red is used for the state identification. The state-stroke radiance can typically be as high as 1/3 of the diffuse plate background radiance. The automated software for identification typically requires a contrast to noise ratio (Δ/2σ) of > 5. For a typical noise of σ = 1.5 counts, this implies a need for a contrast, or difference in counts between the stroke and the background, of Δ > 15. The requirement is Δ > 15 between the full background pixels and the pixels 50% filled by the state-stroke over all lighting conditions including sun retro-reflection. The result for this exemplary application is depicted in Figure 22. To calculate the response, calibration factors for a typical sensor, the LUPA-4000 manufactured by Cypress Semiconductor, were used. The worst case difference between Lknee 2201 and Lsat 2202 is a radiance ratio of 10:1. Because one half of the dynamic range is allocated to this range, this provides a calibration for the pixel count Δ per unit radiance.
The worst case for contrast will occur for plates in shadows, and the worst case ratio for the red lettering of the state-stroke versus the background radiance is 3:1. Using calibration factors for this particular sensor and mapping the radiance values to the 8 bit output of the sensor 2205 results in an estimate of the difference between white background counts and a pixel of one half background and one half state-stroke of 14 counts. In other words this worst case is right at the boundary of the minimum required Δ of 15.
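The mixed-pixel arithmetic behind this worst case can be sketched as follows. The function and the example background count are illustrative, not the patent's calibrated sensor numbers.

```python
def stroke_contrast(bg_count, stroke_to_bg_ratio=1.0 / 3.0, fill=0.5):
    """Count difference between a full-background pixel and a pixel
    partially filled by a colored stroke.

    A pixel with fraction `fill` covered by a stroke at
    `stroke_to_bg_ratio` of the background radiance reads
    fill*ratio*bg + (1-fill)*bg counts, so the contrast is
    delta = bg * fill * (1 - ratio)."""
    mixed = fill * stroke_to_bg_ratio * bg_count + (1.0 - fill) * bg_count
    return bg_count - mixed
```

With the worst-case 3:1 ratio and a 50% fill, a shaded plate whose background reads about 42 counts gives Δ = 14, which is how the example lands right at the Δ > 15 boundary.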
Nighttime Control
In another embodiment, as the Lsat of Figure 22 approaches Lknee, a need for additional lighting is indicated. In one embodiment external artificial lighting is turned on when Lsat approaches Lknee.
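A minimal sketch of such a trigger, assuming the predicted radiances are available from the models above; the threshold value and names are assumptions, as the patent leaves the exact criterion unspecified:

```python
def lighting_needed(L_knee, L_sat, min_ratio=1.5):
    """Return True when the predicted sunlit-plate radiance L_sat no
    longer exceeds the shaded-plate radiance L_knee by a useful margin,
    indicating that external artificial lighting should be activated."""
    return L_sat < min_ratio * L_knee
```

A daytime prediction such as L_sat ten times L_knee leaves the trigger off; as dusk collapses the two predictions together, the trigger fires.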
In another embodiment the histogram of Figure 18 is used to trigger the need for external lighting. When the point 1805 approaches 1806 the lighting is not sufficient to provide the required contrast and an external lighting circuit is activated. In another embodiment an external sensor is used to indicate a nighttime or other inadequate lighting condition. The control parameters for the camera in a nighttime setting are dependent upon the lighting used and the arrangement of the camera with respect to the lighting and the license plate. The parameters T, Td-s and Vd-s are empirically determined. The integration time (T) is selected based upon the high speed imaging requirements for a non-blurred image. Td-s and Vd-s are calculated as described above to allocate the dynamic range of the camera. In a preferred embodiment the dynamic range is allocated equally to a high luminance region and to a low luminance region, and T, Td-s and Vd-s are not varied while artificial illumination is used. In another embodiment the gain of the camera is adjusted to enable a sufficiently short integration time (T).

Summary
A digital camera control system that requires no light sensors is described. In the control system, modeled external environmental geophysical solar parameters, geometric relationships between the object to be imaged and surrounding, potentially shadowing objects, and material properties of the object to be imaged, such as reflectivity, are combined to produce the estimated irradiance on the camera sensor for the particular time of day, date and geometric relationship between the object and the sun. The system enables optimized camera settings with no external or internal light sensors. The system therefore provides a method to rapidly determine optimum camera settings for any time of day and ensures the camera is always ready to capture a high contrast image of a fast moving transient object. The system is demonstrated for use in a license plate imaging application.
Those skilled in the art will appreciate that various adaptations and modifications of the preferred embodiments can be configured without departing from the scope and spirit of the invention. Therefore, it is to be understood that the invention may be practiced other than as specifically described herein, within the scope of the appended claims.

Claims

What is claimed is:
1. A camera control system comprising: a) a solar irradiance model with an output including solar irradiance, b) a shadowing model with an output including shadow locations, c) an object reflectance model with an output including object radiance, d) a camera imaging model with an output including camera response versus sensor irradiance, and, e) camera electronics comprising exposure control parameters, f) where the exposure control parameters are set prior to imaging an object based upon the outputs of the solar irradiance, shadowing, object reflectance and camera imaging models, and g) a digital image, having a dynamic range, acquired of an object by the camera electronics set to the exposure control parameters.
2. The camera control system of claim 1 where the exposure control parameters comprise a double slope integration capability including control parameters of integration time (T), an intermediate double slope reset time (Td-S) and an intermediate double slope reset voltage (Vd-S).
3. The camera control system of claim 1 further comprising a mapping algorithm for the digital image that filters out non-valid data points of the digital image at high luminance and that filters out non-valid data points of the digital image at low luminance from the digital image and scales the digital image.
4. The camera control system of claim 3 where the mapping algorithm maps a ten bit digital image to an eight bit digital image.
5. The camera control system of claim 2 where Td-S and Vd-S are selected to allocate a portion of the dynamic range of the image to capture images of objects under high luminance and to allocate a second portion of the dynamic range of the image to capture images of objects under low luminance.
6. The camera of claim 5 where one half of the dynamic range is allocated to capture images of objects illuminated by direct sunlight and one half of the dynamic range is allocated to capture images of objects in shadows.
7. The camera control system of claim 1 where the camera control system is used for an automated traffic imaging camera and the object is a license plate.
8. A camera comprising: a) a multi-element light sensing array, b) a video amplifier for amplifying signals produced by the array, c) electronics for setting exposure control parameters, d) an exposure control system, said exposure control system comprising: i) a solar irradiance model with an output including solar irradiance, ii) a shadowing model with an output including shadow locations, iii) an object reflectance model with an output including object radiance, iv) a camera imaging model with an output including camera response versus sensor irradiance, where the exposure control parameters are set based upon the outputs of the solar irradiance, shadowing, object reflectance and camera imaging models prior to acquiring a digital image of an object.
9. The camera of claim 8 where the exposure control parameters comprise a double slope integration capability including control parameters of integration time (T), an intermediate double slope reset time (Td-s) and an intermediate double slope reset voltage (Vd-S).
10. The camera of claim 8 further comprising a mapping algorithm for the digital image that filters out non-valid data points of the digital image at high luminance and that filters out non-valid data points of the digital image at low luminance from the digital image and scales the digital image.
11. The camera of claim 10 where the mapping algorithm maps a ten bit digital image to an eight bit digital image.
12. The camera of claim 9 where Td-S and Vd-S are selected to allocate a portion of the dynamic range of the image to capture images of objects under high luminance and to allocate a second portion of the dynamic range of the image to capture images of objects under low luminance.
13. The camera of claim 12 where one half of the dynamic range is allocated to capture images of objects illuminated by direct sunlight and one half of the dynamic range is allocated to capture images of objects in shadows.
14. The camera of claim 8 where the camera is an automated traffic imaging camera and the object is a license plate.
15. A method of controlling the exposure parameters of an electronic imaging camera comprising: a) a solar irradiance model with an output including solar irradiance, b) a shadowing model with an output including shadow locations, c) an object reflectance model with an output including object radiance, d) a camera imaging model with an output including camera response versus sensor irradiance, and, e) camera electronics comprising exposure control parameters, f) where the exposure control parameters are set prior to imaging an object based upon the outputs of the solar irradiance, shadowing, object reflectance and camera imaging models, and g) a digital image, having a dynamic range, is acquired of an object by the camera electronics set to the exposure control parameters.
16. The method of claim 15 where the exposure control parameters comprise a double slope integration capability including control parameters of integration time (T), an intermediate double slope reset time (Td-s) and an intermediate double slope reset voltage (Vd-S).
17. The method of claim 15 further comprising a mapping algorithm for the digital image that filters out non-valid data points of the digital image at high luminance and that filters out non-valid data points of the digital image at low luminance from the digital image and scales the digital image.
18. The method of claim 17 where the mapping algorithm maps a ten bit digital image to an eight bit digital image.
19. The method of claim 16 where Td-s and Vd-s are selected to allocate a portion of the dynamic range of the image to capture images of objects under high luminance and to allocate a second portion of the dynamic range of the image to capture images of objects under low luminance.
20. The method of claim 19 where one half of the dynamic range is allocated to capture images of objects illuminated by direct sunlight and one half of the dynamic range is allocated to capture images of objects in shadows.
PCT/US2008/068719 2008-06-30 2008-06-30 Digital camera control system WO2010002379A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2008/068719 WO2010002379A1 (en) 2008-06-30 2008-06-30 Digital camera control system


Publications (1)

Publication Number Publication Date
WO2010002379A1 true WO2010002379A1 (en) 2010-01-07

Family

ID=41466238

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/068719 WO2010002379A1 (en) 2008-06-30 2008-06-30 Digital camera control system

Country Status (1)

Country Link
WO (1) WO2010002379A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030174865A1 (en) * 2002-03-15 2003-09-18 Mark William Vernon Vehicle license plate imaging and reading system for day and night
US20050091013A1 (en) * 2003-10-27 2005-04-28 International Business Machines Corporation Incorporation of a phase map into fast model-based optical proximity correction simulation kernels to account for near and mid-range flare
US20060269105A1 (en) * 2005-05-24 2006-11-30 Langlinais Ashton L Methods, Apparatus and Products for Image Capture
US20070195183A1 (en) * 2003-08-05 2007-08-23 Ilia Ovsiannikov Method and circuit for determining the response curve knee point in active pixel image sensors with extended dynamic range


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2423899A1 (en) * 2009-04-20 2012-02-29 Imagsa Technologies S. A. Method for identifying reflecting objects subjected to variable lighting conditions and system for performing said method
EP2423899A4 (en) * 2009-04-20 2013-07-03 Imagsa Technologies S A Method for identifying reflecting objects subjected to variable lighting conditions and system for performing said method
FR2983295A1 (en) * 2011-11-29 2013-05-31 Renault Sa System for detecting dazzling of front camera on top of/behind windscreen of car to assist driver to e.g. detect presence of rain, has activation unit for activating dazzling determination unit when directions of vehicle and sun are aligned
RU2628916C2 (en) * 2015-10-14 2017-08-22 Общество с ограниченной ответственностью "АВТОДОРИЯ" (ООО "АВТОДОРИЯ") Method and system of controlling stationary camera
CN107509165A (en) * 2017-07-27 2017-12-22 中兴软创科技股份有限公司 A kind of method for being calculated based on big data, determining AP positions
WO2020079398A1 (en) * 2018-10-15 2020-04-23 Bae Systems Plc Reduction of the visual & audible signatures of the uav to minimise detection during long duration surveillance operations
WO2020079399A1 (en) * 2018-10-15 2020-04-23 Bae Systems Plc Aircraft
US11192647B2 (en) 2018-10-15 2021-12-07 Bae Systems Plc Visual and audible signature reduction of an unmanned aerial vehicle (UAV) to minimize detection during long duration surveillance operations
CN111818281A (en) * 2020-07-15 2020-10-23 北京集创北方科技股份有限公司 Image acquisition parameter adjusting method and device and computer readable storage medium
CN111818281B (en) * 2020-07-15 2022-07-22 北京集创北方科技股份有限公司 Image acquisition parameter adjusting method and device and computer readable storage medium

Similar Documents

Publication Publication Date Title
US9965696B2 (en) Digital camera control system
Narasimhan et al. All the images of an outdoor scene
US7899207B2 (en) Image-based visibility measurement
Chow et al. Intra-hour forecasting with a total sky imager at the UC San Diego solar energy testbed
ES2881617T3 (en) Dirt measuring device for photovoltaic arrays employing microscopic imaging
Kuhn et al. Shadow camera system for the generation of solar irradiance maps
JP4985394B2 (en) Image processing apparatus and method, program, and recording medium
WO2010002379A1 (en) Digital camera control system
US8077995B1 (en) Infrared camera systems and methods using environmental information
CN105872398A (en) Space camera self-adaption exposure method
CN102254315B (en) Atmospheric visibility observation method implemented by using double digital cameras
CN110120077A (en) A kind of in-orbit relative radiometric calibration method of area array cameras based on attitude of satellite adjustment
Buluswar et al. Color models for outdoor machine vision
KR101969841B1 (en) Whole-sky camera-based a cloud observation system using the precision illumination data
Schläpfer et al. Correction of shadowing in imaging spectroscopy data by quantification of the proportion of diffuse illumination
US11544918B2 (en) Vehicle to infrastructure system and method with long wave infrared capability
JP2009239501A (en) Photographing system and photographing method
Kurkela et al. Camera preparation and performance for 3D luminance mapping of road environments
CN111355896B (en) Method for acquiring automatic exposure parameters of all-day camera
KR101934345B1 (en) Field analysis system for improving recognition rate of car number reading at night living crime prevention
EP3662655B1 (en) Sky monitoring system
JP2016127312A (en) Imaging information processing unit and imaging information processing system
JP6901647B1 (en) Visibility estimation device, visibility estimation method, and recording medium
Alves et al. Optical engineering application of modeled photosynthetically active radiation (PAR) for high-speed digital camera dynamic range optimization
Buluswar Color-based models for outdoor machine vision

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08781155

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08781155

Country of ref document: EP

Kind code of ref document: A1