IL279275A - Device, systems and methods for scene image acquisition - Google Patents

Device, systems and methods for scene image acquisition

Info

Publication number
IL279275A
IL279275A IL27927520A
Authority
IL
Israel
Prior art keywords
imaging
specular
scene
sroi
orientation
Prior art date
Application number
IL279275A
Other languages
Hebrew (he)
Inventor
Leizerson Ilya
MAYEROWICZ Yaron
Original Assignee
Elbit Systems C4I And Cyber Ltd
Leizerson Ilya
MAYEROWICZ Yaron
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Elbit Systems C4I And Cyber Ltd, Leizerson Ilya, MAYEROWICZ Yaron filed Critical Elbit Systems C4I And Cyber Ltd
Priority to IL279275A priority Critical patent/IL279275A/en
Priority to PCT/IB2021/061255 priority patent/WO2022118253A1/en
Priority to EP21900195.5A priority patent/EP4256397A1/en
Publication of IL279275A publication Critical patent/IL279275A/en
Priority to US18/203,300 priority patent/US11836964B2/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/28Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising
    • G02B27/281Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising used for attenuating light intensity, e.g. comprising rotatable polarising elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/60Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/02Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the intensity of light
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/02Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the intensity of light
    • G02B26/023Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the intensity of light comprising movable attenuating elements, e.g. neutral density filters
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0018Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for preventing ghost images
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/28Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising
    • G02B27/288Filters employing polarising elements, e.g. Lyot or Solc filters
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/20Filters
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/30Polarising elements
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2206/00Systems for exchange of information between different pieces of apparatus, e.g. for exchanging trimming information, for photo finishing
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/18Control of exposure by setting shutters, diaphragms or filters, separately or conjointly in accordance with light-reducing "factor" of filter or other obturator used with or on the lens of the camera

Description

Attorney docket: P10862-IL

DEVICES, SYSTEMS AND METHODS FOR SCENE IMAGE ACQUISITION

[0001] The present disclosure relates in general to scene image acquisition.

BACKGROUND

[0002] The introduction of unwanted reflections into an image by reflective surfaces is a common problem encountered in many imaging applications. For example, when an at least partially transparent and reflective surface such as a glass window is positioned between an object to be imaged and an image acquisition device, light reflected from the surface towards the image acquisition device may render the object invisible or unrecognizable. Polarizing filters are often used to reduce or filter out unwanted specular reflections.

[0003] The description above is presented as a general overview of related art in this field and should not be construed as an admission that any of the information it contains constitutes prior art against the present patent application.
BRIEF DESCRIPTION OF THE FIGURES

[0004] The figures illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.

[0005] For simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity of presentation. Furthermore, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. References to previously presented elements are implied without necessarily further citing the drawing or description in which they appear. The figures are listed below.

[0006] FIGs. 1A-D schematically show images of elevation views of a platform positioned at different orientations in a scene and the specular reflections reflected from the platform.
[0007] FIGs. 2A-D schematically show top views of the platform in the different orientations shown in FIGs. 1A-D, respectively.

[0008] FIG. 3 is a schematic diagram of an imaging system for imaging a platform located in a scene, according to some embodiments.

[0009] FIGs. 4A-D schematically show images of elevation views of a platform positioned at different orientations in a scene and providing specular reflections which are reduced or filtered out by the imaging system, according to some embodiments.

[0010] FIGs. 5A-D schematically show top views of the platform positioned at the different orientations in the scene of FIGs. 4A-D, along with a camera capturing images of the platform, according to some embodiments.

[0011] FIGs. 6A-D schematically show images of elevation views of a platform positioned at a given orientation in a scene and providing specular reflections which are reduced or filtered out by the imaging system capturing images of the platform from different imaging directions, according to some embodiments.

[0012] FIGs. 7A-D schematically show top views of the platform positioned at the given orientation in the scene shown in FIGs. 6A-D, respectively, and a camera capturing images of the platform from different directions, according to some embodiments.

[0013] FIGs. 8A-B schematically show an image of a platform in a scene, the image displaying reduced or no specular reflections, according to some embodiments.

[0014] FIG. 9 schematically shows another image of a platform in a scene, the image displaying reduced or no specular reflections in a center region of the displayed image, according to some embodiments.

[0015] FIG. 10 schematically shows a top view of a platform in a scene and a camera traversing the scene to capture images of the platform as it travels along a road.

[0016] FIG. 11 schematically shows a stationary platform in a scene that is being imaged with a camera, according to some embodiments.
[0017] FIG. 12 schematically shows a top view of a platform in a scene and a plurality of cameras configured to capture images of the platform as it travels along a road, according to some embodiments.

[0018] FIG. 13A shows a frame multiplexing sequence diagram, according to some embodiments.

[0019] FIG. 13B shows a frame multiplexing sequence diagram, according to some other embodiments.

[0020] FIG. 14 is a flowchart of a method for imaging a scene, according to some embodiments.
DETAILED DESCRIPTION

[0021] Aspects of disclosed embodiments pertain to systems, devices and/or methods configured to acquire images of a scene comprising reflective specular surfaces and which are further configured to reduce or eliminate unwanted reflections (also: glare) from the reflective specular surfaces.

[0022] Some embodiments pertain to the reduction or elimination of unwanted specular reflections for imaging a portion of the scene behind a transparent and specular surface, for example, to allow characterizing the scene portion and/or characterizing one or more objects which may be located behind the transparent and reflective specular surface. Generally, specular light reflections may originate from natural and/or artificial light sources. Depending on the scene acquisition conditions, the same surface may or may not exhibit specular characteristics.

[0023] A scene portion that is behind a specular surface that is also transparent may herein be referred to as a scene's specular region-of-interest ("specular ROI" or "sROI").

[0024] In some embodiments, the sROI may be defined by the boundaries of the transparent and specular surface (e.g., window frame, windshield frame).
[0025] In some embodiments, a transparent and specular surface may cover the entire field-of-view (FOV) of an image acquisition device.

[0026] A scene portion that does not include the area behind a transparent and specular surface may herein be referred to as ex-sROI. In some examples, ex-sROI may or may not comprise specular surfaces. For example, ex-sROI may comprise opaque and specular surfaces such as, for example, a car's motor hood.

[0027] Any "transparent surface" is herein considered to be "at least partially transparent" and includes, for example, surfaces having 50%, 60%, 70% or substantially perfect transparency.

[0028] The term "transparent" may pertain to a variety of wavelengths including the visible spectrum and/or the near-infrared and/or infrared wavelength spectrum and/or other portions of the electromagnetic spectrum.

[0029] Although embodiments disclosed herein refer to reducing or avoiding the imaging of unwanted specular reflections reflected from transparent and specular surfaces, this should by no means be construed in a limiting manner.
Accordingly, the principles outlined herein with respect to the reduction or prevention of imaging specular light reflections reflected from transparent and specular surfaces are analogously applicable to the acquisition of images of opaque and specular surfaces.

[0030] In the discussion that follows, without being construed in a limiting manner, a reduction in imaged reflections may pertain mainly to specular reflections, which include linear polarization (e.g., s-polarization).

[0031] Examples of specular surfaces can include glass, water and/or metallic surfaces.

[0032] According to some embodiments, imaging a scene comprising a specular and, optionally, transparent surface which is positioned between an image acquisition device and an object can be performed without requiring polarization analysis of the light reflected from the surface for filtering out unwanted reflections by a polarization filter. For example, no on-site light polarization measurement may be required for reducing or filtering out specular reflections reflected from a specular surface. Accordingly, in some embodiments, devices and systems described herein are configured to reduce or prevent the imaging of unwanted reflections without user-noticeable latency or with near-zero latency, e.g., in real-time (RT) or near-RT.

[0033] Embodiments pertain to the gathering or acquiring of data descriptive of a specular sROI located behind transparent and specular surfaces and of ex-sROI, for example, for monitoring, controlling, and/or intelligence gathering purposes.

[0034] Acquiring specular sROI and ex-sROI data may be followed by data analysis for characterization of the specular sROI and ex-sROI. Monitoring, controlling and/or intelligence gathering may include, for example, characterizing objects which are located in the specular sROI and the ex-sROI. Characterizing an object may include, for example, object identification and/or classification.
In some examples, object characterization may be employed for identification and, optionally, authentication purposes, for example, using facial and/or other physical characteristics recognition (e.g., gait analysis) and comparison with corresponding physical characteristics of known individuals.

[0035] Aspects of devices, systems and methods described herein may thus be employed in the context of, for example, border control applications, perimeter surveillance, authentication applications for access control, remote in-vehicle or in-platform passenger recognition, counting the number of passengers in vehicles or platforms, measuring physical characteristics of passengers in vehicles (e.g., passenger height), and/or the like.

[0036] In some embodiments, an imaging system comprises one or more image acquisition devices or cameras that include imaging optics and an image sensor comprising a plurality of pixels generating signals related to the imaged scene. The imaging optics are configured to direct light received from the scene onto the image sensor.

[0037] The image acquisition device further includes an adjustable polarization filter that is positioned between at least one specular surface and the one or more image sensors for acquiring information about a scene.

[0038] For example, the adjustable polarization filter may be configured to enable acquiring information about the sROI which is behind the specular and transparent surface. The specular and transparent surface is thus positioned between the camera and the specular sROI.

[0039] In some embodiments, a plurality of image sensors may be employed along with a corresponding plurality of polarization filters.
In some examples, the plurality of image sensors and polarizers may be employed by the same image acquisition device or by separate image acquisition devices which are employed by the scene acquisition system.

[0040] The polarization filter orientation may be adjusted and set to a desired orientation based on scene imaging parameters including, for example, an image acquisition direction relative to a platform in a scene, to reduce or eliminate reflections from the specular surface. The desired polarizer orientation may remain fixed for a certain camera position and orientation.

[0041] In some examples, additional scene imaging parameters may be considered, which may include, for instance, polarization of light incident onto the image sensor; and environmental conditions (e.g., weather conditions, time of day, ambient light, and/or sun position) in relation to the geographic area comprising the scene being imaged and/or in which the camera is located. In some embodiments, the system may also be configured to determine, in addition to a desired polarizer orientation, a desired camera orientation in a scene so as to further reduce or prevent specular reflection from reaching an image sensor.

[0042] In the examples discussed herein, it may be assumed that a lateral displacement of an image acquisition device relative to a specular surface is significantly shorter than the distance between the device and the specular surface, such that a change in acquisition direction due to lateral displacement between the image acquisition device and the specular surface can be considered negligible. However, in cases where such a lateral shift is not significantly shorter than the distance between the image acquisition device and the specular surface, the lateral shift is considered to determine a corresponding change in the image acquisition direction.
For example, a lateral displacement (up/down and/or sideways) between the image acquisition device and the specular surface exceeding (increasing or decreasing) a high viewpoint angle threshold may be taken into consideration by the system for updating an initial polarizer orientation to obtain an updated desired polarizer orientation.
Correspondingly, in cases where a lateral displacement between the image acquisition device and the specular surface does not exceed the high viewpoint angle threshold, such lateral displacement is not considered by the system as an input parameter value for determining an updated desired polarizer orientation. In some examples, the high viewpoint angle threshold may be 5 degrees.

[0043] Based on the orientation between a camera's imaging optics and the specular surface, a characteristic (e.g., orientation) of the polarization filter is configured such that, for example, the sROI behind a specular and transparent surface becomes sufficiently visible to allow sROI characterization. In some examples, specular sROI characterization may include, for example, characterization of objects located behind the specular and transparent surface, for instance, for the purposes of object identification, classification and/or authentication. Analogously, in some embodiments, the polarizer may be configured to allow characterization of objects comprised in ex-sROI. In some embodiments, the system may be operable to automatically switch between different polarizer configurations when imaging the sROI and ex-sROI of the scene in a manner that reduces or prevents the effect on imaging of unwanted specular reflections from either one of the sROI and ex-sROI.

[0044] A specular sROI may be considered "sufficiently visible" if objects in the specular sROI are identifiable, for example, by the naked human eye and/or (e.g., automatically or semi-automatically) by an electronic analysis system, and/or the like. The electronic analysis system may for instance be operable to characterize image data descriptive of the specular sROI. In some embodiments, an image comprising the specular sROI behind the specular and transparent surface may be displayed to a user of the system.
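The viewpoint-angle threshold logic of paragraph [0042] can be sketched in code. This is an illustrative sketch, not part of the claimed system: the function names and the small-angle geometry are assumptions; only the 5-degree example threshold comes from the text above.

```python
import math

# Example "high viewpoint angle threshold" (5 degrees, per the text above).
VIEWPOINT_ANGLE_THRESHOLD_DEG = 5.0

def viewpoint_angle_deg(lateral_shift_m: float, distance_m: float) -> float:
    """Angle change in acquisition direction caused by a lateral displacement
    of the camera relative to the specular surface at the given distance."""
    return math.degrees(math.atan2(lateral_shift_m, distance_m))

def needs_polarizer_update(lateral_shift_m: float, distance_m: float) -> bool:
    """True when the lateral displacement exceeds the viewpoint angle
    threshold and the initial polarizer orientation should be updated."""
    angle = abs(viewpoint_angle_deg(lateral_shift_m, distance_m))
    return angle > VIEWPOINT_ANGLE_THRESHOLD_DEG
```

For instance, a 0.5 m lateral shift at a 50 m standoff subtends well under one degree and would be treated as negligible, whereas a 10 m shift at the same distance would trigger an update.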
In some embodiments, an output relating to characteristics of the specular sROI may be provided by the system. The output may include, for example, visual information, auditory information and/or tactile information. In some examples, information for characterizing the specular sROI may be provided in a non-visual manner.

[0045] In some examples, a specular surface (which may comprise opaque surface portions and transparent surface portions) may be part of a stationary and/or movable (e.g., mobile) platform. A specular and transparent surface of a platform may include, for example, a vehicle window such as the window of a passenger car, truck or bus; the window of a rail-based transport vehicle including trains, subways, trams, etc.; a cockpit window; an aircraft window; the window of a building such as the window of a residential building, an office building window, a shopping window; a glass door; a sentry box window; and/or the like. A specular and opaque surface may include metallic platform surfaces.

[0046] As mentioned above, a characteristic (e.g., orientation) of the polarization filter may be configured (e.g., controlled) based on an orientation of the specular surface relative to the image sensor. For instance, an imaging system may be configured to automatically analyze an orientation of a specular surface relative to the image sensor to determine a desired polarization filter orientation. At the desired polarization filter orientation, the amount of specular reflections incident onto the image sensor(s) may be reduced or prevented.
The polarization filter may be set to the desired orientation in a manual, automatic or semi-automatic manner.

[0047] It is noted that the terms "determining" and "deriving a relative orientation", as well as grammatical variations thereof, may herein also encompass the meaning of the term "estimating a relative orientation".

[0048] In some examples, controlling a polarization filter orientation may be based mainly or solely on the specular surface orientation relative to the image sensor.

[0049] In some embodiments, the orientation of a specular surface in a scene relative to a (e.g., world) reference frame and/or relative to a camera's imaging direction (also: camera-surface orientation) may be predictable and/or constant or substantially constant over a comparatively long period of time. Therefore, slight variations in orientation of a specular surface relative to the world reference frame and/or to the imaging direction may be considered negligible as long as or if these variations allow, for a desired polarization filter orientation, generating images of the scene behind the specular and transparent surface, for example, to enable specular sROI and ex-sROI characterization including, for instance, object characterization. Hence, in some examples, for a certain camera position in the world reference frame and imaging direction, camera-surface orientation values may be associated with (also: mapped to) corresponding desired polarizer orientation values. The camera-surface orientation values may be associated with the corresponding range of desired polarizer orientation values through a look-up-table and/or through a mathematical function.
In some examples, polarizer orientations may be predetermined with respect to predicted camera-surface orientation estimations.

[0050] In some embodiments, the orientation of a specular surface relative to a (e.g., world) reference frame and/or relative to an image sensor may be predictable and/or constant or substantially constant over a comparatively long period of time, considering natural and/or manmade physical characteristics of the geographical area in which the specular surface and image sensor are located.

[0051] Manmade physical characteristics may pertain, for example, to the position, orientation and/or configuration of buildings (e.g., window position, orientation and/or height above ground), artificial lighting configuration (e.g., streetlights), infrastructure including roadways and venues accessed by the roadways, parking lots, and/or the like.

[0052] Natural physical characteristics may pertain, for example, to terrain topography, vegetation, geological characteristics, and/or the like, in a geographic area.

[0053] The orientation of a specular surface within a scene may depend on such manmade and/or natural characteristics of the scene. In some further examples, an expected or instantaneous orientation of a specular surface in a scene may depend on manmade and/or natural characteristics, and an (e.g., expected or instantaneous) orientation of a specular surface relative to a selected imaging direction may depend on manmade and/or natural scene characteristics. Therefore, an (e.g., expected or instantaneous) orientation of a specular surface may be determined (e.g., estimated) based on manmade and/or natural scene characteristics.
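The look-up-table mapping described in paragraphs [0049]-[0050], from camera-surface orientation values to predetermined desired polarizer orientations, could be sketched as follows. The bin boundaries and polarizer angles below are purely illustrative assumptions, not values from this disclosure.

```python
# Hypothetical look-up-table: ranges of camera-surface yaw (degrees)
# mapped to predetermined desired polarizer orientations (degrees).
ORIENTATION_TO_POLARIZER_DEG = {
    (0.0, 30.0): 10.0,
    (30.0, 60.0): 35.0,
    (60.0, 90.0): 55.0,
}

def desired_polarizer_orientation(camera_surface_yaw_deg: float) -> float:
    """Return the predetermined polarizer orientation for the bin that
    contains the given camera-surface orientation value."""
    for (lo, hi), polarizer_deg in ORIENTATION_TO_POLARIZER_DEG.items():
        if lo <= camera_surface_yaw_deg < hi:
            return polarizer_deg
    raise ValueError("camera-surface orientation outside calibrated range")
```

A mathematical function (e.g., interpolation between calibrated points) could replace the discrete table, as the text notes.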
Hence, a polarization filter orientation may be configured based on knowledge of manmade and/or natural scene characteristics and further based on a selected imaging direction (pitch, yaw) of the acquisition device within the scene.

[0054] It is noted that although examples described herein pertain mainly to the imaging of outdoor scenes, these should not be construed in a limiting manner, and the same concepts may also be applied in indoor scene imaging applications. Such indoor scene imaging applications may include, for example, imaging a scene in a parking garage, or in underground and/or above-ground mass transportation facilities (e.g., subway stations, train stations, airport terminals, and/or the like).

[0055] In some embodiments, a platform or specular surface in a scene may be considered to have a basic orientation with respect to yaw and/or pitch relative to the scene's reference coordinates. The basic orientation may pertain to a nominal orientation as well as to deviations from the nominal orientation within a maximum deviation range. An initial desired polarizer configuration may be configured with respect to such nominal platform orientation.

[0056] As long as the platform orientation deviates within the maximum range, the deviation may be considered negligible and therefore not used as an input parameter for updating the desired polarizer orientation. However, if the deviation exceeds the maximum range, then the initial desired polarizer orientation is updated to obtain a new (also: updated) polarizer orientation.

[0057] In some embodiments, the system may be configured to determine a platform's orientation and update the polarizer configuration when the deviation exceeds the maximum deviation range. The system may be configured to perform these steps in real-time or substantially in real-time.
In some examples, a maximum deviation range may be defined by +/- 5 degrees from the nominal platform orientation.

[0058] In one example, the position and orientation of a closed office building window with respect to a world reference frame may remain constant. In a further example, with respect to the world reference frame, the orientation of platforms expected to traverse along a certain route may be predictable for any platform position along that route. In a yet further example, considering a parking space position and orientation with respect to the world reference frame, one of two possible orientations of a vehicle to be parked in that parking space is predictable.

[0059] Orientations of vehicles traveling along a road or parking in a basic orientation may slightly vary among different vehicles. However, as mentioned above, such variations may in some embodiments be considered negligible with respect to an image sensor positioned relative to the vehicle(s) for capturing or acquiring sufficient-quality images of persons located in the vehicle(s) for identification purposes. Therefore, a polarization filter may be set to an angular orientation allowing filtering out specular reflections for the purpose of identifying persons located in a vehicle, without requiring readjustment of the polarization filter due to the above-noted slight variations.

[0060] In some further embodiments, considering the characteristics of the geographical area, the relative specular surface and image sensor orientations may be predictable, yet change significantly within a comparatively short period of time. However, based on these predictable changes, polarization filter orientations can be controlled comparatively fast.
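The deviation-range behavior of paragraphs [0055]-[0057] amounts to a simple guard around the initial polarizer setting. In this sketch, the names and the update callback are hypothetical; the +/- 5 degree maximum deviation range is the example given in the text.

```python
MAX_DEVIATION_DEG = 5.0  # example maximum deviation range from the text

def polarizer_orientation(nominal_polarizer_deg: float,
                          platform_yaw_deg: float,
                          nominal_yaw_deg: float,
                          update_fn) -> float:
    """Keep the initial desired polarizer orientation while the platform's
    deviation from its nominal orientation stays within the maximum range;
    otherwise recompute an updated orientation via update_fn."""
    deviation = platform_yaw_deg - nominal_yaw_deg
    if abs(deviation) <= MAX_DEVIATION_DEG:
        return nominal_polarizer_deg  # deviation negligible, no update
    return update_fn(platform_yaw_deg)  # deviation exceeds range: update
```

In a real-time system, this check would run per frame or per tracked platform, with `update_fn` standing in for whatever orientation-to-polarizer mapping the system uses.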
For example, a polarization filter orientation can be adjusted (e.g., solely) based on a known route expected to be or being traversed by a vehicle and further based on the sensor's vehicle acquisition angle for each instantaneous vehicle position along said route.

[0061] Hence, based on knowledge of future or instantaneous platform orientations in a geographical area, the orientation of the platform's transparent and specular surface relative to an image sensor's FOV capturing the platform may be predictable or derivable.

[0062] In some embodiments, a sensor's scene imaging direction may be constant. In that case, a selected polarization filter orientation may remain constant in accordance with a present or expected specular surface orientation in the sensor's FOV. For example, a polarization filter orientation may be configured for a selected specular surface orientation relative to an image sensor. The polarization filter configuration may be retained for any specular surface subsequently entering the image sensor's frame while effectively reducing or preventing the effect of specular reflections. In one embodiment, the imaging system may be configured to acquire images of a plurality of vehicles entering, one after the other, the image sensor's FOV. For instance, the position and imaging direction of the image sensor may be stationary in the geographic area and the polarization filter configuration may be set to a correspondingly constant value to perform image acquisition of vehicles driving on a road, to obtain images of sufficient quality of persons located in the vehicles "passing by" the sensor's FOV.
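For a vehicle on a known route, the polarizer schedule described above could be precomputed per route position and interpolated at run time. The waypoints and polarizer angles in this sketch are illustrative assumptions only.

```python
import bisect

# Hypothetical precomputed schedule: distance along the route (meters)
# and the desired polarizer orientation (degrees) at each waypoint.
ROUTE_S_M = [0.0, 50.0, 100.0, 150.0]
POLARIZER_DEG = [12.0, 20.0, 33.0, 41.0]

def polarizer_for_position(s_m: float) -> float:
    """Linearly interpolate the precomputed polarizer schedule at the
    vehicle's instantaneous position along the route."""
    i = bisect.bisect_right(ROUTE_S_M, s_m) - 1
    i = max(0, min(i, len(ROUTE_S_M) - 2))  # clamp to valid segment
    t = (s_m - ROUTE_S_M[i]) / (ROUTE_S_M[i + 1] - ROUTE_S_M[i])
    return POLARIZER_DEG[i] + t * (POLARIZER_DEG[i + 1] - POLARIZER_DEG[i])
```

Because the schedule is precomputed from the known route geometry and the sensor's acquisition angles, the polarizer can be steered comparatively fast, consistent with paragraph [0060].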
[0063] In some embodiments, an imaging system may be configured to detect a (moving) platform, lock onto the platform and track it as an object of interest (OOI). Platform tracking may be implemented by a plurality of cameras that are positioned in different imaging directions and/or by at least one camera that is coupled with a (e.g., gimbaled) steering mechanism for controllably steering the camera, for example, in three degrees of freedom. The polarization filter configurations may be preset or predetermined for the various imaging directions of the at least one steerable camera and/or the plurality of cameras.

[0064] In some embodiments, the imaging system may be configured to determine platform (e.g., vehicle) characteristics. These platform characteristics may be considered for determining the orientation of a vehicle's specular surfaces, e.g., relative to the reference frame and/or the camera's imaging optics, for example, by comparing acquired platform characteristics with known platform characteristics, for instance, by using one or more known platform databases.

[0065] Platform characteristics may include a type of vehicle (e.g., passenger car, truck, bus); vehicle category (e.g., sportscar, SUV, family car); vehicle make (e.g., Mazda, Volkswagen, Porsche); and/or year of manufacturing. For example, the windshield orientation of a sportscar (e.g., Porsche 911) may differ from the windshield orientation of a Sports Utility Vehicle (SUV) such as that of a Range Rover. In some examples, a range of camera-vehicle orientation values which were determined, e.g., based on vehicle characteristics, may be associated with or mapped to a respective range of desired polarizer values, for example, through a look-up-table and/or a mathematical function.

[0066] In some embodiments, polarizer configurations may be set based on where an image of a platform is shown on the screen of a display device.
For example, a first polarization filter configuration may be selected when a platform's image is displayed on an upper left screen region, and a second polarization filter configuration, different from the first one, may be selected when the platform's image is displayed on a different screen region (e.g., a lower right screen region). Accordingly, in some embodiments, different screen display regions may be associated with different polarizer configurations. The different screen display regions may be partially overlapping.
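The look-up-table mapping of paragraph [0065] (and, analogously, the screen-region association of [0066]) amounts to a range lookup from determined orientation to desired polarizer value. A minimal sketch, with purely illustrative breakpoints and polarizer angles that are assumptions rather than disclosed values:

```python
import bisect

# Hypothetical LUT: camera-vehicle yaw ranges (degrees) -> polarizer angle (degrees).
ORIENTATION_BREAKPOINTS = [0.0, 15.0, 30.0, 45.0, 60.0, 90.0]
POLARIZER_ANGLES = [0.0, 20.0, 45.0, 70.0, 90.0]  # one value per interval

def polarizer_angle_for_orientation(yaw_deg: float) -> float:
    """Map a camera-vehicle orientation to a desired polarizer angle via the LUT."""
    yaw_deg = abs(yaw_deg) % 180.0
    if yaw_deg > 90.0:
        yaw_deg = 180.0 - yaw_deg  # fold by symmetry about broadside
    i = bisect.bisect_right(ORIENTATION_BREAKPOINTS, yaw_deg) - 1
    i = min(i, len(POLARIZER_ANGLES) - 1)
    return POLARIZER_ANGLES[i]
```

A smooth mathematical function (e.g., interpolation between the breakpoints) could replace the stepwise table where finer polarizer control is available.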
[0067] In some examples, reflected light characteristics such as polarization may be disregarded when configuring the polarization filter. In some further examples, environmental conditions such as time of day or weather conditions may be disregarded for determining the polarization filter orientation.

[0068] Referring now to FIGS. 1A-D and FIGS. 2A-D, various scene images 500A-D are shown in which images of a vehicle 600 having a vehicle longitudinal axis Vz are acquired from different imaging directions by a camera 800 having a camera optical axis Cz and comprising acquisition optics 820 and an image sensor 840 aligned with optical axis Cz.

[0069] In the examples shown in FIGS. 1A-D and 2A-D, the imaging direction of camera 800 with respect to a scene's world reference coordinate system (Wxyz) remains stationary while the vehicle's orientation (e.g., pitch and yaw) in the world reference coordinate system (Wxyz) changes relative to the camera's imaging direction. In other examples, vehicle 600 can be stationary relative to the world reference coordinate system while the camera's orientation (e.g., pitch and yaw) relative to vehicle 600 changes. In further examples, the orientation of both vehicle 600 and camera 800 may change in the world reference frame. Any of these examples and analogous scenarios are herein collectively referred to as imaging a scene comprising a specular surface from different directions by changing a relative orientation between the specular surface and a camera's imaging direction of the specular surface.

[0070] As schematically shown in FIGS. 1A-D, vehicle 600 comprises transparent and specular surfaces 610 (e.g., a vehicle windshield) which introduce specular reflections into the camera's FOV.
Depending on the orientation of specular surface 610 relative to camera 800, specular reflections increasingly obscure or hide visibility of the scene portion behind specular and transparent surface 610 and, therefore, of an object 700 (exemplified herein as a driver of the vehicle) positioned behind specular and transparent surface 610. For instance, at orientation θ1 shown in FIGS. 1A and 2A, almost no specular reflections are introduced into the FOV of camera 800, such that image data can be generated descriptive of object 700, which is, therefore, characterizable. However, at orientation θ4 shown in FIGS. 1D and 2D, specular reflections are introduced which render object 700 nearly to entirely invisible.

[0071] As noted above, to overcome the problem of introducing unwanted specular reflections into a camera's FOV, the orientation of the imaging optics of a camera relative to a specular surface may be determined. Based on the determined surface-imaging optics orientation, the angle or orientation of a polarization filter may be configured to reduce or eliminate the imaging of unwanted specular reflections by an image sensor.

[0072] Referring now to FIG. 3, an imaging system 1000 is described which is configured to reduce or eliminate the effect of unwanted specular reflections.

[0073] Imaging system 1000 comprises a camera 1100 that includes imaging optics 1120 configured to receive and optically guide light received from scene 500 onto an image sensor 1140 of camera 1100 to generate image data descriptive of the scene. Camera 1100 further includes a polarizer or polarization filter 1160 which can be set into a desired orientation so as to reduce or eliminate specular reflections received from scene 500.

[0074] The at least one camera 1100 may include one or more cameras for imaging in the visible or infrared spectrum, an array of video cameras, e.g., arranged for acquiring 360-degree video images of the scene, or multiple video cameras scattered in scene 500.
The one or more video cameras may be configurable such that parameters thereof, such as zooming, illumination, orientation, positioning, location and/or the like, can be adapted (e.g., adjusted, configured, and/or directed from afar) automatically, manually and/or semi-automatically.

[0075] In some examples, camera 1100 may comprise a plurality of imaging modules such as, for instance, a tele-lens assembly and a wide-FOV lens assembly having corresponding optical imaging axes. In some examples, the plurality of imaging modules may be incorporated in a smartphone device.

[0076] In some embodiments, a polarizer 1160 may be pre-installed in camera 1100. In some embodiments, camera 1100 may be retrofitted with polarizer 1160. In some embodiments, polarizer 1160 may be positioned in front of imaging optics 1120. In some other embodiments, polarizer 1160 may be positioned between image sensor 1140 and imaging optics 1120.

[0077] Imaging system 1000 may further include a processor 1200 and a memory 1300 which is configured to store data 1310 and algorithm code 1320. Processor 1200 may be configured to execute algorithm code 1320 for the processing of data 1310, resulting in the implementation of a scene acquisition control and analysis (SACA) engine 1400.

[0078] SACA engine 1400 may be configured to determine a present or expected orientation (also: camera-surface orientation) between a specular surface and camera 1100. SACA engine 1400 is further configured to determine, based on the determined camera-surface orientation, a desired polarizer orientation that reduces or eliminates the incidence of unwanted specular reflections onto the image sensor. The surface-camera orientation may, for example, be determined as outlined in more detail herein below.

[0079] The term "processor", as used herein, may additionally or alternatively refer to a controller.
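One way a SACA engine such as 1400 might derive a desired polarizer orientation from a camera-surface orientation: specular glare off glass is predominantly s-polarized, i.e., polarized perpendicular to the plane of incidence spanned by the surface normal and the viewing direction, so the polarizer's transmission axis can be rotated orthogonal to that direction as projected into the image plane. The sketch below assumes this physics-based rule and the stated vector conventions; it is not asserted to be the patented method.

```python
import numpy as np

def desired_polarizer_angle(surface_normal, view_dir, camera_up):
    """
    Sketch: compute a polarizer rotation angle (degrees, about the optical axis)
    that attenuates s-polarized specular glare from a surface with the given
    normal, as seen along view_dir with the given camera "up" vector.
    """
    n = np.asarray(surface_normal, float); n /= np.linalg.norm(n)
    v = np.asarray(view_dir, float); v /= np.linalg.norm(v)
    # s-polarization direction: perpendicular to the plane of incidence
    s = np.cross(v, n)
    s /= np.linalg.norm(s)
    # project camera "up" onto the image plane and build an in-plane basis
    up = np.asarray(camera_up, float)
    up -= v * np.dot(up, v); up /= np.linalg.norm(up)
    right = np.cross(v, up)
    # angle of the s-direction in the image plane, measured from "up"
    angle_s = np.degrees(np.arctan2(np.dot(s, right), np.dot(s, up)))
    # transmission axis orthogonal to s blocks the s-polarized glare
    return (angle_s + 90.0) % 180.0
```

For a camera looking along +x at a windshield tilted back in the vertical plane, the glare is horizontally polarized and the sketch returns a vertical (0°) transmission axis, as expected.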
Processor 1200 may be implemented by various types of processor devices and/or processor architectures including, for example, embedded processors, communication processors, graphics processing unit (GPU)-accelerated computing, soft-core processors and/or general-purpose processors.

[0080] Memory 1300 may be implemented by various types of memories, including transactional memory and/or long-term storage memory facilities, and may function as file storage, document storage, program storage, or as a working memory. The latter may, for example, be in the form of a static random access memory (SRAM), dynamic random access memory (DRAM), read-only memory (ROM), cache and/or flash memory. As working memory, memory 1300 may, for example, include temporally-based and/or non-temporally-based instructions. As long-term memory, memory 1300 may, for example, include a volatile or non-volatile computer storage medium, a hard disk drive, a solid state drive, a magnetic storage medium, a flash memory and/or other storage facility. A hardware memory facility may, for example, store a fixed information set (e.g., software code) including, but not limited to, a file, program, application, source code, object code, data, and/or the like.

Claims (28)

What is claimed is:
1. An imaging system configured to reduce or prevent the effect of unwanted specular reflections reflected by specular surfaces located in a scene, comprising: at least one image acquisition device for acquiring a scene image comprising a specular surface providing specular light reflections; the at least one image acquisition device comprising at least one first image sensor and imaging optics having an optical axis for guiding light received from the scene along the optical axis onto the at least one first image sensor; at least one polarization filter operably positioned between the at least one first image sensor and an object; a processor and a memory which are configured to perform the following: determining an orientation of the specular surface relative to the optical axis of the imaging optics; and determining, based on the orientation of the surface relative to the optical axis of the imaging optics, a polarization filter orientation such that an amount of specular light reflections incident onto the at least one first image sensor is reduced or eliminated.
2. The imaging system of claim 1, further comprising: automatically determining and controlling the polarization filter orientation such that the amount of specular light reflections incident onto the at least one first image sensor is reduced or eliminated.
3. The imaging system of claim 1 or claim 2, wherein an orientation of the specular surface relative to the at least one first imaging optics is determined based on: one or more characteristics of a geographic area in which the image sensor and the object are located; one or more characteristics of a platform comprising the specular surface; environmental conditions including ambient light conditions at the time of scene image acquisition; a time of day at which the scene image is acquired; or any combination of the aforesaid.
4. The imaging system of claim 3, wherein the one or more platform characteristics are determined based on the one or more acquired scene images comprising the specular surface.
5. The imaging system of claim 4, wherein the one or more platform characteristics includes a vehicle class and/or type.
6. The imaging system of any one of the preceding claims, wherein the controlling of a polarization filter orientation is further based on one of the following: a position and orientation of the specular surface in a reference frame; and a position and imaging direction of the at least one first image capturing device in the reference frame.
7. The imaging system of any one of the claims 1 to 6, wherein the polarization filter orientation is determined without performing on-site light polarization measurement of specular light reflections.
8. The imaging system of any one of the preceding claims, wherein the specular surface is also transparent, and the polarization filter orientation is set to a desired value for imaging a scene portion that is located behind the transparent and specular surface so as to allow characterization of the scene portion.
9. The imaging system of claim 8, wherein the transparent and specular surface defines a specular region-of-interest (sROI), and wherein at least one image acquisition parameter value used for imaging of the sROI differs from the at least one image acquisition parameter value employed for imaging scene portions outside the sROI.
10. The imaging system of claim 9, wherein the at least one image acquisition parameter value is determined based on one of the following: an amount of light reflected from the transparent and specular surface and incident onto the at least one first image sensor; an amount of ambient light incident onto an additional image sensor, or both.
11. The imaging system of claim 9 or claim 10, further configured to controllably allow overexposure of sensor pixels which are imaging scene portions that are outside the sROI to ensure sufficient photon accumulation from the sROI.
12. The imaging system of claim 9 or claim 10, further configured to controllably allow underexposure of sensor pixels subjected to light incident from the sROI, for the purpose of person identification, to avoid overexposure of sensor pixels used for imaging scene portions which are outside the sROI.
13. The imaging system of any one of the claims 9 to 12, further configured to control the at least one image acquisition parameter value to generate a sequence of multiplexed image frames that includes frames generated for imaging the sROI and frames generated for imaging scene portions outside the sROI.
14. The imaging system of any one of claims 9 to 13, wherein the at least one image acquisition parameter value refers to one of the following: gain, exposure time, multiplexing parameters, sROI size, ex-sROI size, or any combination of the aforesaid.
15. An imaging method for reducing or preventing the effect of unwanted specular reflections reflected by specular surfaces located in a scene, comprising: acquiring, by at least one image acquisition device having imaging optics with an optical axis, a scene image comprising a specular surface providing specular light reflections; determining an orientation of the specular surface relative to the optical axis of the imaging optics; and determining, based on the orientation of the specular surface relative to the optical axis of the imaging optics, a polarization filter orientation such that an amount of specular light reflections incident onto an at least one first image sensor is reduced or eliminated.
16. The method of claim 15, further comprising: automatically determining and controlling the polarization filter orientation such that the amount of specular light reflections incident onto the at least one first image sensor is reduced or eliminated.
17. The method of claim 15 or claim 16, wherein an orientation of the specular surface relative to the at least one first imaging optics is determined based on: one or more characteristics of a geographic area in which the image sensor and the object are located; an orientation of the platform or of the specular surface in the scene; environmental conditions including ambient light conditions at the time of scene image acquisition; a time of day at which a scene image is acquired; or any combination of the aforesaid.
18. The method of any one of the claims 15 to 17, wherein the platform characteristic is determined based on the one or more acquired scene images comprising the specular surface.
19. The method of claim 18, wherein the platform characteristic includes a vehicle class and/or type.
20. The method of any one of the claims 15 to 19, wherein the controlling of the polarization filter orientation is further based on one of the following: a position and orientation of the specular surface in a reference frame; and a position and imaging direction of the at least one first image capturing device in the reference frame.
21. The method of any one of the claims 15 to 20, wherein the polarization filter orientation is determined without performing on-site light polarization measurement of specular light reflections.
22. The method of any one of claims 15 to 21, wherein, when the specular surface is also transparent, the method further comprises setting the polarization filter orientation to a desired value for imaging a scene portion that is located behind the transparent and specular surface so as to allow characterization of the scene portion.
23. The method of claim 22, wherein the transparent and specular surface defines a specular region-of-interest (sROI); and wherein at least one image acquisition parameter value used for imaging the sROI of a scene differs from the at least one image acquisition parameter value employed for imaging scene portions outside the sROI.
24. The method of claim 23, wherein the at least one image acquisition parameter value is determined based on one of the following: an amount of light reflected from the transparent and specular surface and incident onto the at least one first image sensor; an amount of ambient light incident onto an additional image sensor, or both.
25. The method of claim 23 or claim 24, further comprising controllably allowing overexposure of sensor pixels which are imaging scene portions that are outside the sROI to ensure sufficient photon accumulation from the sROI.
26. The method of claim 23 or claim 24, further comprising controllably allowing underexposure of sensor pixels subjected to light incident from the sROI, for the purpose of person identification, to avoid overexposure of sensor pixels used for imaging scene portions which are outside the sROI.
27. The method of any one of the claims 23 to 26, further comprising controlling the at least one image acquisition parameter value to generate a sequence of multiplexed image frames that includes frames generated for imaging the sROI and frames generated for imaging scene portions outside the sROI.
28. The method of any one of claims 23 to 27, wherein the at least one image acquisition parameter includes one of the following: gain, exposure time, multiplexing parameters, sROI size, ex-sROI size, or any combination of the aforesaid.
IL279275A 2020-12-06 2020-12-06 Device, systems and methods for scene image acquisition IL279275A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
IL279275A IL279275A (en) 2020-12-06 2020-12-06 Device, systems and methods for scene image acquisition
PCT/IB2021/061255 WO2022118253A1 (en) 2020-12-06 2021-12-02 Devices, systems and methods for scene image acquisition
EP21900195.5A EP4256397A1 (en) 2020-12-06 2021-12-02 Devices, systems and methods for scene image acquisition
US18/203,300 US11836964B2 (en) 2020-12-06 2023-05-30 Devices, systems and methods for scene image acquisition


Publications (1)

Publication Number Publication Date
IL279275A true IL279275A (en) 2022-07-01

Family

ID=81853969

Family Applications (1)

Application Number Title Priority Date Filing Date
IL279275A IL279275A (en) 2020-12-06 2020-12-06 Device, systems and methods for scene image acquisition

Country Status (4)

Country Link
US (1) US11836964B2 (en)
EP (1) EP4256397A1 (en)
IL (1) IL279275A (en)
WO (1) WO2022118253A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100079618A1 (en) * 2006-05-29 2010-04-01 Panasonic Corporation Light source estimation device, light source estimation system, light source estimation method, device for super-resolution, and method for super-resolution
US20110050854A1 (en) * 2008-12-25 2011-03-03 Katsuhiro Kanamori Image processing device and pseudo-3d image creation device
US20150124148A1 (en) * 2013-11-07 2015-05-07 Bryce T. Osoinach Devices having automatically adjustable polarizers and related operating methods
US20190268521A1 (en) * 2018-02-26 2019-08-29 Motorola Mobility Llc Digital Image Capture with a Polarizer at Different Rotation Angles

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4257106A (en) 1979-05-24 1981-03-17 Norlin Industries, Inc. Method and apparatus for thermal imaging
JPH0534641A (en) * 1991-07-30 1993-02-12 Kimisato Kurihara Observing device and method
US5406938A (en) 1992-08-24 1995-04-18 Ethicon, Inc. Glare elimination device
US6714665B1 (en) * 1994-09-02 2004-03-30 Sarnoff Corporation Fully automated iris recognition system utilizing wide and narrow fields of view
JPH10145668A (en) 1996-11-14 1998-05-29 Nikon Corp Polar filter control mechanism
US6411749B2 (en) * 2000-05-11 2002-06-25 Micro-Optice, Inc. In-line fiber optic polarization combiner/divider
US20020125411A1 (en) 2001-03-08 2002-09-12 Christy Orrin D. Method and apparatus for reducing specular reflection from a scannable surface
US7561312B1 (en) 2004-10-04 2009-07-14 Google Inc. Systems and methods for glare removal using polarized filtering in document scanning
WO2007071290A1 (en) 2005-12-22 2007-06-28 Robert Bosch Gmbh Automatic polarizer for cctv applications
US7729607B2 (en) 2006-05-31 2010-06-01 Technologies4All, Inc. Camera glare reduction system and method
FR2927179B1 (en) * 2008-02-01 2010-11-26 Solystic SYSTEM FOR ACQUIRING IMAGES FOR IDENTIFICATION OF SIGNS ON POSTAL MAILINGS.
JP5774030B2 (en) 2010-02-25 2015-09-02 ヴォロテック・リミテッド Optical filters and processing algorithms with various polarization angles
US20110228115A1 (en) * 2010-03-16 2011-09-22 Microsoft Corporation Large Format Digital Camera
US9060110B2 (en) * 2011-10-07 2015-06-16 Canon Kabushiki Kaisha Image capture with tunable polarization and tunable spectral sensitivity
WO2013090843A1 (en) 2011-12-15 2013-06-20 Trygger Llc Rotatable light filter assembly for camera built in a smartphone
WO2014186620A1 (en) * 2013-05-15 2014-11-20 The Johns Hopkins University Eye tracking and gaze fixation detection systems, components and methods using polarized light
IL236114A (en) * 2014-12-07 2016-04-21 Yoav Grauer Object detection enhancement of reflection-based imaging unit
US10371519B1 (en) 2017-05-26 2019-08-06 Lockheed Martin Corporation Polarization optical compass
US11158072B2 (en) * 2018-02-02 2021-10-26 Dishcraft Robotics, Inc. Intelligent dishwashing systems and methods
US11513223B2 (en) * 2019-04-24 2022-11-29 Aeye, Inc. Ladar system and method with cross-receiver
US11036067B2 (en) * 2019-07-23 2021-06-15 Semiconductor Components Industries, Llc Image sensor packages with tunable polarization layers
EP4042366A4 (en) * 2019-10-07 2023-11-15 Boston Polarimetrics, Inc. Systems and methods for augmentation of sensor systems and imaging systems with polarization
US11131934B2 (en) * 2019-10-29 2021-09-28 Waymo Llc Non-telecentric light guide elements


Also Published As

Publication number Publication date
US20230306710A1 (en) 2023-09-28
WO2022118253A1 (en) 2022-06-09
US11836964B2 (en) 2023-12-05
EP4256397A1 (en) 2023-10-11
