WO2004064391A1 - Versatile camera for various visibility conditions - Google Patents

Versatile camera for various visibility conditions Download PDF

Info

Publication number
WO2004064391A1
WO2004064391A1 (PCT application PCT/IL2004/000038)
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
camera according
versatile camera
sensors
images
Prior art date
Application number
PCT/IL2004/000038
Other languages
French (fr)
Other versions
WO2004064391B1 (en)
Inventor
Hanan Shamir
Yossi Yaeli
Original Assignee
Elbit Systems Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Elbit Systems Ltd.
Publication of WO2004064391A1
Publication of WO2004064391B1
Priority to US11/182,302 (US7496293B2)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/30 Transforming light or analogous information into electric information
    • H04N 5/33 Transforming infrared radiation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

A versatile camera that operates in various visibility conditions, such as daylight and poor visibility conditions, the camera including at least two sensors that capture images of a scene and provide a digital representation of the captured images, each sensor having a particular operational wavelength, and an optical and routing module that receives incoming rays from the scene and routes the incoming rays toward the at least two sensors.

Description

VERSATILE CAMERA FOR VARIOUS VISIBILITY CONDITIONS
FIELD OF THE DISCLOSED TECHNIQUE
The disclosed technique relates to 24 hour remote vision in
general, and more particularly to methods and systems for improving the
remote vision by reproducing the field of view under various visibility
conditions. The disclosed technique is particularly applicable to real time
displays of a direct scene, such as on a helmet visor, vision goggles or
other eyepieces.
BACKGROUND OF THE DISCLOSED TECHNIQUE
Various visibility conditions typify many applications requiring
deployment of a camera for remote vision. Such a camera is often a video
camera that continuously provides images of a scene. The camera is
often required to operate around the clock - during day and night, and
under changing weather conditions. The camera may be stationary. In
many cases the camera is portable or carried by a vehicle, for terrestrial or
airborne tasks, and as such can be exposed to unpredictable or rapidly
changing visibility circumstances. A good example is a camera employed
for Head Mounted Display ("HMD"). HMD concerns a helmet or goggles
or monocle wearer, within or outside a vehicle, whether for military or
civilian purposes. HMD prevalently features image projection reflected from monocle or goggles lenses or a helmet visor to the eyes of the
wearer in registration with the direct scene as seen by the wearer through
the visor or the goggles. HMD users can include a pilot or another
airplane crew member, a vehicle operator (in space, air, sea, or land), an
arms operator, a foot soldier, and the like. For the sake of simplicity,
reference below shall be frequently made to the example of a helmet visor
and a pilot, whereby it is noted that the principles concerned are well
applicable to other devices and methods with analogous implications.
In airplane cockpits, Heads Up Display ("HUD") is giving way to
HMD. The image often includes a vision enhancement display, wherein
the field of view as seen by the user through the visor, namely a direct
scene, is combined with an image of the same view made to reflect from
the visor. The combination is conducted in registration, namely - the
projected image of the field of view converges, in real time, with that of the
actual direct scene view as seen by the user through the visor. The
projected image is captured by a camera, and is manipulated by image
processing means available onboard (often on the helmet). If necessary,
with the aid of positioning means, the system can calculate in real time the
user's head (such as the pilot's helmet) position and view orientation, and
provide the image compatible to the user's field of view. If the camera is
mounted on the helmet or attached to the user's head or on a suitable
headset or goggles, the burden of calculating the relative fields of view of
the user and the camera can be relieved or spared entirely, as the camera can be aligned to face the field of view of the user. The installation of a
camera on a helmet or another eyepiece support calls for miniaturization
of the camera in size and weight as much as possible, so as to eliminate
interference to the user. A compact camera has limited room and poses a
difficulty for containing space-consuming features, such as night vision
enhancement, spectral conversion and high quality broad spectrum
perception. The use of a single sensor for various wavebands, such as
visible spectrum (for daylight vision), NIR (for night vision), or infrared
(such as for thermal detection), imposes a difficult task for the sensor to
achieve. A single sensor cannot adapt simultaneously for optimal
detection of different wavelength ranges and/or a wide range of illumination
levels without limiting resolution and refresh rates. The use of separate
cameras, one for each waveband or illumination level, incurs the addition
of excess weight when several cameras are used simultaneously.
Alternatively, repeated switching between different cameras is
cumbersome, and increases the manufacturing and maintenance costs.
SUMMARY OF THE DISCLOSED TECHNIQUE
It is an object of the disclosed technique to provide a novel
method and camera for various visibility conditions, that allows for
concurrently containing at least two vision sensing features, such as
daylight vision, dim light or night vision enhancement, spectral conversion,
high quality broadband perception, and the like. Spectral conversion can
refer to capturing the view in non-visible bands, such as ultraviolet (UV),
near infrared (NIR) and thermal infrared (IR). IR is detected by the
Forward Looking Infra Red (FLIR) technique, either at the higher
frequencies (such as for active IR) or lower frequencies (such as for
thermal detection). Mere enhancement of direct scene daylight is
redundant, but can be beneficial for providing an indirect scene daylight
image, or accentuation of certain objects only, such as the pointing spots
of laser designators or the "coloring" of objects for identification purposes
(friend or foe, preplanned target, terrestrial navigational marks, and the
like). Dim light and night vision involve image intensification. FLIR vision
involves the conversion of detected IR wavelengths into a visible display.
The disclosed technique overcomes the disadvantages of the
prior art by providing a stationary, portable, handheld, or head mounted
camera for capturing and conveying a direct or indirect scene, while
routing the incoming light to at least two sensors, wherein each sensor is
optimally designed for a particular waveband. In accordance with the disclosed technique, there is thus provided a versatile camera for various
visibility conditions, having an optical and routing module and at least two
sensors. The optical and routing module serves to receive incoming rays
from the scene and route these rays to at least two sensors, respectively.
The at least two sensors serve to capture images of the scene and provide
a digital signal representation of the images, wherein each sensor has
a particular operational wavelength range or ranges. According to one
aspect of the disclosed technique, at least one of the at least two sensors
incorporates a poor visibility conditions sensor, such as a dim light
amplifier or an invisible light sensor for detecting invisible light
and its conversion to a visible representation. Preferably, at least another
one of the at least two sensors is a daylight sensor, thus providing
seamless capabilities for day and night or for good and poor visibility
conditions. The images retrieved by the sensors are forwarded to the
head mounted display, or initially processed and merged in registration.
According to the disclosed technique there is also provided a
method for providing images of a scene under various visibility conditions
for a display, by which incoming rays are received from the scene and
routed to at least two sensors. Each sensor has a particular operational
wavelength range. Preferably, at least one of the at least two sensors is a
poor visibility conditions sensor, providing for instance the amplification of
dim light, or conversion of invisible light to a visible representation.
Preferably, at least another one of the at least two sensors is a daylight sensor, thus providing seamless day and night (24 hour) capabilities. The
images are then forwarded to the display or merged in registration and
processed and the resultant image is then provided to the head mounted
display.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosed technique will be understood and appreciated
more fully from the following detailed description taken in conjunction with
the drawings in which:
Figure 1 is a schematic illustration of a camera constructed and
operative in accordance with one embodiment of the disclosed technique;
Figure 2 is a schematic illustration of a camera constructed and
operative in accordance with another embodiment of the disclosed
technique;
Figure 3 is a schematic illustration of a camera constructed and
operative in accordance with a further embodiment of the disclosed
technique;
Figure 4 is a schematic illustration of a camera constructed and
operative in accordance with yet another embodiment of the disclosed
technique;
Figure 5 is a schematic illustration of a camera constructed and
operative in accordance with yet a further embodiment of the disclosed
technique;
Figure 6 is a block diagram of a method for converting a scene
for a display according to another embodiment constructed and operative
in accordance with the disclosed technique; Figure 7 is a schematic illustration of spatial and temporal
filtering with respect to user line-of-sight, operative in accordance with an
embodiment of the disclosed technique; and
Figure 8 is an expanded view of the schematic illustration of
Figure 7.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Reference is now made to Figure 1, which is a schematic
illustration of a camera, generally referenced 100, constructed and
operative in accordance with an embodiment of the disclosed technique.
Camera 100 includes an optical and routing module 102 and "N" sensors
110, 112, and 114. N can be any natural number greater than one, as
camera 100 comprises at least two sensors. Optical and routing module
102 includes optical elements for capturing images and a router for routing
the captured images to different channels. The optical elements and the
router can be integral or separate elements. Optical and routing module
102 is coupled with sensors 110, 112, and 114. Incoming rays from a
scene are perceived by optical and routing module 102 and forwarded to
sensors 110, 112, and 114. The scene can be a direct scene - conforming
to the view of a real time spectator; or an indirect scene - which is not
viewed simultaneously by a direct spectator. The light detected by each of
sensors 110, 112, and 114 is converted to a digital signal representation of
the images of the captured scene, usually by pixel values of a raster
pattern. The image signal is fed downstream for further processing to a
display or to an intermediate image processor that merges the images in
registration into a single image. Each of sensors 110, 112, and 114 has a
particular operational wavelength range (or set of ranges) that suits for
detection in a particular operational domain. The range(s) of each of sensors 110, 112, and 114 can be distinct, but can also overlap, at least
partially, with spectral range(s) of other sensors thereof. Preferably, at
least one of sensors 110, 112, and 114 is a poor visibility conditions
sensor, i.e.: a sensor that facilitates perception of the viewed scenery
despite the poor visibility circumstances. A poor visibility conditions
sensor features, for example, the amplification of dim light, or the detection
and conversion of invisible light to a visible representation. It is noted that
both amplification and conversion capabilities often exist in a single
sensor. Further preferably, at least one of sensors 110, 112, and 114 is a
daylight sensor. By selecting at least one of sensors 110, 112, and 114 to
be a daylight sensor, and selecting at least another one of sensors 110,
112, and 114 to be a poor visibility conditions sensor, camera 100
acquires seamless day and night (24 hour) capabilities, or seamless good
visibility and poor visibility capabilities.
Optical & routing module 102 includes optical elements such as
lenses that provide for the capturing of the general view, focusing at a
range that matches the view as seen by the user (typically infinite range).
For example, a viewing angle of about 50 degrees focused at infinite range
satisfactorily covers the entire field of view of a pilot wearing an HMD device
on a helmet. The optical elements may be either shared among all of
sensors 110, 112 and 114, or associated separately with each sensor.
The calibration of camera 100 with the line-of-sight of the HMD wearer can
be predetermined by the helmet headset or display device manufacturer, but can also provide for personal calibration, either once or before each
operation, if necessary. A stationary or hand held camera further requires
calibration between the user's line-of-sight and the camera. Such a task is
achievable by position and orientation sensors coupled with the user's
head, along with suitable control and processing means that either
continuously direct the camera toward the user's line-of-sight with
adequate physical orientation means coupled with the camera, or retrieve
(by appropriate algorithms) the image view angle that corresponds to the
user's line-of-sight.
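By way of a non-limiting illustration of the latter step, the following Python sketch maps a measured head line-of-sight (azimuth and elevation relative to the camera boresight) to the pixel window of the camera image that corresponds to the user's viewing direction. The resolution, field-of-view, window size and function name are assumptions made for this sketch only and are not taken from the disclosure.

    # Hypothetical sketch: map the head line-of-sight (azimuth/elevation in
    # degrees, relative to the camera boresight) to the pixel window of the
    # captured image that corresponds to that viewing direction.
    def los_to_window(azimuth_deg, elevation_deg,
                      image_width=1280, image_height=1024,
                      cam_hfov_deg=50.0, cam_vfov_deg=40.0,
                      window_width=640, window_height=512):
        # Pixels per degree under a simple linear (small-angle) approximation.
        px_per_deg_x = image_width / cam_hfov_deg
        px_per_deg_y = image_height / cam_vfov_deg
        # Window centre, offset from the image centre by the line-of-sight angles.
        cx = image_width / 2.0 + azimuth_deg * px_per_deg_x
        cy = image_height / 2.0 - elevation_deg * px_per_deg_y
        # Clamp so the window stays inside the captured image.
        x0 = int(min(max(cx - window_width / 2.0, 0), image_width - window_width))
        y0 = int(min(max(cy - window_height / 2.0, 0), image_height - window_height))
        return x0, y0, window_width, window_height

    # Example: the user looks 5 degrees to the right and 2 degrees up.
    print(los_to_window(5.0, 2.0))
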
The user's line-of-sight detection can be employed for spatial
and temporal stabilization of camera 100 or of the image output of camera
100. A user's line-of-sight detector can refer to a fairly accurate head line-
of-sight reader or to a more precise eye line-of-sight tracker. Spatial and
temporal filtering at the pixel level is subsequently conducted with
reference to the readings of the user's line-of-sight detector.
Optical and routing module 102 routes the incoming rays toward
sensors 110, 112, and 114, respectively. The router can be installed
along the path of the incoming rays downstream of the optical elements
and upstream of sensors 110, 112, and 114. The router can form an
integral part of the optical elements in optical and routing module 102 or it
can be installed within the optical elements of optical and routing module
102. An example for the latter includes disposing diffractive lenses among
the optical elements of optical and routing module 102, or the disposing of integrated parallel optical routes along the optical path. The operation of
the router may be based on aperture division (including different
apertures) or wavefront division (based on wavelength or a fixed
percentage), or a combination thereof. The router can also be disposed
upstream of the optical elements and produce different light beams that
pass through optical and routing module 102 in different paths toward their
corresponding different sensors. Optional elements that can function as a
beam splitter element can rely for instance on intensity allocation or
wavelength "segregation". Such beam splitters include, for example, a
slanted semi-transparent partially reflecting mirror, a prism, a pellicle, and
a spectral splitter, wherein each wave band is reflected or refracted in a
different direction toward a compatible sensor. Other examples of beam
splitters include lenses, diffractive elements, micro machining (mechanically
deflecting plates - MEMS/MOEMS), bifocal optics (such as two parallel
optical barrels), multiple path optics, and the like. The splitter can
comprise a small-portion/large-portion splitter, wherein a small-portion of
the light intensity is directed toward a daylight sensor and a large-portion
of the light intensity is directed toward a poor visibility conditions sensor.
Such a splitter can include, for example, a "10%-90%" prism, wherein 10%
of the light intensity is reflected to a daylight sensor and 90% is refracted
toward a night vision sensor. Daylight is a naturally strong light, and thus
10% can suffice for sensor detection while most of the weak night light is
required for detection. An alternative splitter can include a 10%-90% pellicle (such as a thin membrane stretched on a ring), wherein 90% of the
light intensity is reflected to a night vision sensor and 10% is refracted
toward a daylight sensor.
For various visibility conditions, sensors 120, 122, or 124, may
include a visible light sensor, a night vision enhancement sensor, a
forward looking infra-red (FLIR) sensor, and the like.
Suitable FLIR vision sensors may include an InGaAs (Indium
Gallium Arsenide) based sensor for the short wave infrared range; an InSb
(Indium Antimonide) based sensor for the mid wave infrared range; or a
non-refrigerated VOx (Vanadium Oxide) micro bolometer, a GaAs (Gallium
Arsenide) based sensor, or a QWIP (Quantum Well Infrared
Photodetector) for the long wave infrared range. In the context of an
aircraft, FLIR vision may not be functional with optical elements that are
opaque to thermal detection frequencies applicable for FLIR, namely 3 to
5 microns (3000-5000nm). Materials such as ZnS (Wurtzite, Zinc Blende
or Sphalerite, also known as "Cleartran" or "cleartrans") with broadband
permeability including the FLIR operational range, can be employed in this
context for the camera optical elements in the optical and routing module
102. With common cockpit canopies being substantially opaque to the
FLIR operational ranges, the FLIR-embodying camera can be mounted
externally on the airplane body.
A visible light sensor is either a black and white or color sensor,
preferably an Active Pixel System (APS) operational, for instance, in the 400 to 680nm (or the narrower 450 to 650nm) visible band. The APS can
be of a "rolling row" type or a "global shutter" type. The rolling row APS
scans the pixels row after row in a raster and incurs a non-uniform
integration signal. The global shutter APS provides for exposure of the whole
pixel array at the same "instant", during a short integration time period.
According to a particular aspect of the disclosed technique, at
least one sensor is operational at the non-visible, 1064nm IR frequency,
which corresponds to a prevalent laser designation frequency used in
LIDAR (Light Detection And Ranging) applications. In some cases
a visible light sensor can also include this IR vision capability. The
detection of the 1064nm frequency provides the user with an indication of the
physical spot on the laser-designated objects in real time. Such detection
can be conducted during the day as well as the night. The 1064nm
frequency is not detected by regular daytime and nighttime sensors. Even if
the 1064nm radiation is concealed by the strong daylight radiation during the
day, the daylight sensor sensitive to it can remain operative during the night,
in parallel to a night vision sensor, and provide additional image information.
This frequency, as well
as other frequencies, can be employed for emphasizing laser-designated
objects, identification of modulated or encrypted transmission by a laser,
and the like.
If a laser or another active source is employed to shed light at
this frequency by covering or scanning a large area, the additional
detected frequency can add a contrast effect to the overall image detected by the other "regular" frequencies. In the military context, if detection of
this frequency is beyond the range of day and IR detectors, an active
source is not likely to be detected by a foe. The APS for that purpose can
also include a "high pass" IR sensor, operational at wavelengths above
950nm. Alternatively, an APS for that purpose can include a broadband
sensor operational from 400nm to 1100nm, covering daylight, while a filter
that blocks the undesired wavelengths can be implemented in conjunction
with the optical and routing module 102.
Camera 100 can be operative to apply to both eyes of the user,
wherein the image output of camera 100 is divided for its separate
application to each eye. In the example of HMD with a helmet, preferably,
only one camera 100 is mounted on the helmet, to minimize weight and costs.
The division can take place with a suitable processor (such as control and
processing unit 250 of Figure 2) or in further elements on the helmet
display features or on board. Alternatively, camera 100 is operative to
apply to a single eye of the user. Further alternatively, a second camera
similar to camera 100 is operative to apply to the other eye of the user. It
may be desirable to provide a separate camera for each eye of the user,
such as when stereoscopic vision is of substantial significance. Camera 100 can be stationary, hand held, or mounted on,
integral with, added on, or attachable to a device worn by the user, such
as a head mounted display (HMD), a helmet, a headset, goggles,
eyepiece, binoculars and a monocle. Camera 100 can be designed for
use in an air, space, sea, or land environment, onboard a vehicle or for
portable use by an individual outside a vehicle.
Reference is now made to Figure 2, which is a schematic
illustration of a camera, generally referenced 200, constructed and
operative in accordance with another embodiment of the disclosed
technique. The embodiment shown in Figure 2 has similarities to the one
shown in Figure 1, with like parts designated by like numerals except for
the use of a prefix 200 instead of 100, and their functioning is analogous
and thus not elaborated. Camera 200 includes optical and routing module
202, sensors 210, 212, and 214, control and processing unit 250, and
display module 260.
Optical and routing module 202 includes optical elements 204
and router 206. In this example, light is initially encountered by optical
elements 204. Subsequently, the light beam is forwarded to router 206.
Router 206 selectively routes the light beam on a frequency domain basis
toward the relevant sensors 210, 212, and 214.
A dim light or night vision enhancement sensor usually requires
light enhancement for intensifying the weak dim or night light. A sensor for
non-daylight vision in a military aircraft cockpit is preferably operational at the 650nm (or 680nm) to 950nm band (type A, B or C NVIS per
MIL-STD-3009). The band is selected according to the display
requirements, such as whether the display is colored or monochrome.
This selected band also evades dim lights in the cockpit (400-650nm) that
can impair perception of the ambient nighttime light.
One device that provides such enhancement is an image
intensifier (I²), such as image intensifier 234, coupled with a nightlight
sensor element 224. Image intensifier 234 can be of the type employing a
photo-cathode for converting photons into electrons, an MCP ("Micro
Channel Plate") for multiplication of the electron flux, and a phosphor
screen in which the electrons produce photons. Sensor element 224 is
preferably an electronic video sensor. Sensor element 224 includes a
CMOS imager that samples the screen to provide an electric signal
representation of the pixel intensity. Control of the electric field strength
provided by the voltage difference between the plates determines the rate
of intensification. For example, a 100V to 1000V difference can provide for
up to 30,000 fold multiplication of electron flux. Image intensifier 234 can
be "gated", i.e. exposed to detection of incoming photons for limited
periods. A typical exposure can be for example 60 exposures per second
with 100nsec-30msec duration for each. A High Voltage Power Supply
tube (HVPS) that encompasses a cylindrical or barrel-shaped image
intensifier can provide for the voltage determining the intensification
provided by the MCP and the duration of exposure in a gated intensifier. Control of the gating can provide for the protection of a night vision
intensifier against damage caused by penetration of excess light into the
sensitive intensifier. The control can be automated by means of a suitable
light range or intensity detector. The image intensifier can also be
protected by means of an optical on/off iris shutter, which may be part of
router 206.
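By way of a non-limiting illustration of such automated control, the following Python sketch derives a gate duration within the 100nsec-30msec range mentioned above from a normalized light-level reading (0.0 darkest, 1.0 brightest) and closes the on/off iris when the level is excessive. The linear mapping, the iris threshold and the names are assumptions made for this sketch only.

    # Hypothetical gating controller for a gated image intensifier.
    GATE_MIN_S = 100e-9       # 100 nanoseconds, shortest gate
    GATE_MAX_S = 30e-3        # 30 milliseconds, longest gate
    IRIS_CLOSE_LEVEL = 0.9    # assumed level above which the iris shutter closes

    def gating_command(light_level):
        """Return (gate_duration_s, iris_open) for one exposure cycle."""
        light_level = min(max(light_level, 0.0), 1.0)
        if light_level >= IRIS_CLOSE_LEVEL:
            # Excess light: protect the intensifier by closing the on/off iris.
            return 0.0, False
        # Darker scenes get longer gates, brighter scenes get shorter gates.
        gate = GATE_MAX_S - light_level * (GATE_MAX_S - GATE_MIN_S)
        return gate, True

    # Example: a dim night scene and a bright daylight scene.
    print(gating_command(0.05))   # long gate, iris open
    print(gating_command(0.95))   # (0.0, False): iris closed for protection
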
An APS for the day or night sensors is preferably compatible with a
standard format such as VGA, SVGA, XGA, QXGA, UXGA, SXGA, and
HDTV, to suit standard display features.
An alternative enhancement for dim light or night vision sensor
can be provided by an Electron Bombardment Active Pixel Sensor
(EBAPS) such as EBAPS 230. EBAPS includes technology similar to the
one described above in reference to image intensifier 234, except for the
electron flow, which is not converted into photon radiation in a phosphor
screen, but rather forwarded directly to, for instance, a CMOS coupled
therewith to provide an electric signal representation of the pixel intensity.
Analogously, EBAPS is preferably gated. A High Voltage Power Supply
(HVPS) can provide for the controlled powering of the EBAPS. It is noted
that EBAPS occupies substantially smaller space than that consumed by a
sensor coupled with an image intensifier.
In the above context, router 206 can operate on the basis of
intensity (rather than frequency), splitting the incoming light beam into two
or more paths. Router 206 can be implemented using a 10%-90% prism, wherein 10% of the light intensity is reflected to a daylight sensor and 90%
is refracted toward a night vision sensor. An alternative adequate option is
a 10%-90% pellicle, wherein 90% of the light intensity is transferred (or
reflected) to a night vision sensor and 10% is reflected (or refracted)
toward a daylight sensor.
Other routing methods operate on a frequency basis. Such
routers can include a VIS-NIR separator, splitting between the Visual
Spectrum, which is routed to the daylight sensor, and the Near Infra-Red
spectrum, which is routed to the nightlight sensor. In the context of
passing the 1064nm band together with daylight to the same sensor, a
notch filter for the 1064nm frequency can be added to router 206, for
limiting the NIR spectrum from reaching the daylight sensor.
A further routing method operates under the time domain. Such
routers can include a switching mirror (such as of MEMS type), which
alternately routes the light beam toward two or more sensors. In the
above context, router 206 can be operated so that, for example, 10% of
the time period the light beam is routed to the day sensor, and 90% of the
time period the light beam is routed to the night sensor.
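For the time-domain option, a trivial Python sketch of the per-frame dwell-time allocation could look as follows; the 60Hz frame period and the names are assumptions of this sketch, not values from the disclosure.

    # Hypothetical dwell-time schedule for a switching-mirror (time domain) router.
    FRAME_PERIOD_S = 1.0 / 60.0   # assumed frame period

    def routing_schedule(day_fraction=0.10):
        """Return the dwell time, in seconds per frame, for each sensor."""
        day_dwell = day_fraction * FRAME_PERIOD_S
        night_dwell = (1.0 - day_fraction) * FRAME_PERIOD_S
        return [("day_sensor", day_dwell), ("night_sensor", night_dwell)]

    # About 1.7 ms per frame to the day sensor and 15 ms to the night sensor.
    print(routing_schedule())
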
Camera 200 also includes control and processing unit 250
coupled with sensors 210, 212, and 214, and a display unit 260 coupled
with control and processing unit 250. Control and processing unit 250
receives the digital pixel images provided by sensors 210, 212, and 214,
and includes an image processor that merges or fuses the images in registration and provides them downstream to display module 260 or to an
external display. Without control and processing unit 250, the provision of
the signal information directly from the sensors 210, 212, and 214 to an
external display, in an HMD or elsewhere onboard (if on a vehicle), requires
heavy wiring. These images are typically pixel-based video images of the
same scenery, where preferably each frame is synchronized to equivalent
frames in the different sensors. Control and processing unit 250 functions
as a real time combiner that combines into a single image the images
provided by sensors 210, 212, and 214. Control and processing
unit 250 preferably also controls the various elements of camera 200 such
as activating a desired sensor or controlling a controllable splitter. In the
context of airborne cameras, the output of cameras 100 and 200 can be
recorded for after flight interrogation purposes at a memory (not shown)
mounted in camera 100, or externally on a helmet, headset, goggles, or it
can be forwarded to a memory elsewhere onboard.
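A minimal Python sketch of the frame synchronization step, assuming each sensor delivers timestamped frames, could look as follows; the data layout, the skew tolerance and the names are assumptions of this sketch.

    # Hypothetical synchronization of "equivalent frames" from several sensors.
    def synchronize(reference_frames, other_streams, max_skew_s=0.008):
        """reference_frames: list of (timestamp_s, frame); other_streams: a list
        of such lists, one per additional sensor. Yields tuples of frames that
        are close enough in time to be merged in registration."""
        for t_ref, f_ref in reference_frames:
            group = [f_ref]
            for stream in other_streams:
                # The temporally nearest frame from this sensor.
                t_near, f_near = min(stream, key=lambda tf: abs(tf[0] - t_ref))
                if abs(t_near - t_ref) <= max_skew_s:
                    group.append(f_near)
            if len(group) == 1 + len(other_streams):
                yield tuple(group)

    # Example with two sensors running at roughly 60 Hz.
    day = [(i / 60.0, "day%d" % i) for i in range(3)]
    night = [(i / 60.0 + 0.002, "night%d" % i) for i in range(3)]
    for frames in synchronize(day, [night]):
        print(frames)
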
With reference to Figure 3 there is shown a schematic
illustration of a camera, generally referenced 300, constructed and
operative in accordance with a further embodiment of the disclosed
technique. Camera 300 includes a housing 302, an optical module 304, a
router 306, a daylight sensor APS 310, a nightlight image intensifier 312, a
nightlight sensor APS 314, a control and processing unit 320, and a power
supply 322. Router 306 includes a splitter element 330 represented by a
slanted prism or mirror 330, in the context of an exemplary implementation of router 306. Image intensifier 312 includes a high voltage power supply
HVPS 332 and an on/off iris shutter 334. Optical module 304 directs
incoming rays, represented by rays 340 to router 306. Router 306
conveys some of rays 340 toward image intensifier 312, and deflects some
of rays 340 toward APS 310, as represented by rays 342. The allocation
of the routed rays can be based on intensity such as with a 10%-90%
prism 330 or a bifocal optical module. The routed rays can also be
allocated based on frequency differentiation, such as with a VIS-NIR
separator. Yet alternatively, the rays can be routed based on time
differentiation, such as with a switching mirror. A notch filter for the
1064nm frequency or a high pass filter for wavelengths above 1 micron
can be installed in router 306 for the rays directed to APS 310, which is
sensitive to the 1064nm frequency. Image intensifier 312 intensifies rays
340 for detection by APS 314. This is an I²APS configuration. On/off iris
shutter 334 is disposed before intensifier 312 for controlling penetration of
excess daylight into intensifier 312. HVPS tube 332 surrounds the
cylindrical intensifier 312. Alternatively, HVPS tube 332 may be spatially
separated from intensifier 312. The cross sectional segments of HVPS
tube 332 above and below intensifier 312 are illustrated. HVPS 332
controls and supplies power for intensifier 312 and iris shutter 334. In a
daylight environment, only APS 310 is functional, for both the visible range
and for the 1064nm frequency. In a nightlight environment, APS 314 is
functional for the nightlight and APS 310 is functional for the 1064nm frequency only. Thus, camera 300 functions as a 24 hour camera, with
automatic seamless sensing capabilities adaptive to changing light
conditions.
APS 310 and APS 314 are coupled with control and processing
unit 320 via flexible printed circuitry. APS 310 and APS 314 provide the
images retrieved as a signal representation to control and processing unit
320. Control and processing unit 320 merges the retrieved images in
registration and processes them to provide the resultant image to a
display. Control and processing unit 320 includes an interface and control
card that provides the interface for receiving control information and
delivering the output signal for the display. Power supply 322, preferably
in the form of a power supply card, provides the power to the various
elements of camera 300, directly or through control and processing unit
320. Control and processing unit 320 is coupled with user line-of-sight
detector 350. Detector 350 provides real time information regarding the
fairly accurate head line-of-sight or the more precise eye line-of-sight.
Accordingly, user line-of-sight detector 350 includes a head line-of-sight
reader or an eye line-of-sight tracker. Spatial and temporal filtering of
noise at the pixel level is subsequently conducted by optimizing the image
quality so that it correlates, with minimum noise, with the readings of line-of-sight
detector 350.
Referring now to Figure 4, there is shown a schematic illustration
of a camera, generally referenced 400, constructed and operative in accordance with yet another embodiment of the disclosed technique. The
embodiment shown in Figure 4 is similar to the one shown in Figure 3, with
like parts designated by like numerals except for the use of a prefix 400
instead of 300, and their functioning is analogous and thus not elaborated.
Optical and routing module 404 includes a router combined or integral with
an optical module. Optical and routing module 404 functions as an
equivalent to both optical module 304 and router 306 in Figure 3. EBAPS
412 functions as an equivalent to both intensifier 312 and nightlight APS
314 (an equivalent to iris shutter 334 is redundant), and is powered by
HVPS 432. Camera 400 further includes a display 452 coupled with
control and processing unit 402. Control and processing unit 402 provides its
output signal to display 452, for displaying the images detected by APS
410 and EBAPS 412. In the example of a pilot, such a display can include
an HMD display, such as a helmet visor, goggles or other eye piece, and
the like.
With reference to Figure 5, there is shown a schematic
illustration of a camera, generally designated 500, constructed and
operative in accordance with yet a further embodiment of the disclosed
technique. Camera 500 includes optical and routing module 504, coupled
with three sensors: daylight sensor APS 510, nightlight sensor 512, and
thermal sensor 556. Optical and routing module 504 routes incoming rays
toward the three sensors, analogous to the embodiment of Figure 2.
Nightlight sensor 512 includes an image intensifier and an APS analogous to the embodiment of Figure 3, or an EBAPS analogous to the
embodiment of Figure 4. Thermal sensor 556 includes an uncooled Focal
Plane Array (FPA) and APS, analogous to the one shown in the
embodiment of Figure 2. Optical and routing module 504 routes the Short
Wave Infra Red (SWIR) band toward the uncooled FPA of thermal sensor
556. Alternatively, optical and routing module 504 can be designed to
support a Mid Wave Infra Red (MWIR) sensor or a Long Wave Infra Red
(LWIR) sensor.
Camera 500 also includes control and processing unit 502, a
power supply 522 and display 552, all of which function analogously to those
described with reference to Figures 2, 3, and 4. Camera 500 further
incorporates at least two accelerometers, such as accelerometers 546 and
548, for the task of spatial image stabilization for the display, due to
mechanical vibrations. Vibrations of camera 500 can be caused by any
shake, tremor, quivering or trembling source, etc., that eventually
destabilizes the image perceived. In most cases two or three
accelerometers, preferably installed as tiny MEMS chips, suffice for
carrying out the task of measuring spatial vibrations. It is noted that a
gyroscope may be used instead of accelerometers 546 and 548 for the
purposes of vibration measurement. Accelerometers 546 and 548 are
coupled with Kalman Predictor/warping 558, which is coupled with control
and processing unit 502. Accelerometers 546 and 548 detect and monitor
camera vibration, and provide this information to Kalman predictor/warping 558. A Kalman predictor is employed to calculate, on a per image basis,
image transformation due to camera movements, and to yield the
corrective image transfer commands. The image transfer commands
define the necessary geometrical transformation "warping" of the images
perceived by the camera sensors (510, 512, 556). All these functions can
be performed by control and processing unit 502 that incorporates the
Kalman predictor function or the warping function or both. A Kalman
predictor or a warping module can each form a separate module, and are
shown as a single module 558 for demonstrative purposes.
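A much-simplified, single-axis Python sketch of this accelerometer-to-warping chain is given below; for brevity it uses a fixed-gain (steady-state) predictor rather than a full Kalman filter and treats the vibration as zero-mean about the boresight. The gains, sample rate, pixel scale and names are assumptions of this sketch only.

    # Hypothetical per-axis vibration predictor feeding a warping command.
    class AxisVibrationPredictor:
        def __init__(self, dt=1.0 / 60.0, alpha=0.2, beta=0.05):
            self.dt = dt
            self.alpha = alpha   # fixed gain on the displacement correction
            self.beta = beta     # fixed gain on the velocity correction
            self.disp = 0.0      # estimated displacement (metres)
            self.vel = 0.0       # estimated velocity (metres/second)

        def step(self, accel_m_s2):
            dt = self.dt
            # Predict displacement and velocity from the accelerometer reading.
            self.disp += self.vel * dt + 0.5 * accel_m_s2 * dt * dt
            self.vel += accel_m_s2 * dt
            # Correct toward the zero-mean vibration assumption (pseudo measurement of 0).
            residual = 0.0 - self.disp
            self.disp += self.alpha * residual
            self.vel += self.beta * residual / dt
            return self.disp

    def warp_command(disp_x_m, disp_y_m, pixels_per_metre=20000.0):
        # Corrective image transfer command: shift the image opposite to the motion.
        return -round(disp_x_m * pixels_per_metre), -round(disp_y_m * pixels_per_metre)

    # Example: one axis excited by a short vibration burst.
    predictor = AxisVibrationPredictor()
    for accel in [0.0, 3.0, -3.0, 1.5, -1.5, 0.0]:
        displacement = predictor.step(accel)
    print(warp_command(displacement, 0.0))
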
According to the disclosed technique there is also provided a
method for converting a direct scene for a head mounted display.
Reference is now made to Figure 6, illustrating a method for converting a
direct scene for a head mounted display according to another embodiment
constructed and operative in accordance with the disclosed technique. In
procedure 600, incoming rays from a viewed scene are received in an
optical module. With reference to Figures 1 and 2, incoming rays from the
direct scene are received by optical and routing module 102 or optical
elements 204. In procedure 602 the incoming rays are routed toward at
least two sensors. In reference to Figures 1 and 2, optical and routing
module 102 or router 206 routes the incoming rays toward sensors 110,
112, and 114, or sensors 210, 212 and 214, respectively. It is noted that
procedure 602 can also be performed simultaneously with procedure 600.
In procedure 604 images of the scene are captured in each of the at least two sensors. In procedure 606, each of the at least two sensors provides
a digital signal representation of the images it captures. In reference to
Figure 1, each of sensors 110, 112, and 114 captures images of the scene,
and converts the images into a digital signal representation. Each sensor
has a specific operational wavelength range which is adapted to the
sensor capabilities. The capabilities of each sensor are selected to
provide complementary image information with respect to the detected
information of the other sensors. Preferably, at least one of the at least
two sensors features the amplification of dim light or the conversion of
invisible light to a visible representation, or a combination of such
amplification and conversion. If at least one other sensor detects daylight,
seamless day and night detection is provided. In reference to Figure 3,
sensor APS 310 provides for daylight detection, and sensor 314 together
with image intensifier 312 (I²APS configuration) provides for dim light or
night vision amplification and conversion.
In optional procedure 608 the images provided by the at least
two sensors are merged in registration. In reference to Figure 2, control
and processing unit 250 merges in registration the images provided by
sensors 210, 212, and 214. Procedure 608 preferably includes the sub-
procedure of image fusion between at least two sensors on the basis of
pixel intensity, at the pixel level, for providing a dynamic range extension.
Procedure 608 further preferably includes the sub-procedure of generating
a synthetic colorized image on the basis of spectral response, at the pixel level, for providing multiple spectral band observation. Such sub-
procedures provide for the elimination of the "blooming effect", typically
occurring at nighttime in night vision systems, when illuminators (such as
in urban scenery) generate bloomed light spots. An on-the-fly pixel
analysis for passing threshold intensity is performed on both the daylight
APS sensor and the nightlight I²APS intensified sensor. The best pixel
within the dynamic range, i.e. one that is not saturated or cut off, is
transferred to the display for optional viewing.
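A minimal Python sketch of this per-pixel selection, assuming 8-bit pixel values, is shown below; the threshold levels, the preference rule and the names are assumptions of this sketch, not values from the disclosure.

    # Hypothetical pixel-level fusion between a daylight frame and an intensified
    # nightlight frame: keep the "best" pixel that is neither saturated nor cut off.
    SATURATED = 250   # assumed near-full-scale threshold (8-bit pixels)
    CUTOFF = 5        # assumed near-zero threshold (8-bit pixels)

    def usable(value):
        return CUTOFF < value < SATURATED

    def margin(value):
        # Distance from both ends of the dynamic range; larger is better exposed.
        return min(value - CUTOFF, SATURATED - value)

    def fuse_pixel(day_value, night_value):
        if usable(day_value) and usable(night_value):
            return day_value if margin(day_value) >= margin(night_value) else night_value
        if usable(day_value):
            return day_value      # e.g. a bloomed illuminator saturates the intensifier
        if usable(night_value):
            return night_value    # e.g. a dark area cut off in the daylight channel
        return max(day_value, night_value)  # neither is usable: fall back

    def fuse_frames(day_frame, night_frame):
        return [[fuse_pixel(d, n) for d, n in zip(day_row, night_row)]
                for day_row, night_row in zip(day_frame, night_frame)]

    # A bloomed light spot (255) in the night channel is recovered from the day channel.
    print(fuse_frames([[120, 30]], [[255, 180]]))
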
In optional procedure 610 the resultant merged images are
processed to be applied to a display. In reference to Figure 2, sensors
210, 212 and 214 provide the retrieved images to control and processing
unit 250, wherein the images are merged in registration and processed to
provide the resultant image to display module 260.
Two optional stabilizing or corrective procedures can be applied
for improving the method performance. In procedure 612, spatial image
stabilization is carried out based on readings from accelerometers, for
correcting vibration disturbances. In reference to Figure 5, accelerometers
546 and 548 provide their readings to Kalman predictor/warping 558,
which corrects the vibration disturbances to provide spatial image
stabilization. In procedure 614, spatial and temporal filtering is performed
with respect to user line-of-sight. In reference to Figure 3, user line-of-
sight detector 350 provides the relevant readings to control and processing
unit 320, which carries out the spatial and temporal filtering. The spatial and temporal filter is preferably based on filtering at the pixel level with
respect to the readings of a head line-of-sight reader or an eye line-of-
sight tracker.
Reference is now made to Figures 7 and 8. Figure 7 is a
schematic illustration of spatial and temporal filtering with respect to user
line-of-sight, operative in accordance with an embodiment of the disclosed
technique. Figure 8 is an expanded view of the schematic illustration of
Figure 7. Figures 7 and 8 show a spatial and temporal filter (S/T filter),
generally designated 700, which serves to filter an image in the spatial and
temporal domain. The filtering is performed in real-time with minimal
latency and no smearing effect. S/T filter 700 consists of a spatial filter
702 and a temporal filter 704. Spatial filter 702 performs scintillation noise
cleaning and is preferably implemented using the median process.
Temporal filter 704 is based on an IIR (Infinite Impulse Response) filter
that is low pass with one pole. Temporal filter 704 includes a memory 712,
an alpha multiplier process 706, a beta multiplier process 708, and an
adder process 710. Memory 712 stores the history data, which consists of
averaged video frames. Processes 706, 708 and 710 constitute a
combiner process. Alpha multiplier process 706 operates on the incoming
video stream, beta multiplier process 708 operates on the history data,
and adder process 710 combines the two pixel streams. For example,
typical coefficient values may be: alpha = 1/11 and beta = 10/11, so that
the weights sum to unity. In this case, the filter will improve the
signal-to-noise ratio by a factor of at least 3. The output from adder 710 serves as the filter output, which is then fed
back to the history buffer in memory 712 as the new averaged video
frame. The address of the history pixels to be read is matched to the
current incoming pixel according to line-of-sight data 714. Line-of-sight
data 714 determines the four nearest neighboring pixels of the current
incoming pixel position. A bilinear process 716 is performed to calculate
the equivalent pixel level from these four nearest neighboring pixels. Temporal
filter 704 further includes a motion detection process 718 that monitors the
image energy with respect to the previous frame. This
calculation is performed by analyzing the image over several segments in
parallel. The results are then used to determine the alpha coefficient for
alpha multiplier process 706 and the beta coefficient for beta multiplier
process 708.
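For illustration only, the following Python sketch reproduces the general structure of S/T filter 700: a median-based spatial clean-up followed by a one-pole IIR temporal average whose history read-out is shifted according to line-of-sight data and resampled bilinearly from the four nearest neighboring pixels. The use of scipy's median_filter and map_coordinates as stand-ins for the median and bilinear processes, and the fixed coefficient values, are assumptions of this sketch; the motion detection process 718, which would adapt the alpha and beta coefficients from frame to frame, is omitted for brevity.

```python
# Hedged sketch of the spatial/temporal (S/T) filter structure (assumed names).
import numpy as np
from scipy.ndimage import median_filter, map_coordinates

class SpatioTemporalFilter:
    def __init__(self, shape, alpha=1/11, beta=10/11):
        self.history = np.zeros(shape, dtype=np.float32)  # averaged-frame buffer
        self.alpha = alpha                                 # weight of the incoming frame
        self.beta = beta                                   # weight of the history data

    def process(self, frame, los_shift=(0.0, 0.0)):
        # Spatial filter: median process for scintillation-noise cleaning.
        spatial = median_filter(frame.astype(np.float32), size=3)

        # Shift the history buffer so that its pixels line up with the current
        # line of sight; order=1 resamples each pixel bilinearly from its four
        # nearest neighbours (the bilinear process).
        rows, cols = np.indices(self.history.shape, dtype=np.float32)
        warped_history = map_coordinates(
            self.history,
            [rows + los_shift[0], cols + los_shift[1]],
            order=1, mode='nearest')

        # One-pole IIR combine: output = alpha * input + beta * history.
        out = self.alpha * spatial + self.beta * warped_history
        self.history = out          # fed back as the new averaged video frame
        return out
```

In use, process would be called once per incoming video frame, with los_shift derived from the head or eye line-of-sight readings.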
It will be appreciated by persons skilled in the art that the
disclosed technique is not limited to what has been particularly shown and
described hereinabove. Rather, the scope of the disclosed technique is
defined only by the claims, which follow.

Claims

1. Versatile camera for various visibility conditions, comprising:
at least two sensors for capturing images of a scene and providing a
digital signal representation of said images, wherein each sensor
has a particular operational wavelength range; and
an optical and routing module for receiving incoming rays from said
scene and routing said incoming rays toward said at least two
sensors, respectively.
2. The versatile camera according to claim 1, wherein at least one of
said at least two sensors comprises a poor visibility conditions
sensor.
3. The versatile camera according to claim 2, wherein said poor visibility
conditions sensor comprises a dim light amplifier.
4. The versatile camera according to claim 2, wherein said poor visibility
conditions sensor comprises an invisible light sensor for detecting and
converting invisible light to a visible representation.
5. The versatile camera according to claim 1, wherein at least one of
said at least two sensors comprises a daylight sensor.
6. The versatile camera according to claim 1, further comprising a
processor for merging in registration and processing the images
provided by said at least two sensors.
7. The versatile camera according to claim 1, further comprising at least
two accelerometers for spatial stabilization of said display.
8. The versatile camera according to claim 1, further comprising at least
one gyroscope for spatial stabilization of said display.
9. The versatile camera according to claim 1, further comprising a user
line-of-sight detector, for spatial and temporal filtering.
10. The versatile camera according to claim 9, wherein said user line-of-
sight detector comprises a head line-of-sight reader for spatial and
temporal filtering at the pixel level with reference to the readings of
said head line-of-sight reader.
11. The versatile camera according to claim 9, wherein said user line-of-
sight detector comprises an eye line-of-sight tracker for spatial and
temporal filtering at the pixel level with reference to the readings of
said eye line-of-sight tracker.
12. The versatile camera according to claim 1, wherein said at least two
sensors include any combination from the list consisting of:
a visible daylight sensor;
a night vision enhancement sensor;
- a dim light enhancement sensor;
a 1.06 micron sensor; and
a Forward Looking Infrared (FLIR) sensor.
13. The versatile camera according to claim 12, wherein said FLIR
sensor may include any combination from the list consisting of:
an Indium Gallium Arsenide (InGaAs) sensor;
an Indium Antimonide (InSb) sensor;
a non-refrigerated Vanadium Oxide (VOx) bolometer;
a Gallium Arsenide (GaAs) sensor; and
- a Quantum Well Infrared Photodetector (QWIP).
14. The versatile camera according to claim 5, wherein said daylight
sensor comprises an Active Pixel Sensor (APS) operational for the
visible band from about 400-450nm to about 650-680nm.
15. The versatile camera according to claim 1, wherein one sensor of
said at least two sensors comprises an Active Pixel Sensor (APS)
operational at wavelengths above 950nm (high pass IR).
16. The versatile camera according to claim 1, wherein one sensor of
said at least two sensors comprises an Active Pixel Sensor (APS)
operational at the 1064nm IR wavelength.
17. The versatile camera according to claim 5, wherein said daylight
sensor comprises an Active Pixel Sensor (APS) operational at visible
daylight and at ranges extending beyond the visible daylight, and
wherein said Active Pixel Sensor (APS) comprises a sensor selected
from the list consisting of:
a sensor at the 1064nm IR wavelength;
a high pass IR sensor above the wavelength of 950nm; and
a broadband sensor substantially operational from about 400nm
to about 1100nm.
18. The versatile camera according to claim 1, wherein at least one of
said at least two sensors comprises an image intensifier (I2) coupled
to an electronic video sensor.
19. The versatile camera according to claim 2, wherein said poor visibility
conditions sensor comprises an image intensifier (I2) coupled to an
electronic video sensor, which is operational for the wavelength band
from about 650-680nm to about 950nm.
20. The versatile camera according to claim 18, wherein said image
intensifier (I2) comprises an optical on/off iris shutter.
21. The versatile camera according to claim 18, wherein said image
intensifier (I2) is gated.
22. The versatile camera according to claim 1, wherein at least one of
said at least two sensors comprises an Electron Bombardment Active
Pixel Sensor (EBAPS).
23. The versatile camera according to claim 22, wherein said EBAPS is
gated.
24. The versatile camera according to claim 1, wherein at least one of
said at least two sensors comprises an Active Pixel Sensor (APS)
compatible with a standard format selected from the list consisting of:
VGA;
SVGA;
- XGA;
QXGA;
UXGA;
SXGA; and
HDTV.
25. The versatile camera according to claim 3, further comprising a High
Voltage Power Supply (HVPS) for the dim light amplifier.
26. The versatile camera according to claim 1, wherein said optical and
routing module comprises a router selected from the list consisting of:
slanted semitransparent partially reflecting mirror;
prism;
- pellicle;
spectral splitter;
lenses;
diffractive element;
micro machining (mechanically deflecting plates
MEMS/MOEMS);
bifocal optics; and
multiple path optics.
27. The versatile camera according to claim 1, wherein said optical and
routing module comprises a small-portion/large-portion splitter,
wherein a small portion of the light intensity is directed to a daylight
sensor and a large portion of the light intensity is directed toward a
poor visibility conditions sensor.
28. The versatile camera according to claim 27, wherein said small-
portion/large-portion splitter is selected from the list consisting of:
10%-90% prism, wherein 10% of the light intensity is directed to
a daylight sensor and 90% is directed toward a poor visibility
conditions sensor; and
10%-90% pellicle, wherein 90% of the light intensity is directed
to a poor visibility conditions sensor and 10% is directed toward
a daylight sensor.
29. The versatile camera according to claim 26, wherein said spectral
splitter is a VIS-NIR separator.
30. The versatile camera according to claim 1, wherein said optical and
routing module includes a notch filter for the 1064nm wavelength.
31. The versatile camera according to claim 1, wherein said versatile
camera is coupled with a display for displaying said scene.
32. The versatile camera according to claim 31, wherein said display
comprises a head mounted display.
33. The versatile camera according to claim 32, wherein said head
mounted display is selected from the list consisting of:
helmet mounted display;
headset mounted display;
- goggles;
eyepiece;
binocular display; and
monocle.
34. The versatile camera according to claim 1, wherein said versatile
camera is operative to apply display(s) to both eyes of the user, and
wherein said digital signal representation of said images is divided for
its separate application to each eye.
35. The versatile camera according to claim 1, wherein said versatile
camera is operative to apply a display to a single eye of the user.
36. The versatile camera according to claim 35, wherein a second similar
versatile camera is operative to apply a display to the other eye of the
user.
37. The versatile camera according to claim 1, mounted on, integral with,
added on, or attachable to a device selected from the list consisting
of:
helmet;
headset;
goggles;
eyepiece;
binoculars; and
monocle.
38. The versatile camera according to claim 1, adapted for use in an air,
space, sea, or land environment, for a direct or indirect scene,
onboard a vehicle or for portable use by an individual.
39. The versatile camera according to claim 1, wherein said scene is a
direct scene, and said digital signal representation of said images is
compatible to display in registration with said direct scene as seen by
the user.
40. A method for providing images of a scene under various visibility
conditions for a display, comprising the procedures of:
receiving incoming rays from said scene;
routing said incoming rays toward at least two sensors, wherein
each of said at least two sensors has a particular operational
wavelength range;
capturing images of said scene in each of said at least two
sensors; and
providing a digital signal representation of said images.
41. The method for providing images according to claim 40, wherein said
procedure of providing a digital signal representation of said images
includes any combination of procedures selected from the list
consisting of:
amplifying dim light; and
converting invisible light to a visible representation,
wherein said procedure is performed in at least one of said at least
two sensors.
42. The method for providing images according to claim 41, wherein at
least one of said at least two sensors comprises a daylight sensor.
43. The method for providing images according to claim 40, further
comprising the procedure of merging in registration the images
provided by said at least two sensors.
44. The method for providing images according to claim 43, further
comprising the procedure of applying the resultant merged images to
a display.
45. The method for providing images according to claim 43, wherein said
procedure of merging comprises the sub-procedure of image fusion
between at least two sensors on the basis of pixel intensity, at the
pixel level.
46. The method for providing images according to claim 43, wherein said
procedure of merging further comprises the sub-procedure of
generating a synthetic colorized image, on the basis of spectral
response, at the pixel level.
47. The method for providing images according to claim 43, further
comprising the procedure of applying the resultant merged image to a
display.
48. The method for providing images according to claim 40, further
comprising the procedure of spatial image stabilization based on the
reading of at least two accelerometers.
49. The method for providing images according to claim 40, further
comprising the procedure of spatial image stabilization based on the
reading of at least one gyroscope.
50. The method for providing images according to claim 40, further
comprising the procedure of spatial and temporal filtering with respect
to the user line-of-sight.
51. The method for providing images according to claim 50, wherein said
procedure of spatial and temporal filtering comprises spatial and
temporal filtering at the pixel level with respect to the readings of a
head line-of-sight reader or an eye line of sight tracker.
PCT/IL2004/000038 2003-01-15 2004-01-14 Versatile camera for various visibility conditions WO2004064391A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/182,302 US7496293B2 (en) 2004-01-14 2005-07-15 Versatile camera for various visibility conditions

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL153967A IL153967A (en) 2003-01-15 2003-01-15 Versatile camera for various visibility conditions
IL153967 2003-01-15

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/182,302 Continuation-In-Part US7496293B2 (en) 2004-01-14 2005-07-15 Versatile camera for various visibility conditions

Publications (2)

Publication Number Publication Date
WO2004064391A1 true WO2004064391A1 (en) 2004-07-29
WO2004064391B1 WO2004064391B1 (en) 2004-09-16

Family

ID=32697012

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2004/000038 WO2004064391A1 (en) 2003-01-15 2004-01-14 Versatile camera for various visibility conditions

Country Status (2)

Country Link
IL (1) IL153967A (en)
WO (1) WO2004064391A1 (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1990005426A1 (en) * 1988-11-03 1990-05-17 Pearpoint Limited T.v. surveillance camera
US6246437B1 (en) * 1991-11-05 2001-06-12 Canon Kabushiki Kaisha Image pickup device
US6215597B1 (en) * 1999-11-17 2001-04-10 Duncan Technologies, Inc. Apparatus for forming a plurality of subimages having different characteristics
EP1158787A1 (en) * 2000-05-26 2001-11-28 Thales Device and analysis method of one or several wide dynamic range signals
US20030048493A1 (en) * 2001-09-10 2003-03-13 Pontifex Brian Decoursey Two sensor quantitative low-light color camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WAXMAN A M ET AL: "ELECTRONIC IMAGING AIDS FOR NIGHT DRIVING: LOW-LIGHT CCD, UNCOOLED THERMAL IR, AND COLOR FUSED VISIBLE/LWIR", PROCEEDINGS OF THE SPIE, SPIE, BELLINGHAM, VA, US, vol. 2902, 18 November 1996 (1996-11-18), pages 62 - 73, XP008015964, ISSN: 0277-786X *

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1716442A1 (en) * 2004-02-19 2006-11-02 Jean-Claude Robin Method and device for capturing images with large lighting dynamics
EP1675384A2 (en) * 2004-12-22 2006-06-28 Sony Corporation Image processing apparatus, method and image pickup apparatus
EP1675384A3 (en) * 2004-12-22 2011-11-30 Sony Corporation Image processing apparatus, method and image pickup apparatus
US8212876B2 (en) 2004-12-29 2012-07-03 Elbit Systems Ltd. Synthetic colour night vision system
WO2006070351A2 (en) * 2004-12-29 2006-07-06 Elbit Systems Ltd. Synthetic colour night vision system
WO2006070351A3 (en) * 2004-12-29 2006-09-08 Elbit Systems Ltd Synthetic colour night vision system
EP1883227A3 (en) * 2006-07-25 2011-09-21 ITT Manufacturing Enterprises, Inc. Motion compensated image registration for overlaid/fused video
AU2007203127B2 (en) * 2006-07-25 2012-04-19 Elbit Systems Of America, Llc Motion compensated image registration for overlaid/fused video
GB2449982B (en) * 2007-06-06 2010-03-17 Arnold & Richter Kg Digital motion picture camera with two image sensors
US11025831B2 (en) 2012-09-04 2021-06-01 Duelight Llc Image sensor apparatus and method for obtaining multiple exposures with zero interframe time
US10652478B2 (en) 2012-09-04 2020-05-12 Duelight Llc Image sensor apparatus and method for obtaining multiple exposures with zero interframe time
US10382702B2 (en) 2012-09-04 2019-08-13 Duelight Llc Image sensor apparatus and method for obtaining multiple exposures with zero interframe time
US10498982B2 (en) 2013-03-15 2019-12-03 Duelight Llc Systems and methods for a digital image sensor
US10182197B2 (en) 2013-03-15 2019-01-15 Duelight Llc Systems and methods for a digital image sensor
US10931897B2 (en) 2013-03-15 2021-02-23 Duelight Llc Systems and methods for a digital image sensor
US11394894B2 (en) 2014-11-06 2022-07-19 Duelight Llc Image sensor apparatus and method for obtaining low-noise, high-speed captures of a photographic scene
US10924688B2 (en) 2014-11-06 2021-02-16 Duelight Llc Image sensor apparatus and method for obtaining low-noise, high-speed captures of a photographic scene
US11463630B2 (en) 2014-11-07 2022-10-04 Duelight Llc Systems and methods for generating a high-dynamic range (HDR) pixel stream
EP3289757A4 (en) * 2015-05-01 2018-07-18 Duelight LLC Systems and methods for generating a digital image
US10904505B2 (en) 2015-05-01 2021-01-26 Duelight Llc Systems and methods for generating a digital image
US10110870B2 (en) 2015-05-01 2018-10-23 Duelight Llc Systems and methods for generating a digital image
US10129514B2 (en) 2015-05-01 2018-11-13 Duelight Llc Systems and methods for generating a digital image
US11356647B2 (en) 2015-05-01 2022-06-07 Duelight Llc Systems and methods for generating a digital image
US10375369B2 (en) 2015-05-01 2019-08-06 Duelight Llc Systems and methods for generating a digital image using separate color and intensity data
US11375085B2 (en) 2016-07-01 2022-06-28 Duelight Llc Systems and methods for capturing digital images
US10469714B2 (en) 2016-07-01 2019-11-05 Duelight Llc Systems and methods for capturing digital images
US10477077B2 (en) 2016-07-01 2019-11-12 Duelight Llc Systems and methods for capturing digital images
US10270958B2 (en) 2016-09-01 2019-04-23 Duelight Llc Systems and methods for adjusting focus based on focus target information
US10178300B2 (en) 2016-09-01 2019-01-08 Duelight Llc Systems and methods for adjusting focus based on focus target information
US10785401B2 (en) 2016-09-01 2020-09-22 Duelight Llc Systems and methods for adjusting focus based on focus target information
US10586097B2 (en) 2017-10-05 2020-03-10 Duelight Llc System, method, and computer program for capturing an image with correct skin tone exposure
US10372971B2 (en) 2017-10-05 2019-08-06 Duelight Llc System, method, and computer program for determining an exposure based on skin tone
US10558848B2 (en) 2017-10-05 2020-02-11 Duelight Llc System, method, and computer program for capturing an image with correct skin tone exposure
US11455829B2 (en) 2017-10-05 2022-09-27 Duelight Llc System, method, and computer program for capturing an image with correct skin tone exposure
US11699219B2 (en) 2017-10-05 2023-07-11 Duelight Llc System, method, and computer program for capturing an image with correct skin tone exposure
US12003853B2 (en) 2020-08-21 2024-06-04 Duelight Llc Systems and methods for adjusting focus based on focus target information
DE102020133691A1 (en) 2020-12-16 2022-06-23 Valeo Schalter Und Sensoren Gmbh Infrared thermal imaging camera and on-board system to assist in driving a motor vehicle having such a camera
US12003864B2 (en) 2021-05-14 2024-06-04 Duelight Llc Image sensor apparatus and method for obtaining multiple exposures with zero interframe time

Also Published As

Publication number Publication date
WO2004064391B1 (en) 2004-09-16
IL153967A (en) 2014-01-30
IL153967A0 (en) 2004-08-31

Similar Documents

Publication Publication Date Title
US7496293B2 (en) Versatile camera for various visibility conditions
US7842921B2 (en) Clip-on infrared imager
US6646799B1 (en) System and method for combining multiple energy bands to improve scene viewing
US9148579B1 (en) Fusion night vision system
US6476391B1 (en) Infrared imaging system for advanced rescue vision system
US9648255B2 (en) Multi-modal optoelectronic vision system and uses thereof
EP3401631B1 (en) Thermal reflex sight
WO2004064391A1 (en) Versatile camera for various visibility conditions
US20170208262A1 (en) Digital enhanced vision system
US20080266669A1 (en) Electronic Day and Night Vision Spectacles
US20120019700A1 (en) Optical system with automatic mixing of daylight and thermal vision digital video signals
US20070228259A1 (en) System and method for fusing an image
US7813037B2 (en) Day/night-vision device
US20080170119A1 (en) Vision enhancement system
US7746551B2 (en) Vision system with eye dominance forced to fusion channel
US20080011941A1 (en) Aviation night vision system using common aperture and multi-spectral image fusion
Gerken et al. Military reconnaissance platform for the spectral range from the visible to the MWIR
JP5953636B2 (en) Modular night vision system with fusion optical sensor
GB2468948A (en) Optical bypass device and thermal imaging attachment for an image intensifier
JPH06121325A (en) Color image pickup device
GB2259211A (en) Thermal imaging system
RU2786356C1 (en) Dual-spectrum video surveillance system
WO2017044130A1 (en) Multi-modal optoelectronic vision system and uses thereof
CN219162478U (en) Binocular night vision device
Zhang et al. Pixel-by-pixel VIS/NIR and LIR sensor fusion system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

B Later publication of amended claims

Effective date: 20040809

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 11182302

Country of ref document: US

122 Ep: pct application non-entry in european phase
DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
WWP Wipo information: published in national office

Ref document number: 11182302

Country of ref document: US