WO2000037970A9 - Extreme temperature radiometry and imaging apparatus - Google Patents

Extreme temperature radiometry and imaging apparatus

Info

Publication number
WO2000037970A9
WO2000037970A9 (PCT/US1999/029359)
Authority
WO
WIPO (PCT)
Prior art keywords
user
portions
scene
image
color
Application number
PCT/US1999/029359
Other languages
French (fr)
Other versions
WO2000037970B1 (en)
WO2000037970A2 (en)
WO2000037970A3 (en)
Inventor
Charles C Warner
Scott A Foster
Stewart W Evans
Raul Krivoy
Michael W Burke
John R Rae
Original Assignee
Flir Systems
Priority claimed from US09/210,167 (US 6,255,650 B1)
Application filed by Flir Systems
Priority to AU45202/00A
Publication of WO2000037970A2
Publication of WO2000037970A3
Publication of WO2000037970B1
Publication of WO2000037970A9

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/12Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices with means for image conversion or intensification
    • G02B23/125Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices with means for image conversion or intensification head-mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/20Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
    • H04N23/23Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only from thermal infrared radiation

Definitions

  • the present invention relates generally to hazard-avoidance and blind vision equipment for use in extreme temperatures. More particularly, it concerns a self-contained, portable, easily deployed, helmet-mounted thermal or infrared (IR) imaging system capable of unobtrusively expanding the view of users such as fire fighters by providing head-up IR detection and imaging of a scene in front of a user, displaying an image that eliminates or sees through obscurants such as darkness, smoke and particulate that may otherwise blind the user.
  • Fire fighting is extremely hazardous and demanding because of extreme temperatures and obscurants that can blind or disable a fire fighter from locating the fire's source or human beings at risk within a burning building.
  • When there are no visible flames, e.g. when alcohol, hydrogen, hydrocarbons, etc. burn, there can be lethally high temperatures caused by gases that burn without visible ignition or flaming. Whether there are visible or invisible flames, nevertheless there can be dense smoke or airborne particulate that makes normal vision impossible. At night or in dark locations, even without extremely high temperatures and even without obscurants, vision is essential to containing a fire or saving a life.
  • IR vision subsystems for fire fighters have been bulky and integrated with other fire fighting protective equipment worn by fire fighters. They also typically have required an umbilical cord to equipment worn on the body of the fire fighter.
  • IR equipment is connected with protective body gear referred to herein as a bunker suit typically including or augmented by self-contained breathing apparatus (SCBA).
  • a head-up display, an infrared (IR) camera and associated electronics including power are integrated into portable, self-contained, wraparound, face-worn vision-enhancement apparatus useful in environments of dense air-borne particulate and thermal extremes such as encountered in fire fighting situations, in accordance with the invention.
  • Reflective and opaque expanses or lenses are provided in front of a user's eyes at approximately eye level for IR vision display and blinding purposes, respectively, to produce a clear bright picture or image representing a detected and imaged scene viewed by the camera otherwise obscured by darkness or obscurants.
  • the IR camera is integral with the wrap-around system along with a self-contained power supply so that the system is portable and requires no umbilical cord or other external connections.
  • the imager is preferably an uncooled focal plane array.
  • Associated imaging, storing, processing and displaying electronics are cooled in the extreme thermal environment using an integral plural phase heatsink, to protect elements of the apparatus from environmental heat.
  • the apparatus is separate from, but compatible with, helmets and
  • the apparatus may be temporarily affixed via a clip and strap to the brim of a helmet and may be easily shifted on the user's face from its normal night-vision position to a temporary stowed position in front of the forehead whereby the user has virtually unobstructed binocular vision.
  • the intended use of the apparatus is for firefighting and fire rescue, but it is believed to have application in other rescue and adverse conditions, associated with vehicle accidents, mining accidents, and combat.
  • Figs. 1A and 1B are side elevations of the invented apparatus, with Fig. 1A showing the apparatus in an infrared (IR) mode of deployment and with Fig. 1B showing the apparatus in a tilted-back, direct-view mode of deployment.
  • FIGs. 2A and 2B are isometric views of the invention, with Fig. 2A showing an exploded, partly assembled version and with Fig. 2B showing a fully assembled version, with certain portions of the imaging apparatus's housing cut away to reveal certain interior details.
  • Fig. 3 is an optical schematic diagram corresponding with Fig. 1A illustrating the IR optical geometry of the invention.
  • Figs. 4A and 4B are schematic diagrams respectively showing an overhead and a lateral view of a user and a head-mounted camera, the views illustrating line-of-sight and focal axes and their convergence in front of the user.
  • Fig. 5 is a block diagram of the opto-electronics within the housing of the invented apparatus.
  • Fig. 6 is a flowchart illustrating the color mapping method used in the invented apparatus for image enhancement.
  • Figs 7A and 7B are graphs that illustrate the color mapping method used in the invented apparatus for image enhancement, with Fig. 7A showing an entire palette mapped, and with Fig. 7B showing a detail of a portion of Fig. 7A.
  • Fig. 8 is a graph of a simplified histogram showing the percentage of a sensed scene signal that is within particular temperature ranges.
  • Fig. 9 is a graph illustrating linear gain and conventional histogram projection gain, and illustrating three different combinations of histogram projection gain with specified percentages of linear gain, all for the sensed scene signal represented in Fig. 8.
  • Fig. 10 is a graph illustrating a series of calculations based on a particular sensed scene signal to determine an optimal percentage of linear gain and non-linear (histogram projection) gain.
  • Apparatus 10 may be seen to include a left arching region 12, a right arching region 14 and a forward arching region 16 that hovers preferably just above the user's eye level on the face.
  • Apparatus 10 preferably includes a lightweight, preferably molded polymer housing with an interior void in which are located all essential components including an IR (thermal) optical engine 18 located adjacent forward region 16.
  • Optical engine 18 includes an un-cooled bolometric IR detector array 20, preferably of the type described and illustrated in commonly owned U.S. Patent No. 5,554,849.
  • Array 20 produces a high-resolution, two- dimensional, temperature pixel image of a scene within its field of view.
  • the image produced by array 20 is stored in a digital memory 22 managed by a microprocessor 24.
  • Left region 12 includes a battery subsystem 26 for integrally powering all components.
  • Bolometric IR detector array 20, because it is un-cooled (by which is meant it is not cryogenically cooled), produces only slight heat itself, but nevertheless is sensitive to heat in the ambient environment around it.
  • an important contribution of the invention is a fluid heatsink 28 for removing heat from opto-electronics 30, including optical engine 18, array 20, memory 22 and processor 24.
  • Opto-electronics 30 typically are subject to extreme environmental heat that may be produced by a fire. Accordingly, heatsink 28 is provided to ensure continuous, long-term, accurate temperature profile detection and imaging by the detector array despite the environmental extremes.
  • Fig. 1A will be understood as showing apparatus 10 in its deployed position whereby the user is blinded in a left eye and vision-enhanced in a right eye, as described above.
  • Fig. IB corresponds directly with Fig. 1A and shows the same features as those described above but in a different orientation.
  • Fig. IB shows apparatus 10 in a tilted-back or stowed position in which the user is able to see relatively unobstructed and unenhanced with the naked eyes when the vision- enhancement features of apparatus 10 are not needed.
  • the configuration of front region 16 of apparatus 10 wherein the lower edge 16a of region 16 terminates in a line just below eye-level makes it possible to tilt apparatus 10 back toward the forehead ever so slightly to afford the user a direct unenhanced and relatively unobstructed view, as indicated.
  • this configuration of apparatus 10 is preferable to alternative arrangements whereby, for example, a visor section flips up away from the frame on a hinge along the top or is removable or whereby the left lens is rendered transparent for direct viewing with the left eye or whereby the right and/or left lens is rendered only translucent such that an IR image and direct view are superimposed within the view of the user.
  • the lens hinge or removal configurations over which apparatus 10 is believed to represent an improvement require a hinge or connection between the active display surface and the frame, thus potentially disturbing or destabilizing the optical path from the IR camera to the user's eye.
  • the translucent display lens configuration over which apparatus 10 distinguishes itself is known to cause eye-strain and confusion in many users due to the superposition of often slightly conflicting images. Confusion from superimposed images is unavoidable, since the natures of infrared energy and visible spectral energy are by definition different, and amorphous object or target boundaries result in confusion when the different images are superimposed. Often the infrared image will lag the direct image, for example, as the user's head is turned. Invented apparatus 10 avoids these problems by going against conventional wisdom in head-up IR displays and provides the user with the option of choosing to view a scene in either monocular IR or in binocular direct view modes of operation by a simple tilt or rotate of the apparatus about the user's ears.
  • Optical engine 18 may be seen in a slightly different form here to include array 20, a preferably back-lit liquid crystal display (LCD) 32 providing preferably at least 320 X 240 pixel resolution with a minimum 6-bit gray scale, a partially (preferably 50%) reflective planar mirror or mirrored lens 34 that turns the LCD image onto a focusing curved 100% reflective mirrored surface 36 that reflects the 50% intensity image back through 50% reflective surface 34 into the firefighter's eye.
  • the display expanse may be viewed by looking through mirror 34 at focussing mirror 36, providing an approximately 25% polychromatic spectral energy efficiency, IR-representative field of view below which the firefighter may view the scene directly as indicated.
  • Optical engine 18 also includes an IR camera unit 37 mounted as better illustrated in Figs. 4A and 4B to intercept a frontal infrared scene along its focal axis.
  • the objective lens optical components within optical engine 18 preferably meet the F 1.3 optical standard in operation.
  • the objective lens preferably is a 1" diameter lens having a 30° azimuth (horizontal) field of view for wide-angle scene imaging.
  • the lens also preferably is transmissive of IR energy in the 8 to 12 micron spectral bandwidth.
  • the focus range of the lens is preferably set to 2 to 24 feet for normal viewing, but may be manually adjusted to 100 feet optimum focus distance.
  • Forward region 16 of apparatus 10 thus may be seen from Figs. 1A, IB, 2A, 2B and 3 to include a curved display expanse within and extending across an upper portion of a user's right eye field of view and a 'blind' or opaque expanse 38 within the user's left eye field of view.
  • the left eye of the user thus is preferably covered, or 'blinded.'
  • By enabling through one eye an IR image of the fire scene and through the other eye an obstructed or 'blind' view (as the parallax view resulting from a user's slightly laterally separated eyes, and the depth perception obtained as a result of the parallax view, are relatively unimportant in this so-called night-vision environment), weight, power and cost are saved without significant compromise of the IR image.
  • Configuring apparatus 10 to achieve this strategic placement permits the user normally to view the fire scene monoscopically via LCD 32 and mirrored lens 36, and, alternatively, to view the scene stereoscopically and unaided beneath the eyeglass portion, by looking below the display expanse.
  • Such dual mode viewing is very easily accomplished by a momentary, slight backward tilt of the user's head around which apparatus 10 wraps or by a momentary, slight backward tilt of the apparatus relative to the user's head.
  • This invented feature will be referred to herein as bi-focality.
  • Figs. 2A and 2B show apparatus 10 in isometric view corresponding generally with Figs. 1A, IB and 3, with identical component parts identically designated by reference designators.
  • Figs. 2A and 2B are exploded, partly assembled, and fully assembled versions of apparatus 10, respectively.
  • Also shown in Figs. 2A and 2B are the internal configuration of various subsystems within a housing of apparatus 10.
  • the subsystems include battery subsystem 26, opto-electronics indicated generally at 40, and a clamshell housing assembly, or simply housing, 42.
  • opto-electronics 40 include optical engine 18 and electronics, to be described in more detail by reference to Fig. 5, most or all of which are mounted within housing 42.
  • battery subsystem 26 provides regulated DC power to optoelectronics 40 within housing 42 via one or more electrical conductors 44 that route power and ground, as well as control and communication signals between the two subsystems and more particularly to electronics 46 of opto-electronics 40.
  • battery subsystem 26 includes a Sanyo HR-4/3FAU or Panasonic HHR-450AB01 battery.
  • Battery subsystem 26 may also include circuitry, not shown, that monitors usage of battery 26 so that warning messages may be produced if the remaining charge of battery 26 has dropped below a certain level, or if battery 26 has been subjected to too many charge cycles or excessive temperature.
  • the interactive nature of battery 26 is indicated in Fig. 5 by the control and communication signal that leads to and from battery 26.
  • the plurality of contacts 44 shown on battery subsystem 26 in Fig. 2A allow for the transmission of the power, control and communication signals from and to battery subsystem 26.
  • battery subsystem 26 preferably is mounted to housing 42 in such a manner that it can be easily and quickly removed for maintenance, repair or replacement with a fresh, fully charged battery subsystem.
  • a helmet clip 42C also may be seen from the drawings to extend slightly forward and above the upper front edge of housing 42.
  • Other quick release mechanisms may be used to attach apparatus 10 to a protective helmet of the type typically worn by firefighters.
  • a headband or strap B, shown in Fig. 1A, may be attached to apparatus 10 for additional support, at eyelets 42E. It is intended but not essential that apparatus 10 may be passed between firefighters as needed, while the firefighters are fully clothed in typical protective gear including hoods and gloves.
  • the preferred horseshoe shape of housing 42 is designed, in part, to ease handling of apparatus 10 by a gloved hand.
  • the horseshoe shape is defined by left arching region 12 and right arching region 14 (the legs of the horseshoe) interconnected by front region 16.
  • Legs 12 and 14 function as carrying handles for apparatus 10, if needed. Legs 12 and 14 even allow a firefighter to hold apparatus 10 in a viewing position without attaching apparatus 10 to a helmet or strapping it to a user's head. This may be particularly useful if apparatus 10 is passed frequently between firefighters in the midst of a fire, or if apparatus 10 or a firefighter's helmet becomes dislodged or structurally damaged.
  • Opto-electronics 40 including electronics 46 will be described now by reference to the schematic block diagram of Fig. 5, which, for completeness, also includes battery subsystem 26 and external options to be described.
  • Detector array 20 preferably is mounted on a camera/buffer printed circuit board (PCB) 48 which includes digital memory 22 for buffering digital scenery data obtained by the optical engine.
  • Optical engine 18 and battery subsystem 26 counterbalance one another along the legs.
  • Heatsink, or phase change module, 28 will be understood to be mounted in close physical proximity to detector 20 and other sensitive electronics mounted on camera buffer PCB 48 and microprocessor PCB 50, so as to dissipate heat radiating therefrom and to maintain the detector and the electronics within predefined limits compatible with proper operation thereof.
  • heatsink 28 is placed far enough back in leg 14 of housing 42 so that it counterbalances detector 20 along leg 14.
  • Battery subsystem 26 in leg 12 further counterbalances detector 20 along leg 12, while at the same time offsetting the weight of heatsink 28 so that apparatus 10 is balanced laterally as well, in a direction extending along forward region 16.
  • Microprocessor 24 preferably is separately mounted on a microprocessor PCB 50 located nearby so that timing, control, gain, image data and power are shared between the PCBs.
  • Optical engine 18 preferably includes an NUC shutter 52 and IR optics 54; drive and heat signals are routed to shutter 52 from camera/buffer PCB 48 as shown.
  • An optional software development connector 56 may be provided as shown in Fig. 5 that facilitates future software development and upgrades that may be implemented in the form of programmable read-only memory (PROM) that preferably is an integral part of microprocessor 24.
  • Microprocessor PCB 50, with camera/buffer PCB 48 preferably mounted thereon, is mounted on a printed circuit motherboard/display board 58 and routes power, audio, control and digital LCD video signals therebetween.
  • Board 58 also mounts LCD backlight electronics 60, LCD 32 and display optics 62 as shown.
  • Board 58 provides power and control via a power distribution board 64 to battery subsystem 26.
  • a video transmitter 66 or video recorder 68 or both may be supported as external options to apparatus 10 via a provided NTSC/PAL video (RS-170) input/output port mounted on motherboard/display board 58.
  • Battery subsystem 26 is an important contributor to the portability and high functional density of apparatus 10.
  • Battery subsystem 26 includes a switch 70 and a light-emitting diode (LED) 72 for switching power on and off in apparatus 10 and for indicating when power is on. It will be appreciated that these are battery save features of apparatus 10 intended to extend its useful operating life without the user having to replace the battery subsystem.
  • battery subsystem 26 provides power conversion from battery voltage to the regulated +3.3 volts direct current (VDC), +5 VDC and +12 VDC required by optoelectronics 40. It also provides sufficient holdover (internal capacitance) to support low-power operation of apparatus 10 when the battery is unexpectedly removed.
  • the battery that forms a preferably lightweight, low-cost, removable and rechargeable part of battery subsystem 26, when fully charged, provides a minimum of 1 hour's operation when new, and a minimum of 40 minutes' operation after 500 charge/discharge cycles.
  • the battery contacts disconnect prior to external environmental exposure when installing and removing the battery into and from housing 42 to avoid possible explosive atmospheric exposure to electrical potential. This is accomplished via the mechanical design of the mounting structure and seal configurations.
  • Self-contained, sealed, liquid or plural-phase heatsink 28 may take any suitable form, but in accordance with the preferred embodiment of the invention, may be thought of as a plural-phase heatsink that by its solid/fluid character may be contained within a finite volume over its operating curve.
  • a self-contained system enables a firefighter to easily employ and deploy such a vision/display system in a fire without a restraining umbilical, for example, to a separate coolant source.
  • the use of a high- temperature-range plural phase polymer as a heatsink material avoids exhaust problems or the removal of high-temperature by-products, e.g. the steam produced by heating water.
  • Heatsink 28 is low-mass, including its sealed container and material contents, and provides for the thermal storage of 3100 calories of heat of fusion at a 65°C (149°F) to 75°C (167°F) melting point (a rough capacity estimate is sketched after this list).
  • Heatsink 28 in its preferred embodiment utilizes organic paraffin, such as beeswax, or n-hexatriacontane, C36H74.
  • Organic paraffins typically have high heats of fusion per unit weight, melt homogeneously, and do not supercool.
  • Heatsink 28 is intended to maintain the IR detector array hot side temperature below 80°C under all rated environmental conditions, at least for a limited time.
  • apparatus 10 is easily employed and deployed by simply slipping it onto the face and over the ears, and perhaps by securing it with a clip 42C and a band B that extends over the brim or bill of the firefighter's helmet, as shown best in Fig. 1A, with no connections to other equipment being required. It will be appreciated also that apparatus 10 is dimensioned and configured to avoid interference with other gear such as typically may be worn by users of apparatus, e.g. helmets, respirator or gas masks, e.g. SCBA, and attire.
  • apparatus 10 is designed for more than low weight or volume; it also is sized and shaped to conform to and extend around an average user's head and face at approximately eye level, while not extending laterally around the head of the user any more than is necessary, or radially therefrom more than approximately 3 inches.
  • apparatus 10 has a very wide thermal dynamic range that enables it to accurately survey scenes having temperatures ranging between 0°C < Ts < 815°C (32°F < Ts < 1500°F). Where extreme temperature ranges are present, it is difficult in monochrome or color display systems to differentiate extremely high temperatures, e.g. a gaseous, flammable vapor that may be several hundred degrees Centigrade, and relatively low temperatures, e.g. a living human being the surface temperature of which typically is under forty degrees Centigrade.
  • Color coding temperature ranges via microprocessor 24, in cooperation with one or more image buffers in memory 22, may, for example, represent dangerously high avoidance zones having avoidance temperature ranges like fires, e.g. Ts1 > 600°C, in shades of red; intermediate temperature ranges, e.g. 100°C < Ts2 < 600°C, in shades of gray; and relatively low target temperature ranges that might be rescue targets like human beings, e.g. 25°C < Ts3 < 100°C, in shades of blue (a minimal mapping sketch appears after this list).
  • This preferred color coding allows a user to readily distinguish hazardous temperature zones from 'safe' target temperatures, which may be targets of particular interest to the firefighter.
  • the representation of intermediate temperature ranges in the color range of gray de-emphasizes those zones of the scene that are normally of little interest to a firefighter, because the zones are of too high a temperature to be a rescue target, and too low a temperature to be a threat to the protective gear used by the firefighter.
  • Some other neutral color may be used instead of or in addition to gray for the representation of the intermediate temperature ranges, such as brown or copper.
  • some color other than red or blue may be used for target and avoidance temperature ranges, provided, preferably, that the color of the target and/or avoidance portions are visually distinct from all other portions of the color image. Red is believed to readily distinguish those portions of the scene that are at a dangerously high temperature.
  • the novel color coding also avoids occurrences of monochrome or polychromatic saturation of displays by which object profile and character often are obscured.
  • inventive features briefly include the provision of remote wireless monitoring via an optional pocket-sized, belt-worn transmitter 66 operatively connected to an input/output port of the microprocessor via suitable means such as a video cable, and the addition of radiometric capability, e.g. a numerical temperature readout, to apparatus 10.
  • Apparatus 10 also preferably provides an NTSC/PAL output port for the digital video signals to be provided to an external display monitor.
  • the invented apparatus represents a step-wise decrease in volume and weight in IR imaging systems, with the weight of apparatus 10 under 4 pounds and the internal and external volumes of apparatus 10 under 80 cubic inches (e.g. 71) and under 120 cubic inches (e.g. 105), respectively.
  • Such is made possible, in part, by forming housing 42 using a suitably durable but lightweight polymer preferably via suitable injection molding techniques.
  • Such volume and weight, coupled with the increase in functionality, will be referred to herein as high functional density. It involves miniaturizing IR imaging systems to render them more self-contained, portable, low-cost, etc., and results in a significant utility improvement.
  • apparatus 10 weighs less than 4 pounds, making it extremely portable and comfortably wearable by a user. This extremely low weight renders apparatus 10 comfortable to wear, easy to transport and store on the person of the user or in a vehicle or storage area, and easily ported among users. In other words, apparatus 10 by virtue of its extremely low weight is as easy to deploy, stow and handle as a piece of clothing or an accessory, yet it is extremely valuable as a firefighting or surveillance tool.
  • Such low weight is achieved in accordance with the invention by a combination of component selection, especially in the selection of low-weight batteries, heatsinks and optical elements, and a preferably integrally molded clam-shell type housing requiring little or no hardware to seal its contents against environmental extremes such as salt or fresh water spray or airborne contaminants.
  • Another aspect of the invention is the color or other coding of images whereby temperature-representing pixels are classified into selected temperature ranges and color coded to represent particular ranges for visual image enhancement, e.g. to highlight visually those portions of the scene that may contain a living human being whose temperature is within a certain relatively low predefined range and/or to visually diminish a flaming background whose temperature is within a certain relatively high predefined range.
  • This concept is not limited to fire fighting, but is broadly applicable to IR imaging wherein a broad range of temperatures is expected and wherein an important feature within a field of view might otherwise be masked from view by color saturation due to the prevalence of extreme temperatures around the target object. For example, it may be useful in temperature-condition monitoring, law enforcement and television broadcast.
  • This aspect of the invention will be referred to herein as a high-contrast ratio visual image enhancement method.
  • Figs. 6, 7A and 7B illustrate the preferred technique by which color mapping is accomplished in apparatus 10 in accordance with the invention.
  • monochrome displays lend themselves to shape identification
  • polychrome displays lend themselves to temperature identification.
  • the invention in its preferred embodiment uses a combination of monochrome and color displays in which normal temperature ranges are presented on LCD 32 in monochrome to facilitate feature ID and extreme temperature ranges might be presented thereon in polychrome to facilitate temperature ID. In this way, background may be presented in gray-scale and highlights of particular interest may be presented in color. All such temperature zone identification, isothermal boundary imaging and color coding readily are accomplished by software and firmware operating within self-contained microprocessor 24.
  • a gray-scale image in the present invention is created on LCD 32 by mapping the image in pixels, with any particular pixel being produced on the screen by equal levels of the red, green and blue portions of an RGB multicolor signal.
  • the luminance produced by the combined RGB signal for each pixel is modulated as a function of the temperature of each portion of the sensed scene. This is done by firmware in microprocessor 24, preferably using histogram-equalization image processing. Examples of such histogram equalization used in connection with machine vision systems are found in US Patent Nos. 5,083,204 and 5,563,962, the disclosures of which are incorporated herein by reference.
  • IR camera unit 37 is radiometrically calibrated so that the image on LCD 32 accurately represents the thermal profile of the scene within the field of view of IR camera unit 37, and not just a relative temperature as found in prior art devices.
  • the calibrated signal from unit 37 is further processed to highlight selected temperature ranges in a selected color. A unique aspect of this highlighting is that the signal is mapped so that the highlighting within the selected temperature range is within a range of the selected color, as described below.
  • the graphs in Figs. 7A and 7B illustrate how the highlighting is mapped to the displayed image.
  • the RGB luminance as a function of temperature is represented in Figs. 7A and 7B as three linear regions, each a linear function of temperature.
  • This linear representation is a gross simplification, particularly when histogram equalization and automatic gain control are used, but it clarifies the color mapping routine of the present invention.
  • the luminance of the red and green signals of the RGB signal has been shifted slightly so that it is easier to distinguish the individual R, G, and B signals.
  • the equalized mapping of the RGB portion of the signal is shifted to favor one color, with compensating decreases in the corresponding portions of the RGB signal.
  • the preferred color highlighting is to emphasize one of the base components of the RGB signal, such as blue for the human target zone, and red for the extreme temperature zone.
  • the highlighting of the human temperature zone in shades of blue is shown in detail in Fig. 7B.
  • luminance of the highlighted portions of the image is maintained relative to the non-highlighted portions adjacent in temperature range to the highlighted temperature range by the compensation discussed above.
  • luminance highlighting in addition to the described color highlighting may be added, by changing the compensation routine. For example, by increasing the blue portion of the RGB signal as desired within the selected temperature range, without making any compensation to the red or green portions of the RGB signal, the relative luminance will be increased within the selected temperature range, and the portions of the image in the selected temperature range will be highlighted in ranges of blue.
  • the color highlighting of the present invention may be applied to various or multiple temperature ranges as the situation requires or allows. For firefighting and other life-threatening activities, it is believed that it is safer to highlight only a few key portions of the image, such as those representing a human target and excessive heat. Other highlighted temperature ranges may be identified based on a particular activity such as fire control, in which the image might be highlighted to show different levels of combustion, or such as fire cleanup, in which the image might be highlighted to show dangerous hotspots within walls or other structure.
  • Fig. 8 is a histogram of a hypothetical, simplified sensed scene signal, showing the percentage of the signal that is within particular temperature ranges.
  • the signal is represented as containing data only within three discrete temperature bands, corresponding to a sensed scene having emissions only within these three bands of temperatures.
  • 60% of the signal falls within a lower temperature band
  • 30% of the signal falls within a middle temperature band
  • the remaining 10% of the signal falls within an upper temperature band.
  • Conventional histogram projection mapping is shown in one of the lines in Fig. 9, showing that the signal within the lower temperature band is mapped to 60% of the available display palette, the signal within the middle temperature band is mapped to 30% of the available display palette, and the signal within an upper temperature band is mapped to the remaining 10% of the available display palette.
  • N = number of data points per scan
  • Palette: for x(Tmin) to x(Tmax), HistogramProjection(x) + ( (100 * (T(x) - T(x-1)) / Tspan) - …
  • The relationship between linear span, scene span, non-linear percentage and linear percentage is shown in Fig. 10, for a histogram gain of 75 and a minimum span of 25.
  • The graphs of the linear span and the scene span are identical above a vertical value of 25, the minimum span established in this example.
  • the formulas used are set forth below.
  • Threshold (100 in this example) = ( MinSpan * 100 ) / …
  • LinearSpan = IF( SceneSpan < MinSpan, MinSpan, SceneSpan )
  • Non-linear% = IF( SceneSpan < Threshold, IF( SceneSpan < MinSpan, 0, (SceneSpan - MinSpan) …
  • MinSpan limits the temperature resolution so that a relatively mono-thermic scene is not mapped to the entire available palette. If such a scene were so mapped, slightly different temperatures might be mapped to very different colors or shades, making the resulting image difficult for a human viewer to interpret. It is believed that a MinSpan value of 25 is satisfactory for typical building fire situations, but other values may be used. A minimal sketch that combines histogram-projection gain with a linear gain term appears after this list.
  • the linear modification of histogram projection may also be incorporated with the highlighting discussed earlier.
  • the software and firmware within microprocessor 24 provides other advantages.
  • the firmware upon the application of power automatically uses default settings for many operating parameters of apparatus 10, with such default settings stored in non-volatile memory such as read-only memory (ROM).
  • ROM read-only memory
  • Such permanently stored default settings preferably include control settings for image temperature range, and display brightness.
  • the firmware preferably indicates to the user normal operating conditions including the elapsed time on the battery and the battery charge level.
  • the firmware preferably provides user warnings of critical operating conditions such as failed power, critical battery level (e.g. ⁇ 5 operating minutes remaining) and internal temperature alarm indicating that the opto-electronics are operating above their nominal maximum operating temperature.
  • apparatus 10 may first be used for a rescue operation, and then used for a fire control operation, in which case different temperature ranges of the scene may be highlighted in the image. Such a change might be accomplished by an external switch on housing 42, not shown, which triggers different routines within the firmware of apparatus 10.
  • reprogramming is accomplished by connecting a programming computer to the power/control/communication contacts associated with battery subsystem 26, or to software development connector 56.
  • a head-mounted camera such as IR camera 37 of apparatus 10 and a head of a user wearing such apparatus including the camera are shown schematically in overhead and side views.
  • the camera and its positioning and mounting within housing 42 (not shown in the simplified schematic diagram but understood to be in keeping with the teachings of the present application) of apparatus 10 achieves an important advantage over prior art portable thermal imaging systems.
  • a parallax problem exists in conventional systems wherein the optical axis of the camera is parallel with the line of sight of the user.
  • This problem results from the off-axis, asymmetric location of the camera relative to the eye or centerline of the eyes of the user.
  • the problem is a serious one, especially in life-threatening or hazardous situations such as firefighting.
  • In the near field of 'view' through the camera, the user has difficulty handling objects within arm's reach because the camera effectively misrepresents to the user the object's location to the extent of the vertical and lateral offset between the user's eyes and the camera's 'eye.' A user thus tends to reach for an object imaged by the camera only to grope for it where its position is falsely indicated, typically a few inches from where the object actually is located in space.
  • Apparatus 10 solves this problem by providing convergence of optical axis A, defined by optical engine 18 (of which IR camera 37 is a part), and an axis describing the user's virtual or nominal line of sight through the right eye that is viewing the scene on LCD 32. This is accomplished by mounting optical engine 18 within housing 42 such that the optical axis converges with the nominal line of sight to define a point of convergence F. It will be appreciated that, in accordance with the invention, the user's line of sight is actually to a virtual scene produced for display on a mirrored lens within apparatus 10, but that the user's nominal line-of-sight axis Au may be projected into the scene, as illustrated, for purposes of explaining another feature of the invention. Accordingly, Figs. 4A and 4B illustrate a virtual line of sight of the user and the focal point along such virtual line of sight representing the effective focal path of the user viewing a scene on such a display.
  • the angle of convergence is chosen such that convergence F of the axes occurs at a nominal arm's length in front of the user's right eye, e.g. between 2 and 4 feet, and typically approximately 3 feet away. This distance is indicated in Figs. 4A and 4B by a solid horizontal line 74 extending from the point of convergence F to the user's eye.
  • the angles of convergence between the optical axis A and line of sight LOS in the horizontal plane and in the vertical plane are indicated as 76 and 78 in Fig. 4A and 4B, respectively.
  • the optical axis of optical engine 18 is aimed down in a vertical plane by an angle of approximately 6 degrees and to the side in a horizontal plane by an angle of approximately 8 degrees, by configuring housing 42 to mount IR camera 37 at corresponding angles so that the optical axis converges with the user's line of sight from the right eye approximately 3 feet in front of that eye (the simple trigonometry behind these angles is sketched after this list).
  • the invention will be understood to include an apparatus and a method of representing a thermal image in a portable device, as described above.
  • the steps of the method include generating an electronic signal representative of a scene in front of a user using an infrared camera, identifying target portions of the electronic signal that represent portions of the scene that are within an identified target temperature range, and mapping the electronic signal to display a color image of the scene.
  • the target portions of the electronic signal are mapped in a color range that is visually distinct from all other portions of the color image.
  • the method may also include the steps of identifying avoidance portions of the electronic signal that represent portions of the scene that are above an identified avoidance temperature; mapping the avoidance portions of the electronic signal to the color image in a color range that is visually distinct from all other portions of the color image, and mapping those portions of the electronic signal that do not represent target and/or avoidance portions to the color image in a neutral color range.
  • this is done by producing a multicolor RGB signal representative of the image, and emphasizing at least one color of the multicolor signal.
  • Other colors of the multicolor signal may be de-emphasized so that the relative luminance of the target portions of the image remain approximately equivalent to the relative luminance of portions of the image that represent portions of the scene that are near to the identified temperature target range.
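As a rough check on the heatsink figures above, the short sketch below (in Python) estimates the paraffin mass implied by the stated 3100 calories of latent-heat storage and how long that storage could absorb a steady electronics heat load. The heat of fusion and the assumed dissipation are illustrative values, not taken from the patent.

```python
CAL_TO_J = 4.184

latent_heat_cal = 3100.0            # thermal storage stated in the text
heat_of_fusion_cal_per_g = 55.0     # assumed, typical of long-chain paraffins
electronics_load_w = 5.0            # assumed steady dissipation plus environmental soak-in

paraffin_mass_g = latent_heat_cal / heat_of_fusion_cal_per_g
buffer_time_min = latent_heat_cal * CAL_TO_J / electronics_load_w / 60.0

print(f"~{paraffin_mass_g:.0f} g of paraffin, buffering ~{buffer_time_min:.0f} min at {electronics_load_w} W")
# -> ~56 g of paraffin, buffering ~43 min at 5.0 W
```

Under these assumptions the phase-change material alone buffers heat for a period on the order of the one-hour battery runtime quoted earlier, consistent with the heatsink being intended to hold the detector below 80°C at least for a limited time.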
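The per-pixel color coding described above (blue for possible rescue targets, gray for the intermediate band, red for avoidance zones, with a compensating reduction of the other components so luminance stays roughly constant) can be sketched as follows. The band limits come from the text; the function name, the size of the shift and the 8-bit scale are assumptions.

```python
def colorize_pixel(temp_c, gray_level, shift=60):
    """Map one pixel to an (R, G, B) triple on a 0-255 scale.

    gray_level is the palette index already produced by the gain mapping;
    equal R, G and B values would give a plain gray-scale pixel.
    """
    r = g = b = gray_level
    if 25.0 <= temp_c < 100.0:
        # Possible rescue target: emphasize blue, and compensate red and
        # green downward so overall luminance stays roughly constant.
        b = min(255, b + shift)
        r = max(0, r - shift // 2)
        g = max(0, g - shift // 2)
    elif temp_c > 600.0:
        # Avoidance zone: emphasize red the same way.
        r = min(255, r + shift)
        g = max(0, g - shift // 2)
        b = max(0, b - shift // 2)
    # 100-600 deg C is left as neutral gray, de-emphasizing the mid range.
    return r, g, b
```

Omitting the two compensating decreases, as the text notes, would add luminance highlighting on top of the color highlighting.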
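Below is a minimal sketch of combining histogram-projection gain with a linear gain term, as discussed above. The blend weighting, the helper names and the use of NumPy are assumptions (the exact expression in the source is only partially legible); the MinSpan guard and the example values of a histogram gain of 75 and a minimum span of 25 come from the text.

```python
import numpy as np

def map_scene_to_palette(temps_c, palette_size=256,
                         hist_gain_pct=75, min_span=25, n_bins=256):
    t_min, t_max = float(temps_c.min()), float(temps_c.max())
    scene_span = t_max - t_min
    # MinSpan guard: a nearly mono-thermic scene is not stretched across
    # the whole palette.
    linear_span = max(scene_span, float(min_span))

    # Linear gain: palette position proportional to temperature.
    linear = (temps_c - t_min) / linear_span

    # Histogram-projection gain as described in the text: each temperature
    # band receives a share of the palette proportional to the fraction of
    # the scene signal within it, i.e. the empirical CDF.
    counts, edges = np.histogram(temps_c, bins=n_bins, range=(t_min, t_max))
    cdf = np.cumsum(counts) / counts.sum()
    bin_idx = np.clip(np.searchsorted(edges, temps_c, side="right") - 1,
                      0, n_bins - 1)
    hist_proj = cdf[bin_idx]

    # Blend the two gains; hist_gain_pct plays the role of the histogram gain.
    alpha = hist_gain_pct / 100.0
    blended = alpha * hist_proj + (1.0 - alpha) * linear
    return np.clip((blended * (palette_size - 1)).astype(int),
                   0, palette_size - 1)
```

For the three-band example of Fig. 8, the pure histogram-projection term allocates 60%, 30% and 10% of the palette to the lower, middle and upper bands, while the linear term pulls the mapping back toward proportionality with temperature.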
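The convergence geometry described above reduces to simple trigonometry: each aim angle of the camera is the arctangent of its offset from the viewing eye divided by the convergence distance. The offsets below are illustrative assumptions chosen to reproduce the approximate 6 and 8 degree angles stated in the text; only the roughly 3-foot convergence distance is taken from the source.

```python
import math

lateral_offset_ft = 0.42    # assumed: camera mounted to the side of the right eye
vertical_offset_ft = 0.32   # assumed: camera mounted above the right eye
convergence_ft = 3.0        # nominal arm's-length convergence distance (from the text)

# Tilt angles that make the camera's optical axis meet the user's line of
# sight at the convergence point.
horizontal_deg = math.degrees(math.atan2(lateral_offset_ft, convergence_ft))
vertical_deg = math.degrees(math.atan2(vertical_offset_ft, convergence_ft))
print(f"aim sideways {horizontal_deg:.1f} deg, down {vertical_deg:.1f} deg")
# -> aim sideways 8.0 deg, down 6.1 deg
```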

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Radiation Pyrometers (AREA)

Abstract

A head-up display, an infrared (IR) camera (18) and electronics are integrated into a portable, wrap-around, face-worn vision-enhancement apparatus (10) useful in environments of dense air-borne particulate and thermal extremes. Reflective and opaque lenses (38) are provided at eye level for IR vision display and blinding purposes, respectively. The IR camera is integral with the wrap-around system. An optical axis of the IR camera and an axis describing the user's virtual line of sight through the viewing eye converge at a nominal arm's length in front of the user's viewing eye. The imager is preferably an un-cooled focal plane array (20), and associated imaging, storing, processing and displaying electronics are cooled using an integral plural phase heatsink. The apparatus is separate from, but compatible with, helmets and SCBA gear; it can be installed and removed by an individual user. Also provided is enhanced vision via color-coded temperature banding.

Description

EXTREME TEMPERATURE RADIOMETRY AND IMAGING APPARATUS
Technical Field
The present invention relates generally to hazard-avoidance and blind vision equipment for use in extreme temperatures. More particularly, it concerns a self-contained, portable, easily deployed, helmet-mounted thermal or infrared (IR) imaging system capable of unobtrusively expanding the view of users such as fire fighters by providing head-up IR detection and imaging of a scene in front of a user, displaying an image that eliminates or sees through obscurants such as darkness, smoke and particulate that may otherwise blind the user.
Background Art
Fire fighting is extremely hazardous and demanding because of extreme temperatures and obscurants that can blind or disable a fire fighter from locating the fire's source or human beings at risk within a burning building. When there are no visible flames, e.g. when alcohol, hydrogen, hydrocarbons, etc. burn, there can be lethally high temperatures caused by gases that burn without visible ignition or flaming. Whether there are visible or invisible flames, nevertheless there can be dense smoke or airborne particulate that makes normal vision impossible. At night or in dark locations, even without extremely high temperatures and even without obscurants, vision is essential to containing a fire or saving a life.
Conventionally, infrared (IR) vision subsystems for fire fighters have been bulky and integrated with other fire fighting protective equipment worn by fire fighters. They also typically have required an umbilical cord to equipment worn on the body of the fire fighter. Typically, IR equipment is connected with protective body gear referred to herein as a bunker suit typically including or augmented by self-contained breathing apparatus (SCBA).
Other vision systems for fire detection are not designed for hands- free operation as is required of a system used by firefighters that must enter the scene of the fire. For example, US Patent Nos. 5,422,484 and 5,726,632, the disclosures of which are incorporated herein by reference, disclose various handheld or pedestal-mounted flame sensors.
So-called night vision systems relying on IR detection and imaging often are useless in the presence within the detector's field of view of such extreme temperatures that the location of a human being or animal, for example, in a burning building goes undetected by a display phenomenon called blooming whereby a high-temperature gas cloud is represented by a color, e.g. white, that tends to wash out critical detail such as a low-temperature human form represented in another area of the display by a different gray scale. Effectively, the high-temperature cloud within view of the IR detector bleaches out needed detail in another area of the display, such as that of a human form. For example, the video systems of US Patent No. 5,200,827, the disclosures of which are incorporated herein by reference, do not address these problems unique to the firefighting and rescue fields.
Disclosure of the Invention
A head-up display, an infrared (IR) camera and associated electronics including power are integrated into portable, self-contained, wraparound, face-worn vision-enhancement apparatus useful in environments of dense air-borne particulate and thermal extremes such as encountered in fire fighting situations, in accordance with the invention. Reflective and opaque expanses or lenses are provided in front of a user's eyes at approximately eye level for IR vision display and blinding purposes, respectively, to produce a clear bright picture or image representing a detected and imaged scene viewed by the camera otherwise obscured by darkness or obscurants. The IR camera is integral with the wrap-around system along with a self-contained power supply so that the system is portable and requires no umbilical cord or other external connections. The imager is preferably an uncooled focal plane array. Associated imaging, storing, processing and displaying electronics are cooled in the extreme thermal environment using an integral plural phase heatsink, to protect elements of the apparatus from environmental heat.
The apparatus is separate from, but compatible with, helmets and
SCBA and attire worn by the user so that it can easily be installed and removed by an individual user. Extended hands-free operation is provided in a lightweight package providing enhanced vision via color-coded temperature banding for display purposes, the color coding being performed in microprocessor-based firmware that forms part of the electronics. The apparatus may be temporarily affixed via a clip and strap to the brim of a helmet and may be easily shifted on the user's face from its normal night-vision position to a temporary stowed position in front of the forehead whereby the user has virtually unobstructed binocular vision. The intended use of the apparatus is for firefighting and fire rescue, but it is believed to have application in other rescue and adverse conditions, associated with vehicle accidents, mining accidents, and combat.
Objects and advantages of the present invention will be more readily understood after consideration of the drawings and the detailed description of the preferred embodiment which follows.
Brief Description of the Drawings
Figs. 1A and 1B are side elevations of the invented apparatus, with Fig. 1A showing the apparatus in an infrared (IR) mode of deployment and with Fig. 1B showing the apparatus in a tilted-back, direct-view mode of deployment.
Figs. 2A and 2B are isometric views of the invention, with Fig. 2A showing an exploded, partly assembled version and with Fig. 2B showing a fully assembled version, with certain portions of the imaging apparatus's housing cut away to reveal certain interior details.
Fig. 3 is an optical schematic diagram corresponding with Fig. 1A illustrating the IR optical geometry of the invention. Figs. 4A and 4B are schematic diagrams respectively showing an overhead and a lateral view of a user and a head-mounted camera, the views illustrating line-of-sight and focal axes and their convergence in front of the user.
Fig. 5 is a block diagram of the opto-electronics within the housing of the invented apparatus.
Fig. 6 is a flowchart illustrating the color mapping method used in the invented apparatus for image enhancement.
Figs 7A and 7B are graphs that illustrate the color mapping method used in the invented apparatus for image enhancement, with Fig. 7A showing an entire palette mapped, and with Fig. 7B showing a detail of a portion of Fig. 7A.
Fig. 8 is a graph of a simplified histogram showing the percentage of a sensed scene signal that is within particular temperature ranges.
Fig. 9 is a graph illustrating linear gain and conventional histogram projection gain, and illustrating three different combinations of histogram projection gain with specified percentages of linear gain, all for the sensed scene signal represented in Fig. 8.
Fig. 10 is a graph illustrating a series of calculations based on a particular sensed scene signal to determine an optimal percentage of linear gain and non-linear (histogram projection) gain.
Detailed Description of the Preferred Embodiment and Best Mode of Carrying Out the Invention
Referring first to Figs. 1A, 1B, 2A and 2B, a preferred embodiment of the invented wrap-around, head-up display apparatus is indicated generally at 10. Apparatus 10 may be seen to include a left arching region 12, a right arching region 14 and a forward arching region 16 that hovers preferably just above the user's eye level on the face. Apparatus 10 preferably includes a lightweight, preferably molded polymer housing with an interior void in which are located all essential components including an IR (thermal) optical engine 18 located adjacent forward region 16. Optical engine 18 includes an un-cooled bolometric IR detector array 20, preferably of the type described and illustrated in commonly owned U.S. Patent No. 5,554,849 entitled MICRO-BOLOMETRIC INFRARED STARING ARRAY and issued September 10, 1996, the disclosures of which are incorporated herein by reference. Array 20 produces a high-resolution, two-dimensional, temperature pixel image of a scene within its field of view. The image produced by array 20 is stored in a digital memory 22 managed by a microprocessor 24. Left region 12 includes a battery subsystem 26 for integrally powering all components. Bolometric IR detector array 20, because it is un-cooled (by which is meant it is not cryogenically cooled), produces only slight heat itself, but nevertheless is sensitive to heat in the ambient environment around it. Thus, an important contribution of the invention is a fluid heatsink 28 for removing heat from opto-electronics 30 including optical engine 18, array 20, memory 22 and processor 24. Opto-electronics 30 typically are subject to extreme environmental heat that may be produced by a fire. Accordingly, heatsink 28 is provided to ensure continuous, long-term, accurate temperature profile detection and imaging by the detector array despite the environmental extremes. Fig. 1A will be understood as showing apparatus 10 in its deployed position whereby the user is blinded in a left eye and vision-enhanced in a right eye, as described above. The choice of the right or left eye for blinding and viewing is predetermined for each apparatus 10, but it will be understood that a mirror-image version of apparatus 10 may be constructed for those users that prefer blinding the right eye, and vision-enhancing the left eye. All of the drawings and the discussion herein are for a right-eye-enhanced embodiment.
Fig. 1B corresponds directly with Fig. 1A and shows the same features as those described above but in a different orientation. Fig. 1B shows apparatus 10 in a tilted-back or stowed position in which the user is able to see relatively unobstructed and unenhanced with the naked eyes when the vision-enhancement features of apparatus 10 are not needed. It will be appreciated that the configuration of front region 16 of apparatus 10 wherein the lower edge 16a of region 16 terminates in a line just below eye-level makes it possible to tilt apparatus 10 back toward the forehead ever so slightly to afford the user a direct unenhanced and relatively unobstructed view, as indicated.
It is believed that this configuration of apparatus 10 is preferable to alternative arrangements whereby, for example, a visor section flips up away from the frame on a hinge along the top or is removable or whereby the left lens is rendered transparent for direct viewing with the left eye or whereby the right and/or left lens is rendered only translucent such that an IR image and direct view are superimposed within the view of the user. The lens hinge or removal configurations over which apparatus 10 is believed to represent an improvement require a hinge or connection between the active display surface and the frame, thus potentially disturbing or destabilizing the optical path from the IR camera to the user's eye.
The translucent display lens configuration over which apparatus 10 distinguishes itself is known to cause eye-strain and confusion in many users due to the superposition of often slightly conflicting images. Confusion from superimposed images is unavoidable, since the natures of infrared energy and visible spectral energy are by definition different, and amorphous object or target boundaries result in confusion when the different images are superimposed. Often the infrared image will lag the direct image, for example, as the user's head is turned. Invented apparatus 10 avoids these problems by going against conventional wisdom in head-up IR displays and provides the user with the option of choosing to view a scene in either monocular IR or in binocular direct view modes of operation by a simple tilt or rotate of the apparatus about the user's ears.
Referring next to Fig. 3, the optical imaging technique used in accordance with the invention is illustrated schematically. Optical engine 18 may be seen in a slightly different form here to include array 20, a preferably back-lit liquid crystal display (LCD) 32 providing preferably at least 320 X 240 pixel resolution with a minimum 6-bit gray scale, a partially (preferably 50%) reflective planar mirror or mirrored lens 34 that turns the LCD image onto a focusing curved 100% reflective mirrored surface 36 that reflects the 50% intensity image back through 50% reflective surface 34 into the firefighter's eye. The display expanse may be viewed by looking through mirror 34 at focusing mirror 36, providing an approximately 25% polychromatic spectral energy efficiency, IR-representative field of view below which the firefighter may view the scene directly as indicated. Optical engine 18 also includes an IR camera unit 37 mounted as better illustrated in Figs. 4A and 4B to intercept a frontal infrared scene along its focal axis. The objective lens optical components within optical engine 18 preferably meet the F 1.3 optical standard in operation. The objective lens preferably is a 1" diameter lens having a 30° azimuth (horizontal) field of view for wide-angle scene imaging. The lens also preferably is transmissive of IR energy in the 8 to 12 micron spectral bandwidth. The focus range of the lens is preferably set to 2 to 24 feet for normal viewing, but may be manually adjusted to a 100-foot optimum focus distance.
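As a quick check of the quoted display efficiency, the beam-splitter path just described loses half the light at each pass through the 50% mirror. A short sketch of that arithmetic follows; the mirror reflectances are the values given in the text.

```python
# Light path: LCD -> 50% mirror (reflects half) -> 100% curved mirror
# (reflects all) -> back through the 50% mirror (transmits half) -> eye.
reflect_first_pass = 0.50
curved_mirror_reflectance = 1.00
transmit_second_pass = 0.50

throughput = reflect_first_pass * curved_mirror_reflectance * transmit_second_pass
print(f"overall display throughput = {throughput:.0%}")   # -> 25%
```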
Forward region 16 of apparatus 10 thus may be seen from Figs. 1A, IB, 2A, 2B and 3 to include a curved display expanse within and extending across an upper portion of a user's right eye field of view and a 'blind' or opaque expanse 38 within the user's left eye field of view. The left eye of the user thus is preferably covered, or 'blinded.' By enabling through one eye an IR image of the fire scene and through the other eye an obstructed or 'blind' view (the parallax view resulting from a user's slightly laterally separated eyes, and the depth perception obtained from that parallax view, being relatively unimportant in this so-called night-vision environment), weight, power and cost are saved without significant compromise of the IR image.
Importantly, prior art difficulties with the user resolving a visual image through one eye and an IR image through the other, or resolving a visual image and an IR image through the same one or more eyes, are avoided. Depth distortion, whereby one image suggests a foreground scene or shorter distance to an object and a different superimposed image suggests a background scene or greater distance to an object (a distortion of visual perception that is inherent in superimposition vision systems), also is avoided. Placement of the eyeglass forward region of apparatus 10 relative to the user's head and particularly the user's face, such that its bottom edge effectively cuts across the bridge of the nose, is an important feature of the invention. Configuring apparatus 10 to achieve this strategic placement permits the user normally to view the fire scene monoscopically via LCD 32 and mirrored lens 36, and, alternatively, to view the scene stereoscopically and unaided beneath the eyeglass portion, by looking below the display expanse. Such dual mode viewing is very easily accomplished by a momentary, slight backward tilt of the user's head around which apparatus 10 wraps or by a momentary, slight backward tilt of the apparatus relative to the user's head. This invented feature will be referred to herein as bi-focality.
It will be appreciated that it is the dimension and configuration of the apparatus, and its resulting automatic positioning relative to the elevation of the user's eyes by its conformation with the bridge of the nose and the ears such that it perches at a given elevation on the user's face, that results in bifocal operation, i.e. the dual mode operation achieved by a slight tilting forward or backward of the head or the slight tilting backward or forward of the apparatus. This is perceived to represent a great advantage over prior art systems that require the user's mind to resolve simultaneous inputs to either eye (in a left-and-right bifurcated system) or both eyes (in a head-up, see-through system), one of which is unaided vision and the other of which is IR imaged vision, which superposition is believed to be confusing and potentially hazardous.
Figs. 2A and 2B show apparatus 10 in isometric view corresponding generally with Figs. 1A, IB and 3, with identical component parts identically designated by reference designators. Figs. 2A and 2B are exploded, partly assembled, and fully assembled versions of apparatus 10, respectively. Also shown in Figs. 2A and 2B is the internal configuration of various subsystems within a housing of apparatus 10. The subsystems include battery subsystem 26, opto-electronics indicated generally at 40, and a clamshell housing assembly, or simply housing, 42. It will be understood that opto-electronics 40 include optical engine 18 and electronics, to be described in more detail by reference to Fig. 5, most or all of which are mounted within housing 42.
The subsystems listed above that form a part of apparatus 10 will be understood to be operatively connected as suggested by Figs. 2A and 2B. For example, battery subsystem 26 provides regulated DC power to optoelectronics 40 within housing 42 via one or more electrical conductors 44 that route power and ground, as well as control and communication signals between the two subsystems and more particularly to electronics 46 of opto-electronics 40. Preferably, battery subsystem 26 includes a Sanyo HR-4/3FAU or Panasonic HHR-450AB01 battery.
Battery subsystem 26 may also include circuitry, not shown, that monitors usage of battery 26 so that warning messages may be produced if the remaining charge of battery 26 has dropped below a certain level, or if battery 26 has been subjected to too many charge cycles or excessive temperature. The interactive nature of battery 26 is indicated in Fig. 5 by the control and communication signal that leads to and from battery 26. The plurality of contacts 44 shown on battery subsystem 26 in Fig. 2A allow for the transmission of the power, control and communication signals from and to battery subsystem 26.
It will also be understood that battery subsystem 26 preferably is mounted to housing 42 in such a manner that it can be easily and quickly removed for maintenance, repair or replacement with a fresh, fully charged battery subsystem. Finally, it will be appreciated from Fig. 2B that preferably substantially all of opto-electronics 40, including the integral display that enables a user to 'see' through obscurants such as smoke or to 'see' in the absence of light, are contained within housing 42. This holds true except for those insubstantial portions that extend from the housing, such as the forward region of optical engine 18 including forward portions of IR camera unit 37.
A helmet clip 42C also may be seen from the drawings to extend slightly forward and above the upper front edge of housing 42. Other quick release mechanisms may be used to attach apparatus 10 to a protective helmet of the type typically worn by firefighters. A headband or strap B, shown in Fig. 1A, may be attached to apparatus 10 for additional support, at eyelets 42E. It is intended but not essential that apparatus 10 may be passed between firefighters as needed, while the firefighters are fully clothed in typical protective gear including hoods and gloves. The preferred horseshoe shape of housing 42 is designed, in part, to ease handling of apparatus 10 by a gloved hand. The horseshoe shape is defined by left arching region 12 and right arching region 14 (the legs of the horseshoe) interconnected by front region 16. Legs 12 and 14 function as carrying handles for apparatus 10, if needed. Legs 12 and 14 even allow a firefighter to hold apparatus 10 in a viewing position without attaching apparatus 10 to a helmet or strapping it to a user's head. This may be particularly useful if apparatus 10 is passed frequently between firefighters in the midst of a fire, or if apparatus 10 or a firefighter's helmet becomes dislodged or structurally damaged.

Opto-electronics 40 including electronics 46 will be described now by reference to the schematic block diagram of Fig. 5, which, for completeness, also includes battery subsystem 26 and external options to be described. Detector array 20 preferably is mounted on a camera/buffer printed circuit board (PCB) 48 which includes digital memory 22 for buffering digital scenery data obtained by the optical engine. Optical engine 18 and battery subsystem 26 counterbalance one another along the legs.
Heatsink, or phase change module, 28 will be understood to be mounted in close physical proximity to detector 20 and other sensitive electronics mounted on camera buffer PCB 48 and microprocessor PCB 50, so as to dissipate heat radiating therefrom and to maintain the detector and the electronics within predefined limits compatible with proper operation thereof. Preferably, heatsink 28 is placed far enough back in leg 14 of housing 42 so that it counterbalances detector 20 along leg 14. Battery subsystem 26 in leg 12 further counterbalances detector 20 along leg 12, while at the same time offsetting the weight of heatsink 28 so that apparatus 10 is balanced laterally as well, in a direction extending along forward region 16.
Microprocessor 24 preferably is separately mounted on a microprocessor PCB 50 located nearby so that timing, control, gain, image data and power are shared between the PCBs. Optical engine 18 preferably includes an NUC shutter 52 and IR optics 54; drive and heat signals are routed to shutter 52 from camera/buffer PCB 48 as shown. An optional software development connector 56 may be provided as shown in Fig. 5 that facilitates future software development and upgrades that may be implemented in the form of programmable read-only memory (PROM) that preferably is an integral part of microprocessor 24.
Microprocessor PCB 50, with camera/buffer PCB 48 preferably mounted thereon, is mounted on a printed circuit motherboard/display board 58 and routes power, audio, control and digital LCD video signals therebetween. Board 58 also mounts LCD backlight electronics 60, LCD 32 and display optics 62 as shown. Board 58 provides power and control via a power distribution board 64 to battery subsystem 26. Optionally, a video transmitter 66 or video recorder 68 or both may be supported as external options to apparatus 10 via a provided NTSC/PAL video (RS-170) input/output port mounted on motherboard/display board 58. It will be appreciated by those skilled in the art that other external options may be provided, within the spirit and scope of the invention. However, current implementation of these added options may seriously limit the portability and exchangeability of apparatus 10 between firefighters.
Battery subsystem 26 is an important contributor to the portability and high functional density of apparatus 10. Battery subsystem 26 includes a switch 70 and a light-emitting diode (LED) 72 for switching power on and off in apparatus 10 and for indicating when power is on. It will be appreciated that these are battery-save features of apparatus 10 intended to extend its useful operating life without the user having to replace the battery subsystem.
In accordance with a preferred embodiment of the invention, battery subsystem 26 provides power conversion from battery voltage to the regulated +3.3 volts direct current (VDC), +5 VDC and +12 VDC required by opto-electronics 40. It also provides sufficient holdover (internal capacitance) to support low-power operation of apparatus 10 when the battery is unexpectedly removed. The battery that forms a preferably lightweight, low-cost, removable and rechargeable part of battery subsystem 26 provides, when fully charged, a minimum of 1 hour's operation when new, and a minimum of 40 minutes' operation after 500 charge/discharge cycles. Importantly, the battery contacts disconnect prior to external environmental exposure when installing and removing the battery into and from housing 42, so that electrical potential is never exposed to a possibly explosive atmosphere. This is accomplished via the mechanical design of the mounting structure and seal configurations.
Self-contained, sealed, liquid or plural-phase heatsink 28 may take any suitable form, but in accordance with the preferred embodiment of the invention, may be thought of as a plural-phase heatsink that by its solid/fluid character may be contained within a finite volume over its operating curve. Importantly, only a self-contained system enables a firefighter to easily employ and deploy such a vision/display system in a fire without a restraining umbilical, for example, to a separate coolant source. The use of a high-temperature-range plural phase polymer as a heatsink material avoids exhaust problems or the removal of high-temperature by-products, e.g. the steam produced by heating water.
Heatsink 28 is low-mass, including its sealed container and material contents, and provides for the thermal storage of 3100 calories of heat of fusion at a 65°C (149°F) to 75°C (167°F) melting point. Heatsink 28 in its preferred embodiment utilizes an organic paraffin, such as beeswax or n-hexatriacontane, C36H74. Organic paraffins typically have high heats of fusion per unit weight, melt homogeneously, and do not supercool.
One particular heatsink material believed to work well is available from Le Technologies, Inc., Hillsboro, Oregon, as product 04000850. It includes approximately 80% or less modified paraffin wax, CAS 64742-51-4, up to 25% amide wax, CAS 13276-08-9, up to 25% ethylene vinyl acetate copolymer, CAS 24937-78-8, and up to 3% antioxidant, CAS 10081-67-1. Many other phase-change materials might be used, including fatty acids, salt hydrates, fused salt hydrates, and metallic eutectic compounds. Heatsink 28 is intended to maintain the IR detector array hot side temperature below 80°C under all rated environmental conditions, at least for a limited time.
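As a rough, hypothetical illustration of the storage figure quoted above, the required mass of phase-change material can be estimated from its heat of fusion; the heat-of-fusion value in the sketch below is an assumed typical figure for organic paraffins, not a value taken from this description.

    # Hypothetical back-of-envelope estimate of phase-change material mass.
    # 3100 cal of latent-heat storage is the stated target; ~50 cal/g is an
    # assumed, typical heat of fusion for organic paraffins (not a patent value).
    stored_heat_cal = 3100.0
    assumed_heat_of_fusion_cal_per_g = 50.0

    required_mass_g = stored_heat_cal / assumed_heat_of_fusion_cal_per_g
    print(f"roughly {required_mass_g:.0f} g of paraffin")  # on the order of 60 g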
The use of the invented liquid or plural-phase heatsink permits apparatus 10 to operate usefully, depending upon the ambient temperature of the environment in which it is used, over varying periods of time. For example, apparatus 10 may be operated indefinitely without interruption in ambient temperatures -10°C < TA < 30°C (14°F < TA < 86°F); up to 2 hours at TA = 40°C (104°F); one hour at TA = 50°C (~120°F); twenty minutes at TA = 80°C (176°F); ten minutes at TA = 100°C (212°F); five minutes at TA = 150°C (302°F); and two minutes at TA = 315°C (~600°F).
Other important features of the invention that complement the self-contained, head-mount features of vision/display system 10 include its ergonomics and colorized display. Ergonomically speaking, apparatus 10 is easily employed and deployed by simply slipping it onto the face and over the ears, and perhaps by securing it with a clip 42C and a band B that extends over the brim or bill of the firefighter's helmet, as shown best in Fig. 1A, with no connections to other equipment being required. It will be appreciated also that apparatus 10 is dimensioned and configured to avoid interference with other gear such as typically may be worn by users of the apparatus, e.g. helmets, respirator or gas masks such as SCBA, and attire. Thus the size and shape of apparatus 10 is designed for more than low weight or volume; it also is sized and shaped to conform to and extend around an average user's head and face at approximately eye level, while not extending laterally around the head of the user any more than is necessary, or radially therefrom by more than approximately 3 inches.
The colorization of the IR display is of great benefit in avoiding temperature extrema which tend to saturate the image field. It will be appreciated that apparatus 10 has a very wide thermal dynamic range that enables it to accurately survey scenes having temperatures ranging between 0°C < Ts < 815°C (32°F < Ts < 1500°F). Where extreme temperature ranges are present, it is difficult in monochrome or color display systems to differentiate extremely high temperatures, e.g. a gaseous, flammable vapor that may be several hundred degrees Centigrade, from relatively low temperatures, e.g. a living human being, the surface temperature of which typically is under forty degrees Centigrade.
Color coding of temperature ranges, via microprocessor 24 in cooperation with one or more image buffers in memory 22, may, for example, represent dangerously high avoidance zones having avoidance temperature ranges, such as fires, e.g. Ts1 > 600°C, in shades of red; intermediate temperature ranges, e.g. 100°C < Ts2 < 600°C, in shades of gray; and relatively low target temperature ranges that might contain rescue targets such as human beings, e.g. 25°C < Ts3 < 100°C, in shades of blue. This preferred color coding allows a user to readily distinguish hazardous temperature zones from 'safe' target temperatures, which may be targets of particular interest to the firefighter. The representation of intermediate temperature ranges in shades of gray de-emphasizes those zones of the scene that are normally of little interest to a firefighter, because the zones are of too high a temperature to be a rescue target, and too low a temperature to be a threat to the protective gear used by the firefighter.
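A minimal sketch of this zone coding, using the example thresholds above, is given below; the particular RGB shade values and the treatment of temperatures outside the named ranges are illustrative assumptions only.

    # Illustrative per-pixel zone coding: red shades for avoidance temperatures,
    # blue shades for rescue-target temperatures, neutral gray otherwise.
    def zone_color(temp_c, level):
        # 'level' is an assumed 0..255 shade derived from the pixel's equalized luminance
        if temp_c > 600.0:                # avoidance zone, e.g. flame or hot gas
            return (level, 0, 0)          # shades of red
        if 25.0 <= temp_c <= 100.0:       # target zone, e.g. a human being
            return (0, 0, level)          # shades of blue
        return (level, level, level)      # other temperatures: neutral gray

    print(zone_color(temp_c=36.0, level=180))   # (0, 0, 180): a possible rescue target
    print(zone_color(temp_c=700.0, level=220))  # (220, 0, 0): an avoidance zone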
Some other neutral color may be used instead of, or in addition to, gray for the representation of the intermediate temperature ranges, such as brown or copper. Similarly, some color other than red or blue may be used for the target and avoidance temperature ranges, provided, preferably, that the colors of the target and/or avoidance portions are visually distinct from all other portions of the color image. Red is believed to readily distinguish those portions of the scene that are at a dangerously high temperature. The novel color coding also avoids occurrences of monochrome or polychromatic saturation of displays by which object profile and character often are obscured.

Other inventive features briefly include the provision of remote wireless monitoring via an optional pocket-sized, belt-worn transmitter 66 operatively connected to an input/output port of the microprocessor via suitable means such as a video cable, and the addition of radiometric capability, e.g. numerical temperature readout, to apparatus 10. The latter feature requires calibration, which is facilitated in accordance with the preferred embodiment of the invention by the fact that bolometric detectors are relatively easier to calibrate than are prior art ferro-electric detectors or cryogenically cooled elements. Apparatus 10 also preferably provides an NTSC/PAL output port for the digital video signals to be provided to an external display monitor.
The invented apparatus represents a step-wise decrease in volume and weight in IR imaging systems, with the weight of apparatus 10 under 4 pounds and the internal and external volumes of apparatus 10 under 80, e.g. 71, cubic inches and 120, e.g. 105, cubic inches, respectively. Such is made possible, in part, by forming housing 42 using a suitably durable but lightweight polymer, preferably via suitable injection molding techniques. Such volume and weight, coupled with the increase in functionality, will be referred to herein as high functional density. It involves miniaturizing IR imaging systems to render them more self-contained, portable, low-cost, etc., and results in a significant utility improvement.
In accordance with a preferred embodiment of the invention, apparatus 10 weighs less than 4 pounds, making it extremely portable and comfortably wearable by a user. This extremely low weight renders apparatus 10 comfortable to wear, easy to transport and store on the person of the user or in a vehicle or storage area, and easily ported among users. In other words, apparatus 10 by virtue of its extremely low weight is as easy to deploy, stow and handle as a piece of clothing or an accessory, yet it is extremely valuable as a firefighting or surveillance tool. Such low weight is achieved in accordance with the invention by a combination of component selection, especially in the selection of low-weight batteries, heatsinks and optical elements, and a preferably integrally molded clam-shell type housing requiring little or no hardware to seal its contents against environmental extremes such as salt or fresh water spray or airborne contaminants.

Another aspect of the invention is the color or other coding of images whereby temperature-representing pixels are classified into selected temperature ranges and color coded to represent particular ranges for visual image enhancement, e.g. to highlight visually those portions of the scene that may contain a living human being whose temperature is within a certain relatively low predefined range and/or to visually diminish a flaming background whose temperature is within a certain relatively high predefined range. This concept is not limited to fire fighting, but is broadly applicable to IR imaging wherein a broad range of temperatures is expected and wherein an important feature within a field of view might otherwise be masked from view by color saturation due to the prevalence of extreme temperatures around the target object. For example, it may be useful in temperature-condition monitoring, law enforcement and television broadcast. This aspect of the invention will be referred to herein as a high-contrast ratio visual image enhancement method.
Figs. 6, 7A and 7B illustrate the preferred technique by which color mapping is accomplished in apparatus 10 in accordance with the invention. It is noted in this connection that monochrome displays lend themselves to shape identification, whereas polychrome displays lend themselves to temperature identification. Thus, the invention in its preferred embodiment uses a combination of monochrome and color displays in which normal temperature ranges are presented on LCD 32 in monochrome to facilitate feature identification and extreme temperature ranges may be presented thereon in polychrome to facilitate temperature identification. In this way, background may be presented in gray-scale and highlights of particular interest may be presented in color. All such temperature zone identification, isothermal boundary imaging and color coding readily are accomplished by software and firmware operating within self-contained microprocessor 24.

A gray-scale image in the present invention is created on LCD 32 by mapping the image in pixels, with any particular pixel being produced on the screen by equal levels of the red, green and blue portions of an RGB multicolor signal. The luminance produced by the combined RGB signal for each pixel is modulated as a function of the temperature of each portion of the sensed scene. This is done by firmware in microprocessor 24, preferably using histogram-equalization image processing. Examples of such histogram equalization used in connection with machine vision systems are found in US Patent Nos. 5,083,204 and 5,563,962, the disclosures of which are incorporated herein by reference.
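A minimal sketch of this gray-scale mapping, assuming a simple global histogram equalization and an 8-bit display range (both assumptions made only for illustration; the patent's firmware is not reproduced here):

    import numpy as np

    def grayscale_equalized(temps, bins=256):
        """Map a 2-D array of sensed temperatures to equal-RGB gray levels
        using simple histogram equalization (an illustrative sketch only)."""
        hist, edges = np.histogram(temps, bins=bins)
        cdf = hist.cumsum() / hist.sum()                 # cumulative distribution, 0..1
        idx = np.clip(np.digitize(temps, edges[:-1]) - 1, 0, bins - 1)
        level = (cdf[idx] * 255).astype(np.uint8)        # equalized luminance per pixel
        return np.stack([level, level, level], axis=-1)  # R = G = B, i.e. gray

    frame = np.random.uniform(20.0, 600.0, size=(240, 320))  # stand-in for a sensed scene
    rgb = grayscale_equalized(frame)                          # 240 x 320 x 3 gray image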
Preferably, IR camera unit 37 is radiometrically calibrated so that the image on LCD 32 accurately represents the thermal profile of the scene within the field of view of IR camera unit 37, and not just relative temperature as found in prior art devices. Optionally, the calibrated signal from unit 37 is further processed to highlight selected temperature ranges in a selected color. A unique aspect of this highlighting is that the signal is mapped so that the highlighting within the selected temperature range is within a range of the selected color, as described below.
The graphs in Figs. 7A and 7B illustrate how the highlighting is mapped to the displayed image. The RGB luminance as a function of temperature is represented in Figs. 7A and 7B as three linear regions, each a linear function of temperature. This linear representation is a gross simplification, particularly when histogram equalization and automatic gain control are used, but it clarifies the color mapping routine of the present invention. The luminance curves of the red and green signals of the RGB signal have been shifted slightly in the figures so that it is easier to distinguish the individual R, G, and B signals.
Within a particular temperature range, such as near human body temperature, or at extreme temperatures (above 550°C in Fig. 7A), the equalized mapping of the RGB signal is shifted to favor one color, with compensating decreases in the remaining portions of the RGB signal. The preferred color highlighting is to emphasize one of the base components of the RGB signal, such as blue for the human target zone and red for the extreme temperature zone. The highlighting of the human temperature zone in shades of blue is shown in detail in Fig. 7B.
The luminance of the highlighted portions of the image is maintained relative to the non-highlighted portions adjacent in temperature range to the highlighted temperature range by the compensation discussed above. However, luminance highlighting in addition to the described color highlighting may be added by changing the compensation routine. For example, by increasing the blue portion of the RGB signal as desired within the selected temperature range, without making any compensation to the red or green portions of the RGB signal, the relative luminance will be increased within the selected temperature range, and the portions of the image in the selected temperature range will be highlighted in ranges of blue.
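A minimal sketch of the compensated highlighting just described is given below; the specific shift amount and the even split of the compensation between red and green are illustrative assumptions, not values stated in this description.

    def highlight_blue(r, g, b, shift=40):
        """Shift an equal-RGB gray pixel toward blue while keeping the summed
        luminance roughly constant (illustrative sketch; assumed shift value)."""
        b2 = min(255, b + shift)        # emphasize blue
        r2 = max(0, r - shift // 2)     # compensate by reducing red...
        g2 = max(0, g - shift // 2)     # ...and green by half the shift each
        return r2, g2, b2

    print(highlight_blue(120, 120, 120))  # (100, 100, 160): bluish, similar brightness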
It is intended that the color highlighting of the present invention may be applied to various or multiple temperature ranges as the situation requires or allows. For firefighting and other life-threatening activities, it is believed that it is safer to highlight only a few key portions of the image, such as those representing a human target and excessive heat. Other highlighted temperature ranges may be identified based on a particular activity such as fire control, in which the image might be highlighted to show different levels of combustion, or such as fire cleanup, in which the image might be highlighted to show dangerous hotspots within walls or other structure.
A unique modification of histogram projection has also been found to improve viewability of the sensed image. A simplified example of histogram projection mapping is illustrated in Figs. 8 and 9. Fig. 8 is a histogram of a hypothetical, simplified sensed scene signal, showing the percentage of the signal that is within particular temperature ranges. For the purposes of demonstration, the signal is represented as containing data only within three discrete temperature bands, corresponding to a sensed scene having emissions only within these three bands of temperatures. In this example, 60% of the signal falls within a lower temperature band, 30% of the signal falls within a middle temperature band, and the remaining 10% of the signal falls within an upper temperature band. Conventional histogram projection mapping is shown in one of the lines in Fig. 9, showing that the signal within the lower temperature band is mapped to 60% of the available display palette, the signal within the middle temperature band is mapped to 30% of the available display palette, and the signal within the upper temperature band is mapped to the remaining 10% of the available display palette.
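The conventional mapping in this example can be shown in a few lines; the sketch below simply allocates a 0-100 palette in proportion to the three band percentages above.

    # Conventional histogram projection for the three-band example above:
    # each band receives a share of the palette equal to its share of the signal.
    band_fractions = [0.60, 0.30, 0.10]   # lower, middle, upper temperature bands

    palette_ranges, start = [], 0.0
    for frac in band_fractions:
        end = start + frac * 100.0
        palette_ranges.append((start, end))
        start = end

    print(palette_ranges)  # [(0.0, 60.0), (60.0, 90.0), (90.0, 100.0)]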
It is believed that improved object identification and classification is possible if a certain percentage of linearity is imposed on the histogram projection mapping. As shown in Fig. 9, a certain percentage of the available palette is allocated to those portions of the temperature spectrum in which little or no emissions are sensed. A small percentage of linearity maps to relatively small jumps between those portions of the available palette that are used for mapping the sensed temperatures. As the percentage of linearity increases, the portion of the available palette that separates sensed temperature regions increases, thereby accentuating the differences between the bands of sensed emissions. The formulas used are set forth below, and are based on a palette that is mapped from values of 0 to 100.

    Histogram(x) = SensedSignal(x) * 100 / Sum(SensedSignal)
    N = number of data points per scan
    HistogramProjection(1) = Histogram(1)
    HistogramProjection(x) = HistogramProjection(x-1) + Histogram(x-1), for x = 2 to N
    TSpan = Tmax(sensed) - Tmin(sensed)
    Palette = 0 below Tmin(sensed); Palette = 100 above Tmax(sensed)
    Palette(x) = HistogramProjection(x) + ((100 * (T(x) - T(x-1)) / TSpan) - T(x)) * PercentLinearity,
        for x = x(Tmin) to x(Tmax)
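One way to read the palette formula above is as a blend between conventional histogram projection and a purely linear, temperature-proportional ramp, weighted by PercentLinearity; the sketch below implements that reading and should be taken as an interpretation made for illustration, not the exact firmware routine.

    def blended_palette(temps, hist_projection, t_min, t_max, percent_linearity):
        """Blend histogram projection values with a linear temperature ramp,
        normalized to a 0..100 palette (an interpretive sketch only)."""
        w = percent_linearity / 100.0
        values = []
        for t, hp in zip(temps, hist_projection):
            if t < t_min:
                values.append(0.0)
            elif t > t_max:
                values.append(100.0)
            else:
                linear = 100.0 * (t - t_min) / (t_max - t_min)
                values.append(hp + (linear - hp) * w)  # w=0: pure projection; w=1: pure ramp
        return values

    # Three samples with their conventional projection values, blended at 50% linearity:
    print(blended_palette([100, 300, 500], [60.0, 90.0, 100.0], 100, 500, 50))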
The relationship between linear span, scene span, non-linear percentage and linear percentage is shown in Fig. 10, for a histogram gain of 75 and a minimum span of 25. The graphs of the linear span and the scene span are identical above a vertical value of 25, the minimum span established in this example. The formulas used are set forth below.

    AutoHistGain = 75    (variable established by the programmer)
    MinSpan = 25         (variable established by the programmer)
    Threshold = (MinSpan * 100) / (100 - AutoHistGain) = 100

For each value of SceneSpan shown, the formulas are as follows:

    LinearSpan = IF(SceneSpan < MinSpan, MinSpan, SceneSpan)
    Non-linear% = IF(SceneSpan < Threshold,
                     IF(SceneSpan < MinSpan, 0, (SceneSpan - MinSpan) * 100 / SceneSpan),
                     (SceneSpan * AutoHistGain) / SceneSpan)
    Linear% = 100 - Non-linear%

By processing a sensed scene signal using these formulas, the software and firmware within microprocessor 24 can adjust the amount of linear and non-linear gain as a function of the sensed scene signal. This allows optimal viewing under various conditions. The variable AutoHistGain sets the maximum percentage of conventional histogram projection (non-linear) gain that may be applied to a sensed scene signal.
The variable MinSpan limits the temperature resolution so that a relatively mono-thermic scene is not mapped to the entire available palette. If such a scene were so mapped, slightly different temperatures might be mapped to very different colors or shades, making the resulting image difficult for a human viewer to interpret. It is believed that a MinSpan value of 25 is satisfactory for typical building fire situations, but other values may be used.
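A minimal sketch of the span and gain formulas above, using the example values AutoHistGain = 75 and MinSpan = 25 (the sample scene spans printed at the end are arbitrary, chosen only to exercise the three branches):

    AUTO_HIST_GAIN = 75.0   # maximum non-linear (histogram projection) gain, in percent
    MIN_SPAN = 25.0         # minimum temperature span mapped to the palette

    THRESHOLD = (MIN_SPAN * 100.0) / (100.0 - AUTO_HIST_GAIN)   # = 100 for these values

    def span_and_gain(scene_span):
        """Return (LinearSpan, Non-linear%, Linear%) per the formulas above."""
        linear_span = max(scene_span, MIN_SPAN)
        if scene_span < THRESHOLD:
            if scene_span < MIN_SPAN:
                non_linear_pct = 0.0
            else:
                non_linear_pct = (scene_span - MIN_SPAN) * 100.0 / scene_span
        else:
            non_linear_pct = AUTO_HIST_GAIN
        return linear_span, non_linear_pct, 100.0 - non_linear_pct

    for span in (10.0, 50.0, 200.0):
        print(span, span_and_gain(span))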
The linear modification of histogram projection may also be incorporated with the highlighting discussed earlier.

The software and firmware within microprocessor 24 provide other advantages. Preferably, upon the application of power, the firmware automatically uses default settings for many operating parameters of apparatus 10, with such default settings stored in non-volatile memory such as read-only memory (ROM). Such permanently stored default settings preferably include control settings for image temperature range and display brightness. The firmware preferably indicates to the user normal operating conditions including the elapsed time on the battery and the battery charge level. The firmware preferably provides user warnings of critical operating conditions such as failed power, critical battery level (e.g. fewer than 5 operating minutes remaining) and an internal temperature alarm indicating that the opto-electronics are operating above their nominal maximum operating temperature.
Many of the system parameters may be reprogrammed. For example, apparatus 10 may first be used for a rescue operation, and then used for a fire control operation, in which case different temperature ranges of the scene may be highlighted in the image. Such a change might be accomplished by an external switch on housing 42, not shown, which triggers different routines within the firmware of apparatus 10. Currently, reprogramming is accomplished by connecting a programming computer to the power/control/communication contacts associated with battery subsystem 26, or to software development connector 56.
Referring now to Figs. 4A and 4B, a head-mounted camera such as IR camera 37 of apparatus 10 and a head of a user wearing such apparatus including the camera are shown schematically in overhead and side views. The camera and its positioning and mounting within housing 42 (not shown in the simplified schematic diagram but understood to be in keeping with the teachings of the present application) of apparatus 10 achieves an important advantage over prior art portable thermal imaging systems. A parallax problem exists in conventional systems wherein the optical axis of the camera is parallel with the line of sight of the user.
This problem results from the off-axis, asymmetric location of the camera relative to the eye or centerline of the eyes of the user. The problem is a serious one, especially in life-threatening or hazardous situations such as firefighting. In the near field of 'view' through the camera, the user has difficulty handling objects within arm's reach because the camera effectively misrepresents to the user the object's location to the extent of the vertical and lateral offset between the user's eyes and the camera's 'eye.' A user thus tends to reach for an object imaged by the camera only to grope for it where its position is falsely indicated, typically a few inches from where the object actually is located in space.
Apparatus 10 solves this problem by providing convergence of optical axis A defined by optical engine 18 (of which IR camera 37 is a part) and an axis describing the user's virtual or nominal line of sight through the right eye that is viewing the scene on LCD 32. This is accomplished by mounting optical engine 18 within housing 42 such that the optical axis converges with the nominal line of sight to define a point of convergence F. It will be appreciated that, in accordance with the invention, the user's line of sight is actually to a virtual scene produced for display on a mirrored lens within apparatus 10, but that the user's nominal line-of-sight axis Au may be projected into the scene, as illustrated, for purposes of explaining another feature of the invention. Accordingly, Figs. 4A and 4B illustrate a virtual line of sight of the user and the focal point along such virtual line of sight representing the effective focal path of the user viewing a scene on such a display. The angle of convergence is chosen such that convergence F of the axes occurs at a nominal arm's length in front of the user's right eye, e.g. between 2 and 4 feet, and typically approximately 3 feet away. This distance is indicated in Figs. 4A and 4B by a solid horizontal line 74 extending between the point of convergence F and the user's eye. The angles of convergence between the optical axis A and the line of sight LOS in the horizontal plane and in the vertical plane are indicated as 76 and 78 in Figs. 4A and 4B, respectively.
Objects within the user's grasp will be found where indicated by the image projected on LCD 32 because the camera is targeting and imaging the object with the angular offset that overcomes the above problem, typically in a range of between approximately 4 and 10 degrees. In accordance with a preferred embodiment of the invention, the optical axis of optical engine 18 is aimed down in a vertical plane by an angle of approximately 6 degrees and to the side in a horizontal plane by an angle of approximately 8 degrees, by configuring housing 42 to mount IR camera 37 at corresponding angles so that the optical axis converges with the user's line of sight from the right eye approximately 3 feet in front of that eye.
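For illustration, toe-in angles of this general size follow from simple trigonometry between the camera's offset from the viewing eye and the chosen convergence distance; the offset values in the sketch below are hypothetical, chosen only to show how angles near 6 and 8 degrees arise at a 3-foot convergence distance.

    import math

    def convergence_angle_deg(offset_inches, convergence_distance_inches):
        """Toe-in angle needed for the camera axis to cross the line of sight
        at the chosen distance (simple geometry; illustrative only)."""
        return math.degrees(math.atan2(offset_inches, convergence_distance_inches))

    distance = 36.0   # roughly 3 feet, the preferred convergence distance
    # Hypothetical vertical and lateral camera offsets from the right eye:
    print(round(convergence_angle_deg(3.8, distance), 1))  # ~6 degrees
    print(round(convergence_angle_deg(5.1, distance), 1))  # ~8 degrees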
Users of apparatus 10 thus reach to grasp an object in front of them based upon the image projected on LCD 32 and find the object where they reach, without difficulty and without groping as with prior art systems in which the camera's optical axis and the user's line of sight are parallel with one another. In the user's very far field of view, objects' locations are slightly offset vertically and laterally as a result of the alignment of IR camera 37, but it is believed that there is little difficulty for a user to proceed toward a distant object and locate the object readily once the object is a few feet in front of the user. Thus the invented solution to the prior art parallax problem surprisingly causes no disorientation when 'viewing' a distant object.
The invention will be understood to include an apparatus and a method of representing a thermal image in a portable device, as described above. The steps of the method include generating an electronic signal representative of a scene in front of a user using an infrared camera, identifying target portions of the electronic signal that represent portions of the scene that are within an identified target temperature range, and mapping the electronic signal to display a color image of the scene. Preferably, the target portions of the electronic signal are mapped in a color range that is visually distinct from all other portions of the color image. The method may also include the steps of identifying avoidance portions of the electronic signal that represent portions of the scene that are above an identified avoidance temperature; mapping the avoidance portions of the electronic signal to the color image in a color range that is visually distinct from all other portions of the color image; and mapping those portions of the electronic signal that do not represent target and/or avoidance portions to the color image in a neutral color range. Preferably, this is done by producing a multicolor RGB signal representative of the image, and emphasizing at least one color of the multicolor signal. Other colors of the multicolor signal may be de-emphasized so that the relative luminance of the target portions of the image remains approximately equivalent to the relative luminance of portions of the image that represent portions of the scene that are near to the identified target temperature range.

The foregoing description of the preferred embodiment of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and many modifications and variations are possible in light of the above teaching. The embodiment was chosen and described in order to best explain the principles of the invention and its practical application to thereby enable others skilled in the art to best utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined only by the claims.

Claims

WE CLAIM:
1. Apparatus for infrared imaging comprising: a housing configured for extending around a user's face at approximately eye level; an infrared camera mounted within said housing for detecting and imaging a scene in front of a user; and a display expanse within said housing for producing an image in front of at least one of a user's eyes, the image representing the detected and imaged scene.
2. The apparatus of claim 1, further comprising a heatsink for protecting elements of the apparatus from environmental heat.
3. The apparatus of claim 1, wherein the apparatus has a weight of approximately 4-pounds or less.
4. The apparatus of claim 1, wherein the display expanse is only in front of one of a user's eyes, and further comprising an opaque blinding expanse in front of the other of a user's eyes.
5. The apparatus of claim 1, wherein the display expanse extends across only an upper portion of a user's field of vision so that an unaided view may be obtained by looking below the display expanse.
6. The apparatus of claim 1, wherein the color image is color coded to represent those portions of the scene that are within a target temperature range in a color that is visually distinct from all other portions of the color image.
7. The apparatus of claim 6, wherein the color image is further color coded to represent those portions of the scene that are at a temperature above an avoidance temperature in a color that is visually distinct from all other portions of the color image, thereby distinguishing those portions of the scene that are at a dangerously high temperature.
8. The apparatus of claim 1, wherein the housing has an external volume of less than approximately 120-cubic inches, resulting in high functional density for the apparatus.
9. The apparatus of claim 1, wherein the housing has an internal volume of less than approximately 80-cubic inches, resulting in high functional density for the apparatus.
10. A method of representing a thermal image in a portable device, comprising the steps of: generating an electronic signal representative of a scene in front of a user using an infrared camera; identifying target portions of the electronic signal that represent portions of the scene that are within an identified target temperature range; and mapping the electronic signal to display a color image of the scene, with the target portions of the electronic signal mapped in a color range that is visually distinct from all other portions of the color image.
11. The method of claim 10, further comprising the steps of: identifying avoidance portions of the electronic signal that represent portions of the scene that are above an identified avoidance temperature; and mapping the avoidance portions of the electronic signal to the color image in a color range that is visually distinct from all other portions of the color image.
12. The method of claim 11, further comprising the step of mapping those portions of the electronic signal that do not represent target or avoidance portions to the color image in a neutral color range.
13. The method of claim 10, further comprising the step of mapping those portions of the electronic signal that do not represent target portions to the color image in a neutral color range.
14. The method of claim 13, wherein the step of mapping the target portions to the image includes the steps of: producing a multicolor signal representative of the image; and emphasizing at least one color of the multicolor signal.
15. The method of claim 14, wherein the multicolor signal is an RGB signal.
16. The method of claim 13, wherein the step of mapping the target portions to the image includes the steps of: producing a multicolor signal representative of the image; emphasizing at least one color of the multicolor signal; and de-emphasizing other colors of the multicolor signal so that the relative luminance of the target portions of the image remain approximately equivalent to the relative luminance of portions of the image that represent portions of the scene that are near to the identified temperature target range.
17. An optical engine for portable thermal imaging apparatus comprising: an infrared camera; a display for producing an image representing a scene viewed by the camera; a partially reflective mirror disposed adjacent the display; and a reflective mirror disposed adjacent the partially reflective mirror; wherein the display may be viewed by looking through the partially reflective mirror at the reflective mirror.
18. The engine of claim 17, further comprising a heatsink for protecting elements of the engine from environmental heat.
19. The engine of claim 17, wherein the engine has a weight of approximately 4-pounds or less.
20. The engine of claim 17, wherein the display expanse is only in front of one of a user's eyes, and further comprising an opaque blinding expanse in front of the other of a user's eyes.
21. The engine of claim 17, wherein the display expanse extends across only an upper portion of a user's field of vision so that an unaided view may be obtained by looking below the display expanse.
22. A portable thermal imaging apparatus incorporating the engine of claim 17, wherein the camera, display, and mirrors are enclosed within a housing having an external volume of less than 120-cubic inches.
23. A portable thermal imaging apparatus incorporating the engine of claim 17, wherein the camera, display, and mirrors are enclosed within a housing having an internal volume of less than 80-cubic inches.
24. Vision-enhancement apparatus for viewing a scene through obscurants by a user wearing the apparatus, the apparatus comprising: a housing configured to be worn by a user; an optical engine mounted within said housing, said optical engine defining an optical axis, said optical engine being mounted within said housing such that said optical axis converges with a nominal line of sight of a user to define a point of convergence.
25. The apparatus of claim 24, wherein said point of convergence is spaced away from the optical engine by approximately two to four feet.
26. The apparatus of claim 24, wherein said point of convergence is spaced away from the optical engine by approximately three feet.
27. The apparatus of claim 24, wherein said optical engine is mounted such that the optical axis and the nominal line of sight of the user converge at an angle between approximately 4 and 10 degrees.
28. The apparatus of claim 24, wherein said optical engine is mounted such that the optical axis and the nominal line of sight of the user converge in a vertical plane at an angle of approximately 6 degrees and in a horizontal plane at an angle of approximately 8 degrees.
29. Vision-enhancement apparatus comprising: an approximately horseshoe-shaped housing; and an optical engine mounted within said housing.
30. The apparatus of claim 29, further comprising a battery, wherein: the horseshoe-shaped housing is defined by legs interconnected by a front region; and the optical engine and battery counterbalance one another along the legs.
31. The apparatus of claim 29, further comprising a heatsink, wherein: the horseshoe-shaped housing is defined by legs interconnected by a front region; and the optical engine and heatsink counterbalance one another along the legs.
32. The apparatus of claim 29, further comprising a battery and a heatsink, wherein: the horseshoe-shaped housing is defined by legs interconnected by a front region; the optical engine is counterbalanced along the legs by the battery and the heatsink collectively; and the battery and heatsink counterbalance one another along the front region.
33. A method of representing a thermal image in a portable device, comprising the steps of: generating an electronic signal representative of a scene in front of a user using a thermal imaging system; identifying a plurality of target portions of the electronic signal that represent portions of the scene that are within an identified target temperature range; and mapping the electronic signal to display a color image of the scene, with each of the plurality of target portions of the electronic signal mapped to separate color ranges of the available palette, and with selected remaining portions of the electronic signal mapped to additional separate color ranges of the available palette, wherein the proportion of the available palette that is available for mapping the plurality of target portions of the electronic signal is predefined.
34. The method of claim 33, wherein the proportion of the available palette that is available for mapping the plurality of target portions of the electronic signal is predefined through programmable variables.
PCT/US1999/029359 1998-12-11 1999-12-10 Extreme temperature radiometry and imaging apparatus WO2000037970A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU45202/00A AU4520200A (en) 1998-12-11 1999-12-10 Extreme temperature radiometry and imaging apparatus

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US09/210,167 US6255650B1 (en) 1998-12-11 1998-12-11 Extreme temperature radiometry and imaging apparatus
US09/210,167 1998-12-11
US14528999P 1999-07-23 1999-07-23
US60/145,289 1999-07-23

Publications (4)

Publication Number Publication Date
WO2000037970A2 WO2000037970A2 (en) 2000-06-29
WO2000037970A3 WO2000037970A3 (en) 2000-11-23
WO2000037970B1 WO2000037970B1 (en) 2000-12-28
WO2000037970A9 true WO2000037970A9 (en) 2002-08-29

Family

ID=26842825

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/029359 WO2000037970A2 (en) 1998-12-11 1999-12-10 Extreme temperature radiometry and imaging apparatus

Country Status (2)

Country Link
AU (1) AU4520200A (en)
WO (1) WO2000037970A2 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6255650B1 (en) 1998-12-11 2001-07-03 Flir Systems, Inc. Extreme temperature radiometry and imaging apparatus
WO2003060590A2 (en) * 2001-12-21 2003-07-24 Itt Manufacturing Enterprises, Inc. Video enhanced night vision goggle
SE524024C2 (en) * 2002-03-01 2004-06-15 Flir Systems Ab IR camera
US7767963B1 (en) 2006-12-08 2010-08-03 Draeger Safety, Inc. Thermal imaging camera internal damping system
WO2011151803A2 (en) 2010-06-03 2011-12-08 Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi An electro-optical detector device
CN102680109B (en) * 2012-05-21 2014-10-29 浙江雷邦光电技术有限公司 Helmet type thermal imager system
CN104280881B (en) * 2013-07-09 2021-03-23 杭州美盛红外光电技术有限公司 Portable imaging device
US10182195B2 (en) 2014-09-23 2019-01-15 Flir Systems, Inc. Protective window for an infrared sensor array
WO2016049238A1 (en) * 2014-09-23 2016-03-31 Flir Systems, Inc. Modular split-processing infrared imaging system
US10230909B2 (en) 2014-09-23 2019-03-12 Flir Systems, Inc. Modular split-processing infrared imaging system
WO2019028067A1 (en) * 2017-08-04 2019-02-07 Seek Thermal, Inc. Color display modes for a thermal imaging system
CN111095906B (en) * 2017-08-04 2023-09-01 塞克热量股份有限公司 Color display mode for thermal imaging system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8922146D0 (en) * 1989-10-02 1989-11-15 Eev Ltd Thermal camera arrangement
US5637389A (en) * 1992-02-18 1997-06-10 Colvin; David P. Thermally enhanced foam insulation
US6023288A (en) * 1993-03-31 2000-02-08 Cairns & Brother Inc. Combination head-protective helmet and thermal imaging apparatus
US5389788A (en) * 1993-12-13 1995-02-14 Hughes Aircraft Company Infrared transducer and goggles incorporating the same
US6023061A (en) * 1995-12-04 2000-02-08 Microcam Corporation Miniature infrared camera
FR2749177B1 (en) * 1996-06-03 1998-07-17 Inst Francais Du Petrole METHOD AND SYSTEM FOR THE REMOTE SENSING OF THE FLAMMABILITY OF THE DIFFERENT PARTS OF A ZONE OVERFLOW BY AN AIRCRAFT

Also Published As

Publication number Publication date
WO2000037970B1 (en) 2000-12-28
WO2000037970A2 (en) 2000-06-29
WO2000037970A3 (en) 2000-11-23
AU4520200A (en) 2000-07-12

Similar Documents

Publication Publication Date Title
US6849849B1 (en) Portable radiometry and imaging apparatus
US9973692B2 (en) Situational awareness by compressed display of panoramic views
US11092796B2 (en) Long range infrared imager systems and methods
US7345277B2 (en) Image intensifier and LWIR fusion/combination system
CA2884855C (en) Face mounted extreme environment thermal sensor system
EP1588206B1 (en) Helmet for displaying environmental images in critical environments
US7307793B2 (en) Fusion night vision system
EP1779180B1 (en) Method and system for thermal imaging having a selective temperature imaging mode
CA2599821C (en) Digitally enhanced night vision device
US20080266669A1 (en) Electronic Day and Night Vision Spectacles
WO2000037970A9 (en) Extreme temperature radiometry and imaging apparatus
JPH08262366A (en) Modularized helmet mounting display
US10520724B2 (en) Multi-wavelength head up display systems and methods
US20170208262A1 (en) Digital enhanced vision system
EP1300716A1 (en) Head mounted display apparatus
US20190011702A1 (en) Helmet-Mounted Visualization Device Without Parallax and its Operation Method
US20140210987A1 (en) Temperature Display Helmet
US20130127986A1 (en) Common holographic imaging platform
CN100509086C (en) Fire-proof mask equipped with infrared imaging system
KR20180051364A (en) Apparatus and methdo for displaying object information image
Miller et al. Applications and performance of an uncooled infrared helmetcam
Miller et al. Applications and Performance of IR Helmetcams
Guyot Infrared cameras for tracking and surveillance applications

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

AK Designated states

Kind code of ref document: B1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: B1

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

B Later publication of amended claims
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
AK Designated states

Kind code of ref document: C2

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: C2

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

COP Corrected version of pamphlet

Free format text: PAGES 1-28, DESCRIPTION, REPLACED BY NEW PAGES 1-26; PAGES 29-37, CLAIMS, REPLACED BY NEW PAGES 27-33; PAGES 1/11-11/11, DRAWINGS, REPLACED BY NEW PAGES 1/11-11/11