US10687000B1 - Cross-eyed sensor mosaic - Google Patents

Cross-eyed sensor mosaic

Info

Publication number
US10687000B1
Authority
US
United States
Prior art keywords
infrared
view
cameras
field
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/009,787
Inventor
Daniel Henry
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rockwell Collins Inc
Original Assignee
Rockwell Collins Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2018-06-15
Filing date
2018-06-15
Publication date
2020-06-16
Application filed by Rockwell Collins Inc
Priority to US16/009,787
Assigned to ROCKWELL COLLINS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HENRY, DANIEL
Application granted
Publication of US10687000B1
Legal status: Active
Anticipated expiration

Classifications

    • H04N5/265 Mixing (studio circuits for mixing, switching-over, change of character of image, or other special effects)
    • G01J5/061 Arrangements for eliminating effects of disturbing radiation or for compensating changes in sensitivity by controlling the temperature of the apparatus or parts thereof, e.g. using cooling means or thermostats
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/007 Radiation pyrometry for earth observation
    • G01J5/047 Mobile mounting; Scanning arrangements (casings)
    • G01J5/07 Arrangements for adjusting the solid angle of collected radiation, e.g. adjusting or orienting field of view, tracking position or encoding angular position
    • G01J5/0804 Shutters (arrangements for time-dependent attenuation of radiation signals)
    • G01J5/0875 Windows; Arrangements for fastening thereof
    • G01J5/10 Radiation pyrometry using electric radiation detectors
    • G01J5/20 Radiation pyrometry using resistors, thermistors or semiconductors sensitive to radiation, e.g. photoconductive devices
    • H04N23/52 Elements optimising image sensor operation, e.g. for electromagnetic interference [EMI] protection or temperature control by heat transfer or cooling elements
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N23/951 Computational photography systems using two or more images to influence resolution, frame rate or aspect ratio
    • H04N25/63 Noise processing applied to dark current
    • H04N5/23232
    • H04N5/247
    • H04N5/33 Transforming infrared radiation
    • G01J2005/0077 Imaging
    • G01J2005/106 Arrays

Abstract

An infrared sensor including a plurality of infrared cameras is described. Each infrared camera includes an infrared detector and imaging optics. The imaging optics are arranged to image infrared light onto the infrared detector, the imaging optics having a field of view with a center optical axis, and field of view boundary lines bounding the field of view on sides of the field of view. The infrared cameras are arranged such that the center optical axes of the fields of view intersect each other, and the fields of view overlap each other.

Description

The inventive concepts disclosed herein generally relate to the field of imaging sensors.
BACKGROUND
Imaging sensors may be used for imaging objects in applications such as airborne imaging. The size and resolution of imaging sensors often depend on the wavelength range of radiation which may be detected by detectors of the imaging sensors. For example, imaging sensors for visible light are known to provide higher resolution, e.g. ultra HD 4K, in a single low size, weight, power, and cost (SWAP-C) package. Radiation wavelength detection bands other than visible often cannot match visible wavelengths for high resolution and low SWAP-C.
Uncooled long wave infrared (LWIR) sensors (sensitive to infrared radiation in the wavelength range of 8-14 μm), for example, have not gotten beyond XGA resolution (1024×768). Moreover, VGA resolution (640×480) for uncooled LWIR sensors is significantly less expensive on a per pixel cost basis.
Infrared sensors may be arranged in a configuration where the sensors have a plurality of cameras arranged in a mosaic, and each camera provides an image. Such sensors are arranged such that the fields of view (FOVs) of the cameras may overlap, but the center optical axes of the cameras do not intersect. Each of the cameras has its own separate window through which it receives IR light to be imaged onto a detector of the camera. Such an arrangement is known as a butted field of view mosaic.
SUMMARY
In one aspect, embodiments of the inventive concepts disclosed herein are directed to an infrared sensor. The infrared sensor includes a plurality of infrared cameras. Each infrared camera includes an infrared detector and imaging optics. The imaging optics are arranged to image infrared energy onto the infrared detector, the imaging optics having a field of view with a center optical axis, and field of view boundary lines bounding the field of view on sides of the field of view. The infrared cameras are arranged such that the center optical axes of the fields of view intersect each other, and the fields of view overlap each other.
In a further aspect, embodiments of the inventive concepts disclosed herein are directed to an optical sensor. The optical sensor includes a plurality of cameras. Each camera includes a detector and imaging optics. The imaging optics are arranged to image light onto the detector, the imaging optics having a field of view with a center optical axis, and field of view boundary lines bounding the field of view on sides of the field of view. The cameras are arranged such that the center optical axes of the fields of view intersect each other, and the fields of view overlap each other.
BRIEF DESCRIPTION OF THE DRAWINGS
Implementations of the inventive concepts disclosed herein may be better understood when consideration is given to the following detailed description thereof. Such description makes reference to the included drawings, which are not necessarily to scale, and in which some features may be exaggerated and some features may be omitted or may be represented schematically in the interest of clarity. Like reference numerals in the drawings may represent and refer to the same or similar element, feature, or function. In the drawings:
FIG. 1 is a schematic illustrating an infrared sensor according to inventive concepts disclosed herein.
FIG. 2 is a schematic illustrating an infrared camera of the infrared sensor of FIG. 1 according to inventive concepts disclosed herein.
FIG. 3A is a schematic illustrating a side view of an arrangement of infrared cameras according to inventive concepts disclosed herein.
FIG. 3B is a schematic illustrating overlapped images corresponding to fields of view of the arrangement of infrared cameras of FIG. 3A according to inventive concepts disclosed herein.
FIG. 4A is a schematic illustrating a side view of a single row of infrared cameras of a two dimensional arrangement of infrared cameras according to inventive concepts disclosed herein.
FIG. 4B is a schematic illustrating a side view of a single column of infrared cameras of a two dimensional arrangement of infrared cameras according to inventive concepts disclosed herein.
FIG. 4C is a schematic illustrating overlapped images corresponding to fields of view of the two dimensional arrangement of infrared cameras of FIGS. 4A and 4B according to inventive concepts disclosed herein.
DETAILED DESCRIPTION
Embodiments of the inventive concepts disclosed herein regarding a cross-eyed field of view arrangement provide for a lower SWAP-C by allowing for a single optical window through which pass the fields of view of the different cameras arranged in a mosaic. This arrangement is in contrast to a butted field of view mosaic arrangement, which typically has a separate optical window for each camera, or would require a very large window for all of the cameras.
Moreover, according to embodiments of the inventive concepts disclosed herein, a cross-eyed field of view arrangement for the cameras, together with a controller that generates a composite image based on the images from each of the cameras, allows a composite image to be generated having an increased resolution as compared to the individual images from the cameras. Thus, for example, if the resolution of the image of the individual cameras is VGA, the composite image generated may have a resolution greater than that of VGA. The arrangement allows for an increase in pixels in a desired FOV compared to that for a single camera. The inventive concepts disclosed herein are not limited to the individual cameras having a VGA resolution, and any other resolutions, such as XGA for example, are possible.
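As a rough illustration of this resolution gain (an estimate only, not taken from the patent; the mosaic size and overlap fraction below are hypothetical), the pixel count of a merged composite can be estimated from the individual camera resolution, the mosaic dimensions, and the fraction of each seam given up to overlap:

```python
# Rough composite-size estimate for an R x C mosaic of identical cameras,
# assuming (hypothetically) a uniform fractional overlap between adjacent
# fields of view. Illustrative only; the patent does not fix these numbers.

def composite_size(cam_w, cam_h, rows, cols, overlap=0.1):
    """Return (width, height) in pixels of the merged composite image.

    Each interior seam gives up `overlap` of one camera's pixels to the
    shared region used for registration.
    """
    width = cam_w * cols - int(cam_w * overlap) * (cols - 1)
    height = cam_h * rows - int(cam_h * overlap) * (rows - 1)
    return width, height

# Example: a 3-by-4 mosaic of VGA (640 x 480) cameras with 10% overlap.
print(composite_size(640, 480, rows=3, cols=4))  # -> (2368, 1344)
```

Even with 10% of each seam reserved for registration, such a mosaic of VGA cameras yields a composite well beyond XGA.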
FIG. 1 is a schematic of an infrared sensor 100 according to inventive concepts disclosed herein. The infrared sensor 100 includes a plurality of infrared cameras 10, and a controller 20.
The controller 20 may include a processor 22 or a computer configured, such as by being programmed, hardwired, or controlled by firmware, to perform certain functions. The controller 20 may include a memory 24, or other non-transitory computer readable medium, which stores programs executed by the processor 22 of the controller 20.
FIG. 2 is a schematic illustrating one of the infrared cameras 10 having a particular FOV 50. The infrared camera 10 has imaging optics 30 and an infrared detector 40. The imaging optics 30 are arranged to image infrared light onto the infrared detector 40. The imaging optics 30 may include one or more refractive and/or reflective and/or diffractive optical elements. The refractive optical elements may include lenses, for example. The reflective optical elements may include mirrors, for example. The diffractive elements may include diffraction gratings, for example.
Referring to FIG. 2, the imaging optics 30 has a FOV 50, which has a center optical axis 52. The FOV 50 is bounded by boundaries 54a and 54b, which bound the FOV 50 on sides of the FOV 50.
Referring back to FIG. 1, the infrared image sensor 100 may have a plurality of infrared cameras 10 like the one in FIG. 2, where each of the cameras 10 has a FOV 50 with a center optical axis 52 and bounded by boundaries 54a and 54b. FIG. 1, for example, illustrates three cameras 10 for ease of illustration, where in general the number of cameras 10 may be more than three, or may be two, according to embodiments of the inventive concepts disclosed herein.
The infrared cameras 10 are arranged and oriented such that the center optical axes 52 of the FOVs 50 intersect with each other, and the FOVs 50 overlap with each other. In particular, the center optical axes 52 intersect each other on a side of the imaging optics 30 opposite to the infrared detector 40. The intersection of the center optical axes 52 of the FOVs 50 provides an infrared sensor 100 with a cross-eyed sensor mosaic arrangement of the infrared cameras 10 where the center optical axes of the FOVs 50 cross each other.
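A minimal geometric sketch of the cross-eyed orientation follows; it is an illustration under simplifying assumptions (cameras evenly spaced on a straight baseline, axes meeting at a chosen crossover distance), not a prescription from the patent, and the pitch and distance values are hypothetical:

```python
import math

def crosseyed_tilt_angles(num_cameras, pitch_m, crossover_dist_m):
    """Return, for each camera on an evenly spaced baseline, the angle in
    degrees between its center optical axis and the array normal so that
    all axes pass through a single point a distance crossover_dist_m in
    front of the array center. The sign simply reflects which side of the
    array the camera sits on; each camera leans toward the opposite side.
    """
    center = (num_cameras - 1) / 2.0
    angles = []
    for i in range(num_cameras):
        offset = (i - center) * pitch_m  # lateral offset from the array center
        angles.append(math.degrees(math.atan2(offset, crossover_dist_m)))
    return angles

# Example: three cameras on a 30 mm pitch, axes crossing 0.2 m ahead of the window.
print(crosseyed_tilt_angles(3, 0.030, 0.20))  # ~[-8.53, 0.0, 8.53] degrees
```

Because the axes converge in front of the cameras, all of the FOVs 50 can pass through a single, relatively small window 60, which underlies the SWAP-C benefit described above.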
The particular infrared detectors 40 used depend on the band of infrared radiation to be detected. For example, if radiation is to be detected in the short wave infrared (SWIR) region, the infrared detectors 40 should be sensitive to short wave infrared radiation in the wavelength range of 1-3 μm. In this case the infrared detectors 40 may include InGaAs material, for example.
As another example, if radiation is to be detected in the visible or ultra-violet (UV), or near infrared regions, the detectors 40 should be sensitive to UV (wavelength <0.4 μm) or visible light in the wavelength range of 0.4-0.7 μm, or near infrared in the region 0.7-1.0 μm.
As another example, if radiation is to be detected in the midwave infrared (MWIR) region, the infrared detectors 40 should be sensitive to midwave infrared radiation in the wavelength range of 3-5 μm. In this case the infrared detectors 40 may include InSb material, for example.
As another example, if radiation is to be detected in the long wavelength infrared region, the infrared detectors 40 should be sensitive to long wavelength infrared radiation in the wavelength range of 8-14 μm. In this case the infrared detectors 40 may include vanadium oxide material, for example, and may be bolometers which measure the heat due to absorbed radiation. The infrared detectors 40 may include microbolometers operating at room temperature. Alternatively, the infrared detectors 40 may include cryogenically cooled infrared detectors operating at cryogenically cooled temperatures.
According to the inventive concepts disclosed herein each of the infrared detectors 40 may include a single material sensitive to infrared radiation in a particular band. Alternatively, each of the infrared detectors 40 may include multiple different materials, each material sensitive to infrared radiation in a different band.
The infrared image sensor 100 may include an optical window 60 through which the FOVs 50 pass. The optical window may be arranged to protect the cameras 10 from contamination and/or severe weather, or dust and abrasion, in addition to providing a viewing window. In some embodiments of the inventive concepts disclosed herein, the optical window 60 may be a single optical window, and all of the FOVs 50 may pass through the optical window 60.
The cross-eyed arrangement of the infrared cameras 10 which allows all of the FOVs 50 to pass through a single optical window 60 beneficially allows for a decrease in the size and weight of the infrared sensor 100.
Alternatively, the infrared sensor 100 need not include an optical window 60 through which pass the FOVs 50. The cross-eyed arrangement of the infrared cameras 10 would still allow for a decrease in the size and weight of the infrared sensor 100.
Embodiments of the inventive concepts disclosed herein are not limited to a single optical window 60 or to no optical window 60. For example, the infrared sensor 100 may include multiple optical windows, where the FOVs 50 pass through separate windows of the multiple optical windows.
The window 60 may be made of a material which is transparent to the infrared radiation for which the infrared detectors 40 are sensitive. For example, the window 60 may be made of Ge or Si, for certain regions of the infrared spectrum to be detected.
The infrared sensor 100 may further include a heater 65 to heat the window 60. For example, the heater 65 may prevent or reduce icing of the window. According to embodiments of the inventive concepts disclosed herein where the infrared sensor 100 includes a single window 60 and thus a single heater 65, power may be reduced due to only a single heater 65 needing to be powered to heat a single window, which may be relatively small.
The infrared sensor 100 may further include a single shutter 67, which can be used for Non-Uniformity Correction (NUC) on all of the cameras 10. Pixels from each of the cameras 10 tend to drift over time. A uniform temperature shutter allows for correction of the drift via NUC. By using a single shutter 67 for all of the cameras 10, reliability increases, and a significant reduction of cost and size may be achieved. The controller 20 may control the opening and closing of the shutter 67. The controller 20 may further be arranged to perform Non-Uniformity Correction on each of the cameras 10 based on infrared light from the shutter 67 when closed.
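As an illustration only, a one-point (offset) correction is one common way a shutter-based NUC can be implemented; the patent does not prescribe a particular NUC algorithm, and the frame sizes and values below are hypothetical:

```python
# One-point (offset) NUC sketch: with the single shutter 67 closed, every
# camera views the same uniform-temperature surface, so a per-pixel offset
# table can be estimated for all detectors at once. Illustrative only.
import numpy as np

def offset_nuc_tables(shutter_frames):
    """shutter_frames: dict camera_id -> (H, W) frame captured with the shutter closed.

    Returns per-camera offset tables that flatten each closed-shutter frame
    to its own mean level.
    """
    return {cam: frame.mean() - frame for cam, frame in shutter_frames.items()}

def apply_nuc(raw_frame, offsets, cam):
    """Apply the stored offset table to a live frame from camera `cam`."""
    return raw_frame + offsets[cam]

# Usage (illustrative): capture closed-shutter frames from all cameras, build
# the tables once, then correct live frames until the next NUC event.
closed_frames = {i: np.random.normal(1000, 5, (480, 640)) for i in range(3)}
offsets = offset_nuc_tables(closed_frames)
corrected = apply_nuc(closed_frames[0], offsets, cam=0)
```

Because the single shutter 67 presents the same uniform scene to every camera 10, one closed-shutter event can refresh the correction for the entire mosaic at once.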
Each of the infrared cameras 10 may produce an image, which is received by the controller 20. In general, the image produced by each infrared camera 10 is based on an array of pixels of the individual infrared camera 10. The controller 20 may generate a composite image based on the individual images received from the infrared cameras 10. The composite image may be formed in a number of ways. For example, the image from each camera 10 may be displayed on its own screen, and the screens may be butted together. No image processing is required, but there will be some overlap between the cameras on the displays. As another example, the controller 20 may merge the individual images together into a larger array. The individual cameras are registered using fiducial points that are located in the overlap area between the cameras. In either case, the resolution is increased because more pixels are arranged across the FOV.
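The patent describes registering the cameras with fiducial points in the overlap area; as a hedged illustration of the same merge step, the sketch below instead estimates the residual misalignment between adjacent cameras by phase correlation over the shared overlap strip and then writes each image into a larger composite array (the overlap width and shift values are hypothetical, and the technique shown is an alternative to fiducial matching, not the patent's stated method):

```python
import numpy as np

def overlap_shift(a, b):
    """Phase correlation: estimate the integer (dy, dx) such that
    b is approximately np.roll(a, (dy, dx), axis=(0, 1)); a and b are
    same-sized crops of the overlap region of adjacent cameras."""
    cross = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    cross /= np.abs(cross) + 1e-12          # keep only the phase
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > a.shape[0] // 2:                # map wrap-around peaks to signed shifts
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

def paste(canvas, img, top, left):
    """Write one camera's image into the larger composite array."""
    h, w = img.shape
    canvas[top:top + h, left:left + w] = img
    return canvas

# Illustrative check: two VGA neighbors sharing a 64-pixel-wide overlap strip,
# misaligned vertically by 3 rows.
left_img = np.random.rand(480, 640)
right_img = np.random.rand(480, 640)
right_img[:, :64] = np.roll(left_img[:, -64:], 3, axis=0)
print(overlap_shift(left_img[:, -64:], right_img[:, :64]))  # -> (3, 0)
```

The estimated residual shift refines where each image is written into the composite array relative to its nominal grid position; with fiducial points, the same correction would come from matching the fiducials' coordinates in the two overlapping images.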
The infrared cameras 10 may be arranged in an array, such as a one-dimensional array or a two-dimensional array. FIGS. 3A and 3B illustrate the infrared cameras 10 in a one-dimensional array. FIG. 3A is a side view of the infrared cameras 10, while FIG. 3B shows the overlapped images 70 corresponding to the fields of view of the infrared cameras 10 in the one-dimensional array. For ease of illustration FIGS. 3A and 3B illustrate an array with three cameras 10. In general the number of cameras 10 may be more than three, or may be two, for example.
In addition to the one-dimensional array of infrared cameras, FIG. 3A further illustrates an optical window 60 through which FOVs 50 with a center optical axis 52 pass. In a similar fashion to the arrangement in FIG. 1, in FIG. 3A, infrared cameras 10 are arranged and oriented such that the center optical axes 52 of the FOVs 50 intersect with each other, and the FOVs 50 overlap with each other. The intersection of the center optical axes 52 of the FOVs 50 provides an infrared sensor 100 with a cross-eyed sensor mosaic arrangement of the infrared cameras 10 where the center optical axes of the FOVs 50 cross each other.
FIG. 3B illustrates the images 70 corresponding to the fields of view of each infrared camera 10 in the one-dimensional array. The individual images 70 have the resolution of the individual camera corresponding to the individual image, for example, VGA. As can be seen, there is an overlap region 12 of images 70 between adjacent infrared cameras 10. This overlap region 12 beneficially allows for registration of adjacent infrared cameras 10. Further, the overlap region 12 beneficially accommodates distortion of the image generated by the optics 30 of the infrared camera 10.
FIGS. 4A-4C illustrate the infrared cameras 10 arranged in a two-dimensional cross-eyed array. FIGS. 4A and 4B are side views (row and column, respectively) of the infrared cameras 10, while FIG. 4C shows the overlapped images 70 corresponding to the fields of view of the infrared cameras 10 in the two-dimensional array. For ease of illustration FIGS. 4A-4C illustrate a 3-by-4 array with twelve cameras 10. In general the array may be other than a 3-by-4 array, and the number of cameras 10 may be more than twelve, or less than twelve, for example.
In addition to the two-dimensional array of infrared cameras 10, FIGS. 4A and 4B further illustrate an optical window 60 through which FOVs 50 with a center optical axis 52 pass. In a similar fashion to the arrangement in FIG. 1, in FIGS. 4A and 4B, infrared cameras 10 are arranged and oriented such that the center optical axes 52 of the FOVs 50 intersect with each other, and the FOVs 50 overlap with each other. The intersection of the center optical axes 52 of the FOVs 50 provides an infrared sensor 100 with a cross-eyed sensor mosaic arrangement of the infrared cameras 10 where the center optical axes of the FOVs 50 cross each other.
FIG. 4C illustrates the overlapped images 70 corresponding to the fields of view of the infrared cameras 10 in the two-dimensional array, where the infrared cameras are arranged in rows (left to right) and columns (up and down). As can be seen, there is an overlap region 12 between images 70 corresponding to the fields of view of adjacent infrared cameras 10. As mentioned above with respect to FIG. 3B, this overlap region 12 beneficially allows for registration of adjacent infrared cameras 10. Further, the overlap region 12 beneficially accommodates distortion of the images generated by the infrared cameras 10. The overlap region 12 may be different, or the same, between rows and columns.
As described above, some embodiments of the inventive concepts disclosed herein regarding a cross-eyed field of view arrangement provide for a lower SWAP-C by allowing for a single optical window through which pass the fields of view of the different cameras arranged in a mosaic. This arrangement is in contrast to a butted field of view mosaic arrangement, which typically has a separate optical window for each camera, or would require a very large window for all of the cameras.
As described further above, according to embodiments of the inventive concepts disclosed herein, a cross-eyed field of view arrangement for the cameras, together with a controller that generates a composite image based on the images of each of the cameras, allows a composite image to be generated having an increased resolution and/or field of view as compared to the images from individual cameras. The arrangement allows for an increase in pixels in a desired FOV compared to that for a single camera. Thus, for example, if the resolution of the image of the individual cameras is VGA, the composite image generated may have a resolution greater than that of VGA.
Embodiments of the inventive concepts disclosed herein are not limited to an infrared sensor, but may include an optical sensor generally where the detectors of the sensor are sensitive to radiation outside of the infrared region, such as visible light or ultraviolet light, for example.
The embodiments of the inventive concepts disclosed herein have been described in detail with particular reference to preferred embodiments thereof, but it will be understood by those skilled in the art that variations and modifications can be effected within the spirit and scope of the inventive concepts.

Claims (20)

What is claimed is:
1. An infrared sensor comprising:
a single optical window;
a plurality of infrared cameras arranged in a two-dimensional array that spatially extends in two directions,
each infrared camera comprising:
an infrared detector; and
imaging optics arranged to image infrared light onto the infrared detector, the imaging optics having a field of view with a center optical axis, and field of view boundary lines bounding the field of view on sides of the field of view, wherein the infrared cameras are arranged such that the center optical axes of the fields of view intersect each other at a common intersection point, and the fields of view overlap each other;
wherein an entirety of the field of view of each of the plurality of infrared cameras at the single optical window passes through the single optical window.
2. The infrared sensor of claim 1, further comprising a controller configured to receive images from each of the infrared cameras and to generate a composite image based on the received images from each of the infrared cameras.
3. The infrared sensor of claim 2, wherein the composite image has a higher resolution than a resolution of a single one of the infrared cameras with a same total field of view.
4. The infrared sensor of claim 3, wherein the composite image has a larger field of view than a field of view of the single one of the infrared cameras.
5. The infrared sensor of claim 2, wherein adjacent ones of the infrared cameras have overlapping images.
6. The infrared sensor of claim 1, further comprising a heater arranged to heat the single optical window.
7. The infrared sensor of claim 2, further comprising a single shutter,
the controller arranged to open and close the shutter, and to perform Non-Uniformity Correction on each of the cameras based on infrared light from the shutter when closed.
8. The infrared sensor of claim 1, where the infrared detectors are sensitive to at least one of near infrared radiation in a wavelength range of 0.7 to 1 μm, short wave infrared radiation in the wavelength range of 1 to 3 μm, midwave infrared radiation in the wavelength range of 3 to 5 μm and long wavelength infrared radiation in the wavelength range of 8-14 μm.
9. The infrared sensor of claim 1, where the infrared detectors are room temperature detectors.
10. The infrared sensor of claim 1, where the infrared detectors are cryogenically cooled detectors.
11. The infrared sensor of claim 1, wherein the infrared detectors include bolometers.
12. The infrared sensor of claim 2, wherein the controller is configured to register each of the plurality of infrared cameras using fiducial points located in an overlap portion of an image received from each of the plurality of infrared cameras.
13. The infrared sensor of claim 1, wherein the single optical window is a material that is transparent to infrared radiation having a wavelength that the plurality of infrared cameras are configured to measure.
14. An infrared sensor comprising:
a plurality of infrared cameras,
each infrared camera comprising:
an infrared detector sensitive to long wavelength infrared radiation in the wavelength range of 8-14 μm; and
imaging optics arranged to image infrared light onto the infrared detector, the imaging optics having a field of view with a center optical axis, and field of view boundary lines bounding the field of view on sides of the field of view, wherein the infrared cameras are arranged such that the center optical axes of the fields of view intersect each other, and the fields of view overlap each other; and
a controller configured to receive images from each of the infrared cameras and to generate a composite image based on the received images from each of the infrared cameras,
wherein the plurality of infrared cameras are spatially arranged according to an array in rows and columns and are oriented towards a common point of interest.
15. The infrared sensor of claim 14, further comprising a single optical window, wherein all of the fields of view pass through the single optical window.
16. The infrared sensor of claim 14, wherein the infrared detector comprises a plurality of materials each sensitive to a different wavelength band of infrared radiation.
17. The infrared sensor of claim 16, wherein at least one of the plurality of materials is sensitive to long wavelength infrared radiation in a wavelength range of 8-14 μm, another one of the plurality of materials is sensitive to midwave infrared radiation in a wavelength range of 3-5 μm, and another of the plurality of materials is sensitive to short wave infrared radiation in a wavelength range of 1-3 μm.
18. An optical sensor comprising:
a plurality of cameras spatially arranged in two directions and each spatially oriented towards a common point of interest,
each camera comprising:
a detector; and
imaging optics arranged to image light onto the detector, the imaging optics having a field of view with a center optical axis, and field of view boundary lines bounding the field of view on sides of the field of view, wherein the cameras are arranged such that the center optical axes of the fields of view intersect each other at the common point of interest, and the fields of view overlap each other.
19. The sensor of claim 18, where the detectors are sensitive to visible light.
20. The sensor of claim 18, where the detectors are sensitive to ultraviolet light.
US16/009,787 · Priority 2018-06-15 · Filed 2018-06-15 · Cross-eyed sensor mosaic · Active · US10687000B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/009,787 US10687000B1 (en) 2018-06-15 2018-06-15 Cross-eyed sensor mosaic

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/009,787 US10687000B1 (en) 2018-06-15 2018-06-15 Cross-eyed sensor mosaic

Publications (1)

Publication Number Publication Date
US10687000B1 (en) 2020-06-16

Family

ID=71075169

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/009,787 Active US10687000B1 (en) 2018-06-15 2018-06-15 Cross-eyed sensor mosaic

Country Status (1)

Country Link
US (1) US10687000B1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120075427A1 (en) * 2010-09-24 2012-03-29 Microsoft Corporation Wide angle field of view active illumination imaging system
US9924078B2 (en) * 2011-09-01 2018-03-20 Siemens Aktiengesellschaft Image-capturing device, in particular person-counting mechanism, having a housing which is transparent in the infrared range and nontransparent in the optically visible range
US20150185592A1 (en) * 2012-07-02 2015-07-02 Agricam Ab Camera housings, camera modules, and monitoring systems
US10145743B2 (en) * 2013-03-05 2018-12-04 Teknologian Tutkimuskeskus Vtt Oy Superconducting thermal detector (bolometer) of terahertz (sub-millimeter wave) radiation
US20170211984A1 (en) * 2014-10-16 2017-07-27 Flir Systems, Inc. Bolometer circuitry and methods for difference imaging
US20180061008A1 (en) * 2016-08-31 2018-03-01 Autoliv Asp, Inc. Imaging system and method

Similar Documents

Publication Publication Date Title
US10706514B2 (en) Systems and methods for enhanced dynamic range infrared imaging
US9110276B2 (en) Full-field GEO imager optics with extended spectral coverage
US10547799B2 (en) Infrared detector with increased image resolution
WO2016182961A1 (en) Isothermal image enhancement systems and methods
US11375143B1 (en) Device for non-uniformity correction
US9759611B2 (en) Dual spectral imager with no moving parts
US20240011842A1 (en) Non-uniformity correction for focal plane arrays
US10687000B1 (en) Cross-eyed sensor mosaic
JP6567764B2 (en) Dual-pupil dual-band wide-field re-imaging optical system
US9876972B1 (en) Multiple mode and multiple waveband detector systems and methods
US20230046320A1 (en) Non-uniformity correction calibrations in infrared imaging systems and methods
CA3109032C (en) Image acquisition method for microbolometer thermal imaging systems
US11368637B1 (en) Image acquisition method for microbolometer thermal imaging systems
Göttfert et al. Optimizing microscan for radiometry with cooled IR cameras
US20230288784A1 (en) Snap-fit lens barrel systems and methods
US20230140342A1 (en) Wide field of view imaging systems and methods
Vollheim et al. Application of cooled IR focal plane arrays in thermographic cameras
US20230069029A1 (en) Variable sensitivity in infrared imaging systems and methods
de la Barrière et al. Development of an infrared ultra-compact multichannel camera integrated in a SOFRADIR's detector Dewar cooler assembly
Masterson et al. MWIR wide-area step and stare imager
Hagen Dynamic radiometric calibration and radiometric matching of multiple microbolometer detector arrays
Slemer et al. Spectral response of the stereo imaging channel of SIMBIO-SYS on-board the ESA BepiColombo mission
CN116736463A (en) Snap fit lens barrel system and method
Lampton Microbolometer Arrays for Airborne Fire Sensing
WO2023101923A1 (en) Detection threshold determination for infrared imaging systems and methods

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4