EP4378156A2 - Dynamic infrared scene generation and projection system and methods - Google Patents
- Publication number
- EP4378156A2 (application EP22873361.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- laser
- image
- scene
- screen
- thermal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/02—Viewing or reading apparatus
- G02B27/022—Viewing apparatus
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/02—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes by tracing or scanning a light beam on a screen
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G3/00—Aiming or laying means
- F41G3/14—Indirect aiming means
- F41G3/16—Sighting devices adapted for indirect laying of fire
- F41G3/165—Sighting devices adapted for indirect laying of fire using a TV-monitor
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41J—TARGETS; TARGET RANGES; BULLET CATCHERS
- F41J2/00—Reflecting targets, e.g. radar-reflector targets; Active targets transmitting electromagnetic or acoustic waves
- F41J2/02—Active targets transmitting infrared radiation
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/54—Accessories
- G03B21/56—Projection screens
- G03B21/60—Projection screens characterised by the nature of the surface
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/315—Modulator illumination systems
- H04N9/3158—Modulator illumination systems for controlling the spectrum
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3182—Colour adjustment, e.g. white balance, shading or gamut
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G7/00—Direction control systems for self-propelled missiles
- F41G7/006—Guided missiles training or simulation devices
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J2005/0077—Imaging
Definitions
- the present invention relates to the field of thermal sensor scene projection.
- dynamic Infrared Scene Projection (IRSP) for targeting systems and weapons training can operate using one or more characteristics of a proposed target.
- Infrared scene projection systems perform a vital role in the testing of tactical and theatre weapons systems.
- Weapon targeting systems often rely on one or more attributes of a proposed target (e.g., thermal signatures).
- Technological advancements have been limited and fail to provide a solution for providing dynamic and/or unique scene generation that may be used as a target for training weapons systems.
- Current non-collimated field IR Scene Projection relies on crude static plywood cutouts fitted with thermal heating pads that create a "hot-spot"; these "pop-up" silhouettes represent only static aspects of vehicles and their thermal signatures.
- LWIR target systems employed at live-fire ranges lack the fidelity to support recognition, classification, and identification of the target.
- dynamic thermal attributes can be necessary to identify strategic engagement opportunities with a target.
- the current state of the art employs crude cartoon-like hot-spots based on manual placement of the thermal heating pads on or around the plywood cutout. There has been an overall failure in the art to provide any opportunity for dynamic and/or strategic identification of an optimal engagement location on the target.
- cooled lasers have been used to generate a static hot-spot in a background of a small screen, with a cryogenically cooled thermal sensor to monitor the hot-spot viewed from the back side of the screen (Cooper, Susan and Eaton, John; Spatially Resolved Surface Temperature Control Using Scanned Laser Heating. Dept. of Mechanical Engineering, Stanford University, Aug. 2002).
- the laser systems described therein were limited to a uniform hot-spot and were incapable of, and prohibitive for, dynamic scene generation and control.
- a dynamic infrared (IR) scene generation system can comprise a polyimide screen configured to react to contact by an IR laser; a laser scanning system configured to contact the polyimide screen with a laser beam, the laser scanning system comprising a laser scanner, one or more modulators, and an IR laser source; and an image processing system operably connected to a controller configured to control the laser scanning system based on data from the image processing system, wherein the laser scanning system can be configured to contact the polyimide screen with the IR laser based on an IR scene input.
- the system can comprise a closed feedback system configured to validate a thermal target signature acquired by a thermal imager.
- the thermal target signature can be a dynamic scene based on the polyimide screen reaction to contact by the IR laser.
- the image processing system can be configured to correct one or more pixel intensities of an IR scene input, wherein the image processing system can comprise one or more drivers configured to generate data based on the IR scene input, and wherein the controller can be configured to control the laser scanning system based on the data.
- the system further comprises a thermal imager operably connected to the image processing system, the thermal imager can be configured to acquire one or more images from the polyimide screen, wherein the one or more images can be configured to be adjusted by the controller based on adjustment data from the image processing system, wherein the adjustment data can be associated with the one or more acquired images.
- the laser scanning system can be configured to generate one or more thermally dynamic images on the polyimide screen, wherein each of the one or more thermally dynamic images can be targetable by a weapon system.
- the polyimide screen can be a multilayer screen comprising a polyimide film contactable by the IR laser, and a fabric backing that can be configured to dissipate heat from the laser transmitted through the polyimide film.
- the system comprises a hit detection system in operable communication with the thermal imager, the hit detection system can comprise one or more radio frequency sensors, one or more electromagnetic sensors, or one or more identification sensors.
- one or more physics-based algorithms can be executed in a computing system. The one or more physics-based algorithms can be configured to generate the IR scene input, wherein the generated IR scene input can be configured to have one or more thermally dynamic attributes based on the one or more physics-based algorithms.
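- Below is a minimal Python sketch of how such a physics-based algorithm might generate a thermally dynamic IR scene input. The region layout, temperatures, and time constants are illustrative assumptions only; the patent does not specify these values.

```python
import numpy as np

def generate_ir_scene(t, shape=(256, 256)):
    """Return a synthetic radiance/temperature frame at simulation time t (seconds)."""
    ambient = 20.0                      # assumed ambient temperature, deg C
    scene = np.full(shape, ambient, dtype=np.float32)

    # Hypothetical thermally active regions of a vehicle target signature.
    regions = {
        "engine": {"slice": (slice(180, 230), slice(40, 110)),  "t_max": 90.0, "tau": 30.0},
        "barrel": {"slice": (slice(100, 110), slice(120, 250)), "t_max": 60.0, "tau": 5.0},
    }
    for r in regions.values():
        # First-order (Newtonian) heating toward t_max with time constant tau.
        temp = ambient + (r["t_max"] - ambient) * (1.0 - np.exp(-t / r["tau"]))
        scene[r["slice"]] = temp
    return scene

frame = generate_ir_scene(t=12.0)   # e.g., 12 s into the simulated sequence
```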
- the laser scanning system can be uncooled.
- a method of dynamic infrared (IR) scene generation with an uncooled collimated laser can comprise first generating an IR scene input. Then, an image processing system generating data based on the IR scene input. Then, initiating a laser scanning system comprising a laser scanner, wherein a controller can direct operations of the laser scanning system based on the generated data. Then, contacting a polyimide screen with a laser beam directed by the laser scanner. Then, generating a target signature on the polyimide screen, wherein the laser scanner can scan the laser beam on the polyimide screen based on the IR scene input.
- a method described herein can further comprise acquiring a thermal image emitted by the polyimide screen with an IR thermal imager. Then, validating the acquired thermal image against the IR scene input, wherein if the acquired thermal image fails validation, the method can further comprise processing the acquired thermal image with the image processing system and adjusting one or more system components based on adjustment data generated by the image processing, the adjustment data can be based on the processed thermal image.
- the IR scene input comprises a predetermined sequence of one or more simulated dynamic adjustments to one or more regions of the IR scene input, wherein the one or more simulated dynamic adjustment is based on one or more changes in simulated thermal activity of the generated IR scene input.
- a method described herein can further comprise repeating the method in a sequence based on one or more predetermined adjustments in the IR scene input.
- a method described herein can further comprise modulating the laser beam with an acousto-optic modulator, and passing the modulated laser beam through one or more optics before contacting a polyimide screen with a laser beam directed by the laser scanner.
- the IR scene input can be a video comprising sequential frames
- the thermal imager acquires the sequential frames from the video target signature.
- the acquired sequential frames can be validated after all of the sequential frames are acquired, wherein the method further comprises the thermal imager storing sequential frames acquired from the IR scene input video target signature.
- a method described herein can further comprise detecting a hit with one or more sensors in operable communication with the polyimide screen, wherein a hit is detected when a weapon system acquires the target signature.
- a method described herein can further comprise the controller adjusting operation of the laser scanning system, wherein the target signature is validated against the IR scene input, wherein the operation of the laser scanning system is adjusted until the target signature passes validation.
- FIG. 1 illustrates a schematic example of an arrangement of system components and an operational flow of generating a dynamic thermal image, as described herein.
- FIG. 2 illustrates a schematic example of an arrangement of system components and an operational flow of generating a dynamic thermal image, as described herein.
- FIG. 3 illustrates a schematic example of an arrangement of system components and an operational flow of generating a dynamic thermal image, as described herein.
- FIG. 4 illustrates a schematic example of an arrangement of system components and an operational flow of generating a dynamic thermal image, as described herein.
- FIG. 5A and 5B are images illustrating examples of dynamic thermal activity of a scene generated by a system described herein.
- FIG. 6A, FIG. 6B and FIG. 6C are progressive examples of the temporal progression and dynamic thermal change of an IR scene generated by a system described herein.
- FIGS. 7A to 7D are examples of dynamic thermal transitions over time of a firing sequence simulated by IR scene generated by a system described herein.
- FIG. 8 is an example of the adjustment capabilities associated with a feedback loop system of a dynamic IR scene generation system described herein.
- FIG. 9 illustrates a schematic example of an arrangement of system components and an operational flow of generating a dynamic thermal image, as described herein.
- a dynamic infrared scene generation and projection system can comprise a screen (e.g., a thermally responsive screen) configured to be selectively activatable on contact from a light source (e.g., a laser).
- a laser scanning device can be configured to emit energy in an arrangement based on an infrared (IR) scene generated by an infrared scene generator (IRSG) and processed by an image processor system.
- the image processing system can transmit scene generation instructions to the laser scanning system.
- the laser scanning system may comprise a laser system configured to emit an IR laser that can be modulated by one or more modulation devices and transmitted through optics to the laser scanning device for dynamic activation of the thermally responsive screen according to the generated IR scene.
- An image acquisition system (e.g., an IR thermal imager) can be in operable communication with the dynamic infrared scene generation and projection system.
- the image acquisition system can be configured to acquire images displayed on the thermally responsive screen.
- the image acquisition system can be operably connected to a frame grabber configured to grab or isolate one or more frames from an acquired image for input into the image refinement/image processing system.
- acquired images and data associated with the dynamic scene generated and displayed on the thermally responsive screen can be transmitted to the image processing system.
- These images and data may be incorporated into one or more algorithms configured to validate, refine, adjust, optimize, and/or modify, etc. one or more of the systems or components described herein.
- the image acquisition system may acquire an image presented on the thermally responsive screen and transmit associated data of the image to the image processing system for comparison of the acquired image (e.g., output image) to the image and/or scene generated by the IRSG. Based on the comparison one or more system components and/or processing functions may be adjusted to ensure optimization of the output image based on the image and/or scene generated by the IRSG.
- any system described herein may produce a signal or signal product that can be displayed on the thermally responsive screen.
- a thermally responsive screen may comprise a material selected based on its electrical properties, mechanical properties, chemical properties, and/or thermal properties.
- electrical properties may comprise dielectric constant, dielectric strength, electrical resistivity, sheet resistivity, etc.
- mechanical (e.g., physical) properties may comprise elastic modulus, flexural modulus, flexural strength, maximum allowed stress, tensile strength, etc.
- thermal properties may comprise coefficient of thermal expansion, heat deflection temperature, specific heat capacity, thermal conductivity, maximum service temperature, etc.
- the screen can be a film comprising one or more polymers.
- any system described herein may comprise a polyimide film as the thermally responsive screen.
- the polyimide film can be a Kapton® polyimide film.
- the screen (e.g., thermally responsive screen) may comprise one or more Kapton® polyimide films.
- any system described herein may comprise a screen having more than one layer of material.
- any system described herein may comprise a multi-layer screen.
- Any screen described herein may have a first layer comprising a polyimide film and a second layer can be configured as a barrier or backdrop material.
- a second layer of screen may comprise a fabric configured to absorb or substantially absorb heat away from the polyimide layer.
- the composition of one or more additional layers in communication with the polyimide layer, or otherwise comprising the screen may be configured to reduce heat reflected from the one or more additional layers and prevent contamination of the image (e.g., scene and/or target) scanned by the laser on the polyimide film.
- any system described herein may comprise a screen (e.g., a polyimide screen) arranged on a frame.
- the frame may be a rigid structure configured to support and/or retain the screen in a predetermined position and location.
- the frame may provide for dynamic adjustment of the screen.
- the frame may be configured to adjust or otherwise change from a first arrangement or orientation to a second arrangement or orientation (e.g., angle, height, etc.).
- any system described herein may comprise a screen arranged to receive a continuous scan from a laser scanner.
- a polyimide film screen may be in operable communication with a laser scanner to receive and be thermally activated by the laser scanner such that a target signature is generated and/or presented on the front of the screen.
- Any system described herein may comprise a laser scanner configured to transmit a laser beam onto a polyimide screen.
- a single optical laser scanner may be configured for step-scanning a laser beam on, across and/or through the screen (e.g., polyimide film) to generate a target signature based on the scanning of the laser beam on the polyimide film.
- the laser scanner can be configured to continuously and dynamically scan a laser beam to contact the polyimide film based on the IRSG input image such that the scanning of the laser beam by the laser scanner can result in the presentation of a target signature on the polyimide film.
- the continuous scanning of the laser beam by the laser scanner can provide a dynamic target signature with one or more changing or dynamic attributes in real-time based on the IRSG and/or programmed changes to an initial IRSG input into the system for target signature development.
- a laser scanner described herein may be arranged based on the location and orientation of the screen.
- the laser scanner may be positioned at a predetermined angle relative to the face or front of the polyimide film.
- the laser scanner can be positioned to direct and scan the laser across the polyimide film at a right angle between the path of the laser and the plane of the screen.
- the angle between the path of the laser and the plane of the screen can be greater than or less than a right angle.
- the angle between the path of the laser and the plane of the screen is dynamic and changes in real time.
- Any system described herein may comprise one or more optical devices (e.g., optics) configured to direct and optically adjust the laser beam from the laser (e.g., the IR laser).
- the optics can be one or more lenses arranged to calibrate, adjust, or otherwise coordinate one or more attributes of the laser beam passing from the laser to the screen.
- the optics are operably connected and/or positioned between the laser scanner and one or more modulators (e.g., an AOM).
- the laser beam can transition through the one or more modulators, then through the optics to the laser scanner for deployment or presentation on the screen (e.g., polyimide film).
- the laser beam from the one or more modulators may be considered a modulated laser beam as it contacts and passes through the optics.
- the laser beam may be considered a collimated laser beam after it has passed through the optics.
- any system described herein may comprise optics configured to collimate the laser beam for processing, scanning, and/or transmission of the laser beam by the laser scanner to the screen.
- Any system described herein may comprise one or more modulators configured to modulate one or more attributes or characteristics of the laser beam.
- An example of a modulator may be an acousto-optic modulator (AOM), Bragg cell and/or an acousto-optic deflector (AOD).
- an AOM may be configured to modulate the diffraction, intensity, frequency, phase, polarization, etc. of a laser beam passing therethrough; it uses the acousto-optic effect to diffract and shift the frequency of light using sound waves (usually at radio frequency).
- the AOM may be configured and/or selected based on the required speeds of modulation.
- any system described herein may comprise an AOM with beam dump configured to adjust the intensity of the laser in real-time corresponding to the radiance intensity of the IRSG video pixel.
- an AOM may be configured to provide modulation frequencies greater than 200 MHz.
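- A short sketch of how a requested pixel radiance could be mapped to an AOM intensity command is shown below. The linear/gamma mapping and full-scale radiance are assumptions; in practice the transfer function would come from calibration.

```python
import numpy as np

def aom_modulation_fraction(pixel_radiance, max_radiance, gamma=1.0):
    """Map a requested pixel radiance to a 0..1 AOM diffraction-efficiency command."""
    # Normalize against an assumed full-scale radiance, clamp, and apply an
    # optional gamma correction (gamma=1.0 gives a purely linear mapping).
    return np.clip(pixel_radiance / max_radiance, 0.0, 1.0) ** gamma

# Example: drive the AOM at ~64% of full modulation for this pixel.
cmd = aom_modulation_fraction(pixel_radiance=3.2, max_radiance=5.0)
```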
- any system described herein may comprise one or more lasers.
- a laser, as described herein may be configured to generate and/or emit energy (e.g., a laser beam).
- the laser may emit a beam based on parameters and/or instruction from an image processing system, as described herein.
- the laser beam may be based on the image processing system determining the orientation, arrangement and/or configuration of the IRSG input.
- the laser may be an uncooled laser system.
- the laser may generate or otherwise emit a laser of a predetermined wattage (W).
- a laser beam may be 1 W, 5 W, 10 W, 20 W, 30 W, 40 W, 50 W, 60 W, 70 W, 80 W, 90 W, 100 W or greater.
- the laser beam may be any wattage between 0 W and 200 W.
- for example, a single, uncooled 10.6 µm laser (e.g., 80 W or less) may be used.
- a dynamic infrared scene generation and projection system may comprise a single controller configured to control the angular position of an optical laser-scanner and/or the intensity of the single laser corresponding to the radiance location and intensity from the IRSG video (e.g., input image).
- one or more controllers may comprise a hardware controller configured to receive device control data from the image processor system and control one or more system hardware components (e.g., laser scanner, modulator, laser, fans etc.).
- a hardware controller may receive device control data configured to operate one or more fans in communication with the screen (e.g., polyimide film).
- the image acquisition system described herein may comprise a thermal imager configured to acquire an image (e.g., a target signature) presented on the polyimide film.
- the thermal imager (e.g., camera system) can be an LWIR FLIR camera system.
- the image acquisition system may operate within a spectral band range.
- the spectral band range may be between 4 µm and 30 µm.
- the spectral band range may be between 8 µm and 14 µm. In some examples, the spectral band range may be any range sufficient for image acquisition and processing as described herein.
- the thermal imager (e.g., image acquisition system) may have a field of view of at least 5 degrees, at least 10 degrees, at least 15 degrees, at least 20 degrees, at least 25 degrees, at least 30 degrees, at least 35 degrees, at least 40 degrees, at least 45 degrees, at least 50 degrees or greater.
- the thermal imager may have a field of view of any angle between 1 and 100 degrees or greater.
- the thermal imager (e.g., the image acquisition system) may be configured and/or capable of radiometry functionality.
- a thermal imager described herein may be capable of acquiring, observing and/or registering an image or thermal signature having a temperature between -1000 °C and +1000 °C.
- a thermal imager described herein may be configured to acquire, observe and/or register an image or thermal signature having a temperature greater than 10 °C, greater than 50 °C, greater than 100 °C, greater than 500 °C and/or any temperature therebetween.
- any system described herein may comprise an uncooled microbolometer thermal imager.
- One or more filters may be associated with the thermal imager and configured to protect the thermal imager from laser reflectance.
- the filter can be a notch filter (e.g., a 10.6 µm notch filter).
- any system described herein may comprise a frame-grabber configured to continuously capture the camera images at the frame rate of the IRSG.
- the thermal imager may be arranged at an angle relative to the screen and/or the laser scanner.
- the thermal imager can be positioned at an angle relative to the screen to acquire a thermal image presented on the screen by the scanning from the laser scanner.
- any system described herein may comprise a thermal imager in operable communication with the FGU and the screen.
- One or more filters may be disposed between the thermal image grabber and the screen to protect the thermal imager.
- Any system described herein may comprise an image processing system (e.g., an image processor).
- An image processing system as described herein may function as an initial image processor configured to process an IRSG image and/or video input.
- a desired target signature may be developed (e.g., using IRSG physics-based development algorithms) and submitted to the image processor.
- the image processor may refine the image. For example, pixels may be adjusted or corrected based on intensity errors.
- the IRSG input image may then be transmitted to a scanner driver.
- frame data associated with the IRSG input image after being corrected may be transmitted to a scanner driver.
- the scanner driver may comprise an AOM driver configured to receive the adjusted pixel data associated with the corrected IRSG input image. After the image data has been processed by the AOM driver and/or the scanner driver, the image data is transmitted to one or more controllers (e.g., a hardware controller) to influence and instruct operation of one or more hardware components of the system.
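- A minimal sketch of this processing chain is shown below, converting a corrected IRSG frame into per-pixel device-control data for the hardware controller. The data layout and function name are illustrative assumptions, not the patent's actual software components.

```python
import numpy as np

def build_device_commands(irsg_frame, correction_map, max_radiance=5.0):
    """Convert a pixel-corrected IRSG frame into per-pixel device-control data.

    Hypothetical layout: each command pairs a scan position (row, col) with an
    AOM modulation fraction for that pixel, ready for the hardware controller.
    """
    corrected = np.clip(irsg_frame + correction_map, 0.0, max_radiance)
    commands = []
    for (row, col), radiance in np.ndenumerate(corrected):
        commands.append({
            "scan_pos": (row, col),                 # laser-scanner pixel position
            "modulation": radiance / max_radiance,  # AOM intensity command, 0..1
        })
    return commands
```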
- Any system described herein may comprise an image processing system (e.g., image processor) operably connected or in communication with a thermal imager (e.g., image acquisition system).
- the image processing system may be the same as the initial image process system.
- the image processing system may be separate or secondary to the initial image processing system.
- An image processing system in communication with the thermal imager may comprise a frame grabber in communication with the thermal imager and configured to obtain one or more frames acquired from the thermal imager. The frames may be transmitted to an image adjuster (e.g., image pixel correction component) and continue through the image processor components in a similar manner as described above (e.g., via the scanner driver and/or the AOM driver to the hardware controller).
- any system described herein may operate based on signature-producing (e.g., IRSG signal-product) technologies comprising thermal-reflective technology, thermal-emissive technology, thermal-fluorescent technology, and/or a combination thereof.
- any system described herein may produce a target signature in a thermal-emissive manner.
- a screen, of any system described herein may radiate heat after being contacted with energy from a light source (e.g., a laser).
- the emission of heat may be regulated by modulation of the energy from the laser.
- an intensity, duration, frequency, motion, and/or other configuration of the laser may impact the thermal emission from the screen (e.g., polyimide film) and can be dynamically adjusted to simulate target signature attributes.
- the thermal-emissive technology can provide a cost-effective and/or expendable target signature for use with live-fire training.
- Live-fire training may include the use of live ammunition deployed against a screen, as described here.
- the live ammunition may destroy the screen on contact. Therefore, it may be a substantial benefit to employ thermalemissive configurations of the screen for cost-effective replacement.
- the thermal-emissive design utilizes a laser to scan projected imagery onto the target screen (e.g., polyimide film).
- the screen’s polyimide film can heat up as the laser contacts it, and the heated film emits in the LWIR band.
- any system described herein can be calibrated such that the laser can be scanned across the target screen to project a recognizable scene (e.g., a T-72 tank) in the LWIR band.
- any system described herein is an uncooled system. In some examples, any system described herein is substantially uncooled. In some examples, any system described herein may comprise one or more cooling systems configured to optimize the target signature presented on the polyimide film. In some examples, any system described herein is devoid of element cooling systems or components (e.g., cryogenic or chiller coolers) other than fan-based cooling. In some examples, any system described herein is uncooled and configured to generate dynamically-changing, highly spatially-resolved, high-dynamic-range target thermal signatures in complex backgrounds with very low-cost, expendable screens for live-fire exercises or other activities where the screen might be routinely damaged or destroyed and need to be replaced at practical cost.
- any system described herein may comprise a screen cooling system.
- the screen cooling system may comprise one or more cooling devices (e.g., fans).
- a cooling system described herein may be configured to direct (e.g., cool) air flow over or at the polyimide film to sufficiently increase the convection coefficient of the film.
- a cooling system as described herein may be configured to selectively lower an intensity of the laser system within a scene (e.g., target signature) to improve thermally dynamic scene display (e.g. the dissipation of the barrel flare after firing), and/or spatially dynamic scenes.
- FIG. 1 shows a schematic diagram of an example arrangement for a dynamic infrared scene generation and projection system.
- An IR scene may include an image, multiple images, and/or a video (e.g., a sequence of compiled image frames).
- the IR scene may be generated using a physics-based algorithm relating to the subject matter of the IR scene.
- a vehicle may have a known engine configuration, weapon system, and other thermally active components.
- any system described herein may be configured to generate an IR scene input having an initial configuration that is dynamically adjustable based on pre-determined simulated changes in one or more thermal characteristics of the IR scene subject.
- the image processing system has one or more components configured to process, correct, modify, enhance, adjust, optimize, modify, or otherwise manipulate the IR scene file for processing.
- the image processing system has an image correction engine that may be configured to adjust or enhance the IR scene image (e.g., adjust or correct pixel values).
- the data from the corrected image is then transmitted to a scanner driver 215.
- the scanner driver accepts the error corrected frame data from the IR scene generated image.
- Additional drivers such as the AOM modulator driver 220 that may receive adjusted pixel data (e.g., adjusted pixel intensity data).
- the drivers may be configured to compile and package data for device control based on the processed image data through the drivers.
- the device control data is then sent to the hardware controller 225.
- the hardware controller 225 can be configured to control one or more hardware components based on the device control data output from the image processing system 200.
- the hardware controller is operably connected to the laser scanning system, and in particular to the AOM 310, laser scanner 315, and the IR laser source 320.
- the hardware controller 225 can control the AOM 310 based on pixel intensity power modulation control, or other relevant pixel data configured to determine the necessary modulation of the laser passing through the AOM.
- the hardware controller 225 may control power to the IR laser source 320 and pixel position control to the laser scanner 315.
- the hardware controller 225 is also shown operably connected to a convection fan in some examples of any system described herein.
- the laser scanning system example of FIG. 1 illustrates the IR laser source emitting the laser beam towards the modulator, then through the optics 325, then through the laser scanner 315.
- the laser beam can be modulated by the AOM and collimated by the optics before passing through the laser scanner to the polyimide film screen 400.
- the laser scanner contacts the polyimide screen 400 with the scanner adjusted collimated laser beam to generate and present the IR scene.
- the IR scene is dynamically adjustable based on the characteristics of the contact between the laser beam and the polyimide screen. For example, longer duration or increased intensity may result in increased temperature at a position on the polyimide screen and can be used to depict an increase in thermal activity of the simulated IR scene generated thereon.
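- A lumped first-order heating sketch of how dwell time and beam power could map to local screen temperature is given below; all material parameters are illustrative assumptions, not measured film properties.

```python
import numpy as np

def spot_temperature(power_w, dwell_s, absorptivity=0.9,
                     heat_capacity_j_per_k=0.002, loss_coeff_w_per_k=0.05,
                     ambient_c=20.0):
    """Estimate local film temperature after a laser dwell (first-order model)."""
    # Steady-state temperature the spot would reach under continuous illumination.
    t_steady = ambient_c + absorptivity * power_w / loss_coeff_w_per_k
    # Thermal time constant of the assumed lumped element.
    tau = heat_capacity_j_per_k / loss_coeff_w_per_k
    return ambient_c + (t_steady - ambient_c) * (1.0 - np.exp(-dwell_s / tau))

# Longer dwell or higher power -> higher local temperature (brighter LWIR pixel).
t1 = spot_temperature(power_w=1.0, dwell_s=0.001)
t2 = spot_temperature(power_w=2.0, dwell_s=0.002)
```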
- a thermal imager 500 acquires a screen emitted thermal image.
- the thermal imager may be coupled to one or more filters 510 (e.g., a notch filter) that can be configured to protect the thermal imager from the screen emitted thermal image.
- the thermal imager is operably connected to the frame grabber as a secondary pathway for introduction of the acquired thermal image for validation against the initial IR scene.
- the acquired thermal image may result in additions and/or adjustments in the processing of the image data, which can result in adjustment of laser scanning system components by the hardware controller based on the differences.
- FIG. 2 illustrates a similar configuration to FIG. 1 with variations of the controller configuration.
- the laser scanner system may be controlled by more than one controller.
- the AOM 310 and the laser scanner 315 are controlled by an AOM/scanner controller, while the IR laser source is controlled by a power overdrive 230 component in the image processing system 200.
- the power overdrive component may receive data from the pixel correction and initiate the IR laser source 320 upon receipt of the image data.
- the image processing system 200 example in FIG. 2 shows a frame cache component 235 that can be configured to compare against previous frame data and store current frame data.
- FIG. 2 illustrates an example of an image registration mechanism 240 that can be configured to register images acquired by the thermal imager prior to subjecting the images to the image processing system for adjustment.
- FIG. 1 and FIG. 2 also illustrate a beam dump component 325 in communication with the AOM and configured to receive waste laser energy after modulation by the AOM.
- FIG. 3 illustrates another arrangement example of a dynamic infrared scene generation and projection system, as described herein.
- the laser scanner 315 is illustrated in an offset position relative to the thermal imager 500. Accordingly, the laser scanner 315 may adjust scanning of the image to compensate for the offset position to provide a functional and appropriate orientation of the IR scene to be generated on the polyimide screen 400.
- the IR scene generator 100 is operably connected directly to the AOM 310.
- the IR scene generator 100 is also coupled to a processor 110 that receives data from the IR scene generator and the frame grabber 115.
- the optics and beam dump 325 are positioned near one another, allowing for a parallel direction of the laser from the AOM 310.
- FIG. 4 illustrates an example arrangement of a dynamic infrared scene generation and projection system.
- the IR scene generator 100 is connected in a loop with the processor 110, whereby pixel correction data is fed back to the IR scene generator for modification.
- the IR scene generator is also operably connected directly to the IR laser source 320, the AOM 310, and the controller 226.
- the frame grabber 115 can be configured to transmit acquired thermal image data to the processor and then to the IR scene generator for modification on an initial output from the IR scene generator.
- the IRSG is based on physics or known attributes, activities, and/or operation of the intended target signature to be generated.
- a T-72 tank may be the subject of an IRSG and operation of the tank may be input into one or more algorithms associated with the IRSG image generation. Firing sequences, movement, engine operation, users, and any other element that may impact thermal attributes of an intended target signature for IRSG may be considered in the generation of the IRSG image.
- the IRSG is generated using one or more algorithms executable within a computer-based system. For example, parameters of IRSG input target signatures may be provided in a library associated with the target signature to be generated.
- any system described herein may scan an image of a dynamic target signature on the screen based on the thermal characteristics and real-time dynamic changes associated with the simulated target signature function.
- the IRSG can be a dynamic, physics-based IR scene generator.
- FIGS. 5A and 5B illustrate examples of dynamic changes in the thermal attributes of a target signature.
- a tank is shown having various thermally active regions.
- the rear of the tank is simulated to be bright and would be associated with a thermally active engine. This may provide a strategic targetable region of the tank that can be more susceptible to attack.
- the barrel of the tank can be seen bright to represent a simulated period during or after the firing sequence, where a projectile travels through the barrel and the combustion of the munition combined with friction has increased the thermal characteristics of the barrel. This capability can provide substantial intelligence on target identification and profiling.
- FIGS. 7A to 7D illustrate another example of a dynamic transition between periods of time for an IR scene generated by any system described herein.
- the target signature is again a tank, now shown having fired a projectile.
- the heat generated by a tank firing may be difficult to observe.
- any system described herein can be configured and capable of illustrating a firing sequence and residual thermal signatures (e.g., residual heat from an expelled projectile).
- any system described herein operates through a method comprising the function and interaction between one or more hardware components and one or more algorithms.
- the step of initialization/image registration comprises the IRSG sending a test pattern image to an image processor.
- a FLIR sends the projected test pattern image to the image processor.
- the FLIR is uncooled.
- the image processor computes and stores an image transformation matrix which auto-registers the IRSG input image with its corresponding (off-axis) projected image, for unambiguous determination of a one-to-one pixel mapping between the two image spaces.
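- A minimal sketch of applying such a stored transformation is shown below, using OpenCV to warp an off-axis FLIR capture back into the IRSG image space. The function name and OpenCV choice are assumptions; the patent only specifies that a stored transformation matrix provides the one-to-one pixel mapping.

```python
import cv2

def register_captured_frame(captured, homography, irsg_shape):
    """Warp an off-axis FLIR capture into the IRSG image space using the
    3x3 transformation matrix stored at initialization."""
    h, w = irsg_shape
    # dsize is (width, height) for cv2.warpPerspective.
    return cv2.warpPerspective(captured, homography, (w, h))

# Example usage with a stored 3x3 homography H and a captured FLIR frame:
# registered = register_captured_frame(flir_frame, H, irsg_frame.shape[:2])
```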
- a method of generating a dynamic infrared scene can include generation and projection of an associated target signature.
- the method may comprise first, generating a scene.
- Generating a scene may comprise determining the desired target signature to be generated and presented.
- the scene may be generated using one or more physics based algorithms.
- Then, using the generated scene to develop corresponding scanner instructions.
- the IRSG that was first developed may be submitted to the image processing system for data generation to instruct the operation of system hardware in the presentation of the target signature on the polyimide film.
- the laser can be initiated and the laser scanning process can be initiated.
- a method of generating a dynamic infrared scene can include capturing a lased (e.g., laser scanned) imagery with a thermal imager, as described herein.
- the captured imagery can then be displayed and validated against the initial scene input.
- the initial scene may be compared against the captured image for appropriate homography between the input image and the captured image.
- the steps of capturing the lased image may be continuously repeated to compare changes in the dynamic presentation of the lased image on the screen.
- the comparison may support or require modification of one or more hardware components based on the image input from the thermal imager into the image processing system. For example, if a captured image is rejected, the image processing system may adjust instructions to the hardware controller to correct any deficiencies and/or support homology between the IRSG and the captured image.
- the instructions from the image refinement/correction loop can be repeated until the program is halted.
- a method of generating a dynamic infrared scene can include image registration.
- a method of generating a dynamic infrared scene can include generating a scene in OSV, then developing scene data based on the generated scene, where the data may correspond to scanner instructions. Then, activating the laser and initiating the scanning procedure.
- the method may comprise the feedback loop including capturing lased imagery with thermal imager. Then, applying registration transformation to captured imagery. Then, displaying the registered imagery.
- adding registration capability to this run loop may require use of a homography (transformation matrix) that can be calculated in advance.
- the run loop for finding this homography can comprise generating an input image. Then, developing corresponding scanner instructions based on the generated input image. Then, activating the laser and initiating the scanning procedure. Then, capturing lased imagery with the thermal imager. Then, saving the captured imagery. Then, validating and/or registering the image by comparing the input image against the reference image. After a valid registration has been found, the homography in that registration can be stored and applied to any other image captured by the thermal imager from the same position.
- the generated IR scene may account for target dynamic state (e.g., engine state, motion state/frictional heating, etc.) and environmental conditions (e.g., time of day, time of year, etc.).
- the uncooled FLIR images received by the IR thermal imager are captured by the frame grabber, converted to radiance images of the same format as those coming from the IRSG, and sent to the image processor for comparison.
- the image processor automatically performs image registration using the transformation stored during initialization, and then compares the two images (original from IRSG and projected from frame grabber), generating a 2D radiance difference map. This difference map is then applied to the next frame of synthetic imagery from the IRSG as a correction, and the result passed to the AOM for beam modulation.
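- Below is a short sketch of forming the 2D radiance difference map and applying it to the next synthetic frame. The damping gain is an assumption added to keep the correction stable; the patent simply describes applying the difference as a correction.

```python
import numpy as np

def apply_radiance_correction(registered_capture, irsg_frame, next_irsg_frame, gain=1.0):
    """Difference the original and projected images, then correct the next frame."""
    diff_map = irsg_frame - registered_capture          # desired minus measured radiance
    corrected_next = next_irsg_frame + gain * diff_map  # feed-forward correction
    return np.clip(corrected_next, 0.0, None)           # radiance cannot be negative
```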
- the laser scanner’s scan angle and the AOM’s modulation fraction can be coordinated within the scanner driver (i.e. software). This may ensure that the beam intensity corresponding to a particular scan angle (screen position) spatially correlates with the proper input image pixel. Accordingly, the hardware controller can coordinate raster scanning.
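- The sketch below pairs each scan angle (screen position) with the modulation fraction of the corresponding input-image pixel, in a boustrophedon raster order. The plan format and the assumption that the scan-angle grids match the image dimensions are illustrative, not taken from the patent.

```python
def raster_scan_plan(frame, x_angles, y_angles, max_radiance=5.0):
    """Coordinate scan angles with per-pixel AOM modulation fractions."""
    plan = []
    for row, y in enumerate(y_angles):
        # Alternate scan direction on each row (serpentine raster).
        cols = range(len(x_angles)) if row % 2 == 0 else reversed(range(len(x_angles)))
        for col in cols:
            plan.append((x_angles[col], y, frame[row][col] / max_radiance))
    return plan
```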
- a feedback loop for any system described herein may comprise driving per pixel corrections to the IRSP image within a specified threshold, accommodating image responses corresponding to dynamic state changes of the simulated vehicle.
- any system described herein may comprise adjustment and refinement of any of the components and/or associated hardware configurations to account for such environmental influence.
- the adjusted components may include a laser scanner and modulator (e.g., an AOM).
- the adjusted components may also include a scene generation unit (e.g., a Chimaera Scene Generation Unit).
- Any adjustment in response to environmental influences on the target signature production can be sufficient to address and calibrate relative to the negative impact of environmental influence on the target signature produced on the polyimide film by any system described herein.
- a closed-loop system as described herein may comprise a scene generation unit (SGU) (e.g., a Chimaera SGU), a laser scanning modulation system (LSMS), an LWIR FLIR camera, and a Frame Grabber Unit (FGU) (e.g., an image acquisition system such as a Chimaera FGU).
- the FGU may comprise and/or operate with Radiometric Adjustment Feedback Software (RAFS).
- a method of generating a dynamic infrared scene can include generation and projection of an associated target signature using a closed-loop feedback system, with steps comprising the SGU sending the desired synthetic target/background radiometric image to the LSMS and the FGU RAFS. Then, the LSMS accepts the SGU image and controls the laser power and scan pattern to form the thermal emissive pattern. Then, the LWIR FLIR can constantly sense the thermal signature of the screen (e.g., at 30 Hz) and relay it to a frame grabber card in the FGU. Then, the FGU captures the FLIR images (e.g., at 30 Hz).
- the FGU RAFS differences the SGU-requested image and the FLIR/FGU-captured image to form a delta matrix (a per-pixel gain and level array). Then, the FGU can send the delta matrix to the SGU to apply to the desired target thermal emission image.
- the closed feedback loop method can then repeat at each frame update (e.g., at each 30Hz frame update), thereby iteratively maintaining the correct radiometric signature, even in the presence of environmental influences on the screen.
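- A skeleton of this closed feedback loop is sketched below. The component objects and their method names are illustrative placeholders for the SGU, LSMS, FLIR, and FGU/RAFS roles described above.

```python
import time

def closed_loop_run(sgu, lsms, flir, fgu, frame_rate_hz=30.0, n_frames=300):
    """Iterate the SGU -> LSMS -> FLIR -> FGU/RAFS -> SGU loop at the frame rate."""
    period = 1.0 / frame_rate_hz
    for _ in range(n_frames):
        desired = sgu.next_frame()                       # SGU: desired radiometric image
        lsms.project(desired)                            # LSMS: laser power + scan pattern
        captured = fgu.capture(flir)                     # FGU grabs the FLIR frame
        delta = fgu.rafs_difference(desired, captured)   # per-pixel gain/level delta matrix
        sgu.apply_delta(delta)                           # SGU corrects the next frame
        time.sleep(period)                               # e.g., 30 Hz frame update
```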
- a closed loop system described herein adjusts one or more system components and/or associated component configurations. For example, an image is generated and run through the closed loop system for comparison against the input image; an algorithm then considers differences between the generated image and the input image. Based on the differences, the elements may be physically adjusted in position relative to the Kapton® film, and/or the laser adjusts the image generation according to the results of the closed-loop comparison to correct for defects.
- the closed loop system may include identifying one or more regions or elements (e.g., key points) within an image (e.g., an input image/output image) to relate or align the output image and the input image. For example, to correctly determine the transformation used to register an image, there may be a need for one or more (e.g., at least 4) corresponding key points from each image. More key points may result in a better transformation.
- FIG. 8 illustrates an example of adjustment of an image generated by any system described herein.
- the image on left may be the image presented on the polyimide screen after initial IR scene generation and laser scanning.
- a thermal imager may be configured to acquire the thermal image (e.g., the image on the left) and input associated data into the image processing system for adjustment.
- the image processing system can be configured to adjust the controller, which in turn can adjust one or more components of the laser scanner system to correct and optimize the image, as illustrated by the corrected image on the right.
- key point selection can be optimized such that corresponding key points are correct mathematical best-matches based on their descriptors.
- the key points can be fiducial markers, i.e., reference points in an imaging system that can be used for measurement (QR codes and barcodes are common examples).
- a method of generating a dynamic infrared scene, including generation and projection, can comprise a registration algorithm comprising detecting and describing one or more keypoints of one or more images. Then, eliminating one or more keypoints below a defined threshold of their descriptors. Then, comparing one or more remaining keypoints from each image. Then, calculating the homography using corresponding points. Then, applying the homography to the input image to align it with the reference image.
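- A minimal sketch of that registration algorithm follows. ORB features, a Lowe-style ratio threshold, and RANSAC are assumed implementation choices; the patent names the steps but not a specific detector or matcher.

```python
import cv2
import numpy as np

def find_registration_homography(input_img, reference_img, ratio=0.75):
    """Detect/describe keypoints, threshold matches by descriptor distance,
    compare remaining keypoints, and compute the homography."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(input_img, None)
    kp2, des2 = orb.detectAndCompute(reference_img, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des1, des2, k=2)
    # Keep only matches whose best descriptor distance clearly beats the runner-up.
    good = [m for m, n in matches if m.distance < ratio * n.distance]
    if len(good) < 4:                      # at least 4 corresponding key points needed
        return None

    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H   # apply with cv2.warpPerspective to align the input to the reference
```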
- any system described herein may perform a method including the dynamic heating or transition of thermal changes for a target signature simulation presented on the screen.
- the OSV scene can be configured such that the only activated thermal regions at the start state are the tank’s engine and exhaust.
- the scene can be updated in real-time to heat the wheels and then the artillery.
- for the first 10 seconds, the only active thermal regions are the engine and exhaust; the wheels are then activated and allowed to heat up for another 10 seconds.
- the artillery begins to heat up.
- This dynamic heating can provide a simulated function and strategic targeting opportunities for weapons systems being trained.
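- A scripted heating schedule like the one described above could be sketched as follows. The region names and 10-second intervals follow the example; the activation logic is an assumption.

```python
def active_regions(t_seconds):
    """Return which thermal regions of the simulated tank are active at time t."""
    regions = ["engine", "exhaust"]          # active from the start state
    if t_seconds >= 10.0:
        regions.append("wheels")             # wheels begin heating after 10 s
    if t_seconds >= 20.0:
        regions.append("artillery")          # artillery begins heating after another 10 s
    return regions

print(active_regions(25.0))   # ['engine', 'exhaust', 'wheels', 'artillery']
```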
- any system described herein may comprise a housing configured to house one or more of any of the system components.
- a housing may be configured to house the laser, laser scanner, one or more modulators, image acquisition system, image processing system components, optics, etc.
- any system described herein is configured to generate and present a dynamic IR scene (e.g., target signature) on a screen for use with live firing training or operations.
- Any system described herein may comprise a hit detection system associated with live firing weapons training.
- a target signature may be presented on a screen (e.g., polyimide film) based on contact from a laser beam that has been modulated after a laser device has received instructions from an image processing system based on a generated or desired IR scene input.
- the target signature may include one or more dynamic elements (e.g., a heating barrel, engine and/or tires) that may be considered in the weapons training for strategic targeting.
- a weapons system may be configured to acquire the target signature in a manner similar to target acquisition of a real (e.g., non-simulated/non-generated target) and deploy ammunition to the screen in a similar manner to deploying ammunitions on a real target.
- the screen may be destroyed as a result of the live fire training and replacement of the screen can allow the system to re-present another target signature.
- any system described herein is configured to generate and present a dynamic IR scene (e.g., target signature) on a screen for use with non-live firing training or operations.
- Any system described herein may comprise a hit detection system associated with non-live firing weapons training.
- Non-live firing training or operations may comprise the use of weapons targeting systems and/or other target acquisition systems and methods that can provide for acquisition of a target and simulated deployment of munitions.
- a target signature may be presented on a screen (e.g., polyimide film) based on contact from a laser beam that has been modulated after a laser device has received instructions from an image processing system based on a generated or desired IR scene input.
- the target signature may include one or more dynamic elements (e.g., a heating barrel, engine and/or tires) that may be considered in the weapons training for strategic targeting.
- a weapons system may be configured to acquire the target signature in a manner similar to target acquisition of a real (e.g., non-simulated/non-generated) target and deploy an acquisition element (e.g., a laser guidance system, radar, sonar, etc.) to the screen in a similar manner to acquiring a real target.
- the screen may respond to the weapon targeting system to indicate a hit or positive acquisition of the target.
- any system described herein may comprise one or more sensory systems, such as one or more radio frequency (RF) sensors, an identification system (e.g., an identification friend or foe (IFF) system), one or more electromagnetic (EM) sensors, and/or one or more IR sensors.
- the one or more sensory systems may be configured to register or otherwise identify activity of the target acquisition system relative to the target signature presented on the polyimide film. An alert or other form of indication relating to the positive acquisition of the target signature by the target acquisition system may be triggered.
- a negative acquisition feedback may be provided by a system described herein.
- FIG. 9 illustrates an example of a dynamic infrared scene generation and projection system having the hit detection system 800 with a plurality of sensors 801, 802, 803 as described herein in operable communication with the polyimide screen.
- the sensors can be configured to detect a hit or other engagement of the target signature on the screen.
- the thermal imager can depict, and the system may be configured to simulate, damage based on the specific characteristics of the detected hit.
- any system described herein may include a hit detection system configured to detect and indicate or otherwise provide real-time information relating to the efficacy and/or damage delivered by the weapon system used on the target signature.
- the image acquisition system may be configured to acquire an image and/or data related to damage and/or the impact of the weapons system deployed on the target signature (e.g., presented on the polyimide film).
- the image capture system may transmit acquired/observed results of the weapon system impact on the target signature to the image processing system, which can be configured to provide simulated details of an impact by the weapon system.
- any of the methods (including user interfaces) described herein may be implemented as software, hardware or firmware, and may be described as a non-transitory computer-readable storage medium storing a set of instructions capable of being executed by a processor (e.g., computer, tablet, smartphone, etc.), that when executed by the processor causes the processor to perform any of the steps, including but not limited to: displaying, communicating with the user, analyzing, modifying parameters (including timing, frequency, intensity, etc.), determining, alerting, or the like.
- any of the methods described herein may be performed, at least in part, by an apparatus including one or more processors having a memory storing a non-transitory computer-readable storage medium storing a set of instructions for the process(es) of the method.
- computing devices and systems described and/or illustrated herein broadly represent any type or form of computing device or system capable of executing computer-readable instructions, such as those contained within the modules described herein.
- these computing device(s) may each comprise at least one memory device and at least one physical processor.
- “memory” or “memory device,” as used herein, generally represents any type or form of volatile or non-volatile storage device or medium capable of storing data and/or computer-readable instructions.
- a memory device may store, load, and/or maintain one or more of the modules described herein.
- Examples of memory devices comprise, without limitation, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, Hard Disk Drives (HDDs), Solid-State Drives (SSDs), optical disk drives, caches, variations or combinations of one or more of the same, or any other suitable storage memory.
- “processor” or “physical processor,” as used herein, generally refers to any type or form of hardware-implemented processing unit capable of interpreting and/or executing computer-readable instructions.
- a physical processor may access and/or modify one or more modules stored in the above-described memory device.
- Examples of physical processors comprise, without limitation, microprocessors, microcontrollers, Central Processing Units (CPUs), Field-Programmable Gate Arrays (FPGAs) that implement softcore processors, Application-Specific Integrated Circuits (ASICs), portions of one or more of the same, variations or combinations of one or more of the same, or any other suitable physical processor.
- the method steps described and/or illustrated herein may represent portions of a single application.
- one or more of these steps may represent or correspond to one or more software applications or programs that, when executed by a computing device, may cause the computing device to perform one or more tasks, such as the method step.
- one or more of the devices described herein may transform data, physical devices, and/or representations of physical devices from one form to another. Additionally or alternatively, one or more of the modules recited herein may transform a processor, volatile memory, non-volatile memory, and/or any other portion of a physical computing device from one form of computing device to another form of computing device by executing on the computing device, storing data on the computing device, and/or otherwise interacting with the computing device.
- “computer-readable medium” generally refers to any form of device, carrier, or medium capable of storing or carrying computer-readable instructions.
- Examples of computer-readable media comprise, without limitation, transmission-type media, such as carrier waves, and non-transitory-type media, such as magnetic-storage media (e.g., hard disk drives, tape drives, and floppy disks), optical-storage media (e.g., Compact Disks (CDs), Digital Video Disks (DVDs), and BLU-RAY disks), electronic- storage media (e.g., solid-state drives and flash media), and other distribution systems.
- Any system described herein may comprise one or more power sources or power systems operably connected to one or more or any components requiring power for operation. Controllers, modulators, regulators, may be included as necessary or beneficial for the functional operation of the system and/or any of the system components.
- the processor as described herein can be configured to perform one or more steps of any method disclosed herein. Alternatively or in combination, the processor can be configured to combine one or more steps of one or more methods as disclosed herein.
- the device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
- the terms “upwardly”, “downwardly”, “vertical”, “horizontal” and the like are used herein for the purpose of explanation only unless specifically indicated otherwise.
- first and second may be used herein to describe various features/elements (including steps), these features/elements should not be limited by these terms, unless the context indicates otherwise. These terms may be used to distinguish one feature/element from another feature/element.
- a first feature/element discussed below could be termed a second feature/element
- a second feature/element discussed below could be termed a first feature/element without departing from the teachings of the present invention.
- any of the apparatuses and methods described herein should be understood to be inclusive, but all or a sub-set of the components and/or steps may alternatively be exclusive, and may be expressed as “consisting of” or alternatively “consisting essentially of” the various components, steps, sub-components or sub-steps.
- a numeric value may have a value that is +/- 0.1% of the stated value (or range of values), +/- 1% of the stated value (or range of values), +/- 2% of the stated value (or range of values), +/- 5% of the stated value (or range of values), +/- 10% of the stated value (or range of values), etc.
- Any numerical values given herein should also be understood to include about or approximately that value, unless the context indicates otherwise. For example, if the value "10" is disclosed, then “about 10" is also disclosed. Any numerical range recited herein is intended to include all sub-ranges subsumed therein.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Theoretical Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Computer Hardware Design (AREA)
- Mechanical Optical Scanning Systems (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
Abstract
A dynamic infrared (IR) scene generation system comprising a polyimide screen configured to react to contact by an IR laser; a laser scanning system configured to contact the polyimide screen with a laser beam, the laser scanning system comprising a laser scanner, one or more modulators, and an IR laser source; and an image processing system operably connected to a controller configured to control the laser scanning system based on data from the image processing system, wherein the laser scanning system can be configured to contact the polyimide screen with the IR laser based on an IR scene input.
Description
DYNAMIC INFRARED SCENE GENERATION AND PROJECTION SYSTEM AND METHODS
CLAIM OF PRIORITY
[0001] The present application claims priority to U.S. Provisional Patent Application No. 63/225,929 entitled “UNCOOLED LIVE-FIRE INFRARED SCENE PROJECTION SYSTEM” filed on July 26, 2021, the contents of which are hereby incorporated in their entirety.
INCORPORATION BY REFERENCE
[0002] All publications and patent applications mentioned in this specification are herein incorporated by reference in their entirety to the same extent as if each individual publication or patent application was specifically and individually indicated to be incorporated by reference.
STATEMENT AS TO FEDERALLY SPONSORED RESEARCH
[0003] This invention was made with government support under SBIR contract number W900KK19C0047 awarded by the Department of Defense, Army Contracting Command. The government has certain rights in the invention.
FIELD
[0004] The present invention relates to the field of thermal sensor scene projection. In particular, dynamic Infrared Scene Projection (IRSP) for targeting systems and weapons training can operate using one or more characteristics of a proposed target.
BACKGROUND
[0005] Infrared scene projection systems perform a vital role in the testing of tactical and theatre weapons systems. Weapon targeting systems often rely on one or more attributes of a proposed target (e.g., thermal signatures). Technological advancements have been limited and fail to provide a solution for providing dynamic and/or unique scene generation that may be used as a target for training weapons systems. Current non-collimated field IR Scene Projection relies on crude static plywood cutouts with thermal heating pads affixed to them to create a “hot-spot,” presented as “pop-up” silhouettes to represent static aspects of vehicles and their thermal signatures. These current Long Wave Infrared (LWIR) target systems employed at live-fire ranges lack the fidelity to support recognition, classification, and identification of the target.
[0006] In addition to a scene visual signature, dynamic thermal attributes can be necessary to identify strategic engagement opportunities with a target. The current state of the art employs crude cartoon-like hot-spots based on manual placement of the thermal heating pads on or around the plywood cutout. There has been an overall failure in the art to provide any opportunity for dynamic and/or strategic identification of an optimal engagement location on the target.
[0007] Attempts have been made to generate an image using a laser system in place of the thermal heating pads. For example, cooled lasers have been used to generate a static hot-spot in a background of a small screen, with a cryogenically cooled thermal sensor to monitor the hot-spot viewed from the back side of the screen (Cooper, Susan and Eaton, John; Spatially Resolved Surface Temperature Control Using Scanned Laser Heating. Dept. of Mechanical Engineering of Stanford University. Aug. 2002). The laser systems described therein were limited to a uniform hot-spot, incapable and prohibitive of dynamic scene generation and control. Additionally, Cooper and the current state of the art fail to provide expendable and/or low-cost, “life size” target screens that do not require the use of collimators, resolved high-spatial-resolution targets in complex backgrounds, dynamically changing target thermal signatures, automatic image registration for closed-loop image refinement, front-screen viewing for “live fire” use, and/or “hit” detection systems, etc.
[0008] For these reasons, it would be desirable to provide improved systems and methods for dynamic target and scene generation. It would be particularly desirable to provide cost-effective systems and assemblies for realistic life-size dynamic scene generation. At least some of these objectives will be met by the various embodiments that follow.
SUMMARY OF THE DISCLOSURE
[0009] Described herein are systems and methods for dynamic infrared scene generation and projection.
[0010] Generally, a dynamic infrared (IR) scene generation system can comprise a polyimide screen configured to react to contact by an IR laser; a laser scanning system configured to contact the polyimide screen with a laser beam, the laser scanning system comprising a laser scanner, one or more modulators, and an IR laser source; and an image processing system operably connected to a controller configured to control the laser scanning system based on data from the image processing system, wherein the laser scanning system can be configured to contact the polyimide screen with the IR laser based on an IR scene input.
[0011] In some examples, the system can comprise a closed feedback system configured to validate a thermal target signature acquired by a thermal imager. The thermal target signature
can be a dynamic scene based on the polyimide screen reaction to contact by the IR laser. In some examples, the image processing system can be configured to correct one or more pixel intensities of an IR scene input, wherein the image processing system can comprise one or more drivers configured to generate data based on the IR scene input, wherein the controller can be configured to control the laser scanning system based on the data.
[0012] In some examples, the system further comprises a thermal imager operably connected to the image processing system, the thermal imager can be configured to acquire one or more images from the polyimide screen, wherein the one or more images can be configured to be adjusted by the controller based on adjustment data from the image processing system, wherein the adjustment data can be associated with the one or more acquired images. The laser scanning system can be configured to generate one or more thermally dynamic images on the polyimide screen, wherein each of the one or more thermally dynamic images can be targetable by a weapon system.
[0013] In some examples, the polyimide screen can be a multilayer screen comprising a polyimide film contactable by the IR laser, and a fabric backing that can be configured to dissipate heat from the laser transmitted through the polyimide film. In some examples, the system comprises a hit detection system in operable communication with the thermal imager, the hit detection system can comprise one or more radio frequency sensors, one or more electromagnetic sensors, or one or more identification sensors. In some examples, one or more physics-based algorithms can be executed in a computing system. The one or more physics-based algorithms can be configured to generate the IR scene input, wherein the generated IR scene input can be configured to have one or more thermally dynamic attributes based on the one or more physics-based algorithms. In some examples, the laser scanning system can be uncooled. [0014] Generally, a method of dynamic infrared (IR) scene generation with an uncooled collimated laser can comprise first generating an IR scene input. Then, an image processing system generating data based on the IR scene input. Then, initiating a laser scanning system comprising a laser scanner, wherein a controller can direct operations of the laser scanning system based on the generated data. Then, contacting a polyimide screen with a laser beam directed by the laser scanner. Then, generating a target signature on the polyimide screen, wherein the laser scanner can scan the laser beam on the polyimide screen based on the IR scene input.
[0015] In some examples, a method described herein can further comprise acquiring a thermal image emitted by the polyimide screen with an IR thermal imager. Then, validating the acquired thermal image against the IR scene input, wherein if the acquired thermal image fails validation, the method can further comprise processing the acquired thermal image with the
image processing system and adjusting one or more system components based on adjustment data generated by the image processing, the adjustment data can be based on the processed thermal image.
[0016] In some examples, the IR scene input comprises a predetermined sequence of one or more simulated dynamic adjustments to one or more regions of the IR scene input, wherein the one or more simulated dynamic adjustments are based on one or more changes in simulated thermal activity of the generated IR scene input. In some examples, a method described herein can further comprise repeating the method in a sequence based on one or more predetermined adjustments in the IR scene input. In some examples, a method described herein can further comprise modulating the laser beam with an acousto-optic modulator, and passing the modulated laser beam through one or more optics before contacting a polyimide screen with a laser beam directed by the laser scanner.
[0017] In some examples, the IR scene input can be a video comprising sequential frames, wherein the thermal imager acquires the sequential frames from the video target signature. In some examples, the acquired sequential frames can be validated after all of the sequential frames are acquired, wherein the method further comprises the thermal imager storing sequential frames acquired from the IR scene input video target signature.
[0018] In some examples, a method described herein can further comprise detecting a hit with one or more sensors in operable communication with the polyimide screen, wherein a hit is detected when a weapon system acquires the target signature.
[0019] In some examples, a method described herein can further comprise the controller adjusting operation of the laser scanning system, wherein the target signature is validated against the IR scene input, wherein the operation of the laser scanning system is adjusted until the target signature passes validation.
[0020] All of the methods and apparatuses described herein, in any combination, are herein contemplated and can be used to achieve the benefits as described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] A better understanding of the features and advantages of the methods and apparatuses described herein will be obtained by reference to the following detailed description that sets forth illustrative embodiments, and the accompanying drawings of which:
[0022] FIG. 1 illustrates a schematic example of an arrangement of system components and an operational flow of generating a dynamic thermal image, as described herein.
[0023] FIG. 2 illustrates a schematic example of an arrangement of system components and an operational flow of generating a dynamic thermal image, as described herein.
[0024] FIG. 3 illustrates a schematic example of an arrangement of system components and an operational flow of generating a dynamic thermal image, as described herein.
[0025] FIG. 4 illustrates a schematic example of an arrangement of system components and an operational flow of generating a dynamic thermal image, as described herein.
[0026] FIGS. 5A and 5B are images illustrating examples of dynamic thermal activity of a scene generated by a system described herein.
[0027] FIG. 6A, FIG. 6B and FIG. 6C are progressive examples of the temporal progression and dynamic thermal change of an IR scene generated by a system described herein.
[0028] FIGS. 7A to 7D are examples of dynamic thermal transitions over time of a firing sequence simulated by IR scene generated by a system described herein.
[0029] FIG. 8 is an example of the adjustment capabilities associated with a feedback loop system of a dynamic IR scene generation system described herein.
[0030] FIG. 9 illustrates a schematic example of an arrangement of system components and an operational flow of generating a dynamic thermal image, as described herein.
DETAILED DESCRIPTION
[0031] A dynamic infrared scene generation and projection system can comprise a screen (e.g., a thermally responsive screen) configured to be selectively activatable on contact from a light source (e.g., a laser). A laser scanning device can be configured to emit energy in an arrangement based on an infrared (IR) scene generated by an infrared scene generator (IRSG) and processed by an image processor system. The image processing system can transmit scene generation instructions to the laser scanning system. The laser scanning system may comprise a laser system configured to emit an IR laser that can be modulated by one or more modulation devices and transmitted through optics to the laser scanning device for dynamic activation of the thermally responsive screen according to the generated IR scene.
[0032] An image acquisition system (e.g., an IR thermal imager) can be in operable communication with the dynamic infrared scene generation and projection system. The image acquisition system can be configured to acquire images displayed on the thermally responsive screen. The image acquisition system can be operably connected to a frame grabber configured to grab or isolate one or more frames from an acquired image for input into the image refinement/image processing system. For example, acquired images and data associated with the dynamic scene generated and displayed on the thermally responsive screen can be transmitted to the image processing system. These images and data may be incorporated into one or more algorithms configured to validate, refine, adjust, optimize, and/or modify, etc. one or more of the systems or components described herein.
[0033] In some examples, the image acquisition system may acquire an image presented on the thermally responsive screen and transmit associated data of the image to the image processing system for comparison of the acquired image (e.g., output image) to the image and/or scene generated by the IRSG. Based on the comparison one or more system components and/or processing functions may be adjusted to ensure optimization of the output image based on the image and/or scene generated by the IRSG.
[0034] In some examples, any system described herein may produce a signal or signal product that can be displayed on the thermally responsive screen. In some examples, the thermally responsive screen may comprise a material selected based on electrical properties, mechanical properties, chemical properties and/or thermal properties. For example, electrical properties may comprise dielectric constant, dielectric strength, electrical resistivity, sheet resistivity, etc. For example, mechanical (e.g., physical) properties may comprise elastic modulus, flexural modulus, flexural strength, maximum allowed stress, tensile strength, etc. For example, thermal properties may comprise coefficient of thermal expansion, heat deflection temperature, specific heat capacity, thermal conductivity, maximum service temperature, etc. [0035] In some examples, the screen can be a film comprising one or more polymers. For example, any system described herein may comprise a polyimide film as the thermally responsive screen. An example of a suitable polyimide film is a Kapton® polyimide film. In some examples, the screen (e.g., thermally responsive screen) comprises Kapton® polyimide films.
[0036] In some examples, any system described herein may comprise a screen having more than one layer of material. For example, any system described herein may comprise a multi-layer screen. Any screen described herein may have a first layer comprising a polyimide film and a second layer can be configured as a barrier or backdrop material. For example, a second layer of screen may comprise a fabric configured to absorb or substantially absorb heat away from the polyimide layer. The composition of one or more additional layers in communication with the polyimide layer, or otherwise comprising the screen, may be configured to reduce heat reflected from the one or more additional layers and prevent contamination of the image (e.g., scene and/or target) scanned by the laser on the polyimide film. Additional layers may be included based on thermal properties of the additional layer material to reduce or eliminate reflection of heat passing through the polyimide layer. For example, one or more additional layers may be configured to absorb or otherwise draw energy (e.g., heat) from the laser scanner that has passed through the polyimide layer to prevent inadvertent heating of the polyimide layer from the side of the polyimide layer in communication with the one or more additional layers.
[0037] In some examples, any system described herein may comprise a screen (e.g., a polyimide screen) arranged on a frame. The frame may be a rigid structure configured to support and/or retain the screen in a predetermined position and location. In some examples, the frame may provide for dynamic adjustment of the screen. For example, the frame may be configured to adjust or otherwise change from a first arrangement or orientation to a second arrangement or orientation (e.g., angle, height, etc.).
[0038] In some examples, any system described herein may comprise a screen arranged to receive a continuous scan from a laser scanner. For example, a polyimide film screen may be in operable communication with a laser scanner to receive and be thermally activated by the laser scanner such that a target signature is generated and/or presented on the front of the screen. [0039] Any system described herein may comprise a laser scanner configured to transmit a laser beam onto a polyimide screen. For example, a single optical laser scanner may be configured for step-scanning a laser beam on, across and/or through the screen (e.g., polyimide film) to generate a target signature based on the scanning of the laser beam on the polyimide film. The laser scanner can be configured to continuously and dynamically scan a laser beam to contact the polyimide film based on the IRSG input image such that the scanning of the laser beam by the laser scanner can result in the presentation of a target signature on the polyimide film. In some examples, the continuous scanning of the laser beam by the laser scanner can provide a dynamic target signature with one or more changing or dynamic attributes in real-time based on the IRSG and/or programmed changes to an initial IRSG input into the system for target signature development.
[0040] In some examples, a laser scanner described herein may be arranged based on the location and orientation of the screen. For example, the laser scanner may be positioned at a predetermined angle relative to the face or front of the polyimide film. For example, the laser scanner can be positioned to direct and scan the laser across the polyimide film at a right angle between the path of the laser and the plane of the screen. In some examples, the angle between the path of the laser and the plane of the screen can be greater than or less than a right angle. In some examples, the angle between the path of the laser and the plane of the screen is dynamic and changes in real time.
[0041] Any system described herein may comprise one or more optical devices (e.g., optics) configured to direct and optically adjust the laser beam from the laser (e.g., the IR laser). The optics can be one or more lenses arranged to calibrate, adjust, or otherwise coordinate one or more attributes of the laser beam passing from the laser to the screen.
[0042] In some examples, the optics are operably connected and/or positioned between the laser scanner and one or more modulators (e.g., an AOM). The laser beam can transition through
the one or more modulators, then through the optics to the laser scanner for deployment or presentation on the screen (e.g., polyimide film).
[0043] In some examples, the laser beam from the one or more modulators may be considered a modulated laser beam as it contacts and passes through the optics. The laser beam may be considered a collimated laser beam after it has passed through the optics. For example, any system described herein may comprise optics configured to collimate the laser beam for processing, scanning, and/or transmission of the laser beam by the laser scanner to the screen. [0044] Any system described herein may comprise one or more modulators configured to modulate one or more attributes or characteristics of the laser beam. An example of a modulator may be an acousto-optic modulator (AOM), Bragg cell and/or an acousto-optic deflector (AOD). For example, an AOM may be configured to modulate diffraction, intensity, frequency, phase, polarization, etc. of a laser beam passing therethrough, and uses the acousto-optic effect to diffract and shift the frequency of light using sound waves (usually at radio frequency). In some examples, the AOM may be configured and/or selected based on the required speeds of modulation.
[0045] In some examples, any system described herein may comprise an AOM with a beam dump configured to adjust the intensity of the laser in real-time corresponding to the radiance intensity of the IRSG video pixel. In some examples, an AOM may be configured to provide modulation frequencies greater than 200 MHz.
[0046] In some examples, any system described herein may comprise one or more lasers. A laser, as described herein may be configured to generate and/or emit energy (e.g., a laser beam). In some examples, the laser may emit a beam based on parameters and/or instruction from an image processing system, as described herein. For example, the laser beam may be based on the image processing system determining the orientation, arrangement and/or configuration of the IRSG input.
[0047] In some examples, the laser may be an uncooled laser system. The laser may generate or otherwise emit a laser beam of a predetermined wattage (W). For example, a laser beam may be 1W, 5W, 10W, 20W, 30W, 40W, 50W, 60W, 70W, 80W, 90W, 100W or greater. In some examples, the laser beam may be any wattage between 0W and 200W. For example, the laser may be a single, uncooled 10.6 µm laser (80W or less).
[0048] Any system described herein may comprise one or more controllers. For example, a dynamic infrared scene generation and projection system may comprise a single controller configured to control the angular position of an optical laser-scanner and/or the intensity of the single laser corresponding to the radiance location and intensity from the IRSG video (e.g., input image).
[0049] In some examples, one or more controllers may comprise a hardware controller configured to receive device control data from the image processor system and control one or more system hardware components (e.g., laser scanner, modulator, laser, fans etc.). For example, a hardware controller may receive device control data configured to operate one or more fans in communication with the screen (e.g., polyimide film).
[0050] In some examples, the image acquisition system described herein may comprise a thermal imager configured to acquire an image (e.g., a target signature) presented on the polyimide film. For example, the thermal imager can be a LWIR FLIR camera system. In some examples, the thermal image grabber (e.g., camera system) can comprise and/or be configured with spectral band, resolution, frame rate and digital video interfaces based on the system FGU. [0051] In some examples, the image acquisition system (e.g., a FLIR camera system) may operate within a spectral band range. In some examples, the spectral band range may be between 4 µm and 30 µm. In some examples, the spectral band range may be between 8 µm and 14 µm. In some examples, the spectral band range may be any range sufficient for image acquisition and processing as described herein.
[0052] In some examples, the thermal imager (e.g., image acquisition system) may have a field of view of at least 5 degrees, at least 10 degrees, at least 15 degrees, at least 20 degrees, at least 25 degrees, at least 30 degrees, at least 35 degrees, at least 40 degrees, at least 45 degrees, at least 50 degrees or greater. In some examples, the thermal imager may have a field of view of any number of degrees between 1 and 100 or greater. In some examples, the thermal imager (e.g., the image acquisition system) may be configured for and/or capable of radiometry functionality.
[0053] In some examples, a thermal imager described herein may be capable of acquiring, observing and/or registering an image or thermal signature having a temperature between -1000°C and +1000°C. In some examples, a thermal imager described herein may be configured to acquire, observe and/or register an image or thermal signature having a temperature greater than 10°C, greater than 50°C, greater than 100°C, greater than 500°C and/or any temperature therebetween.
[0054] In some examples, any system described herein may comprise an uncooled microbolometer thermal imager. One or more filters may be associated with the thermal imager and configured to protect the thermal imager from laser reflectance. For example, a notch filter (e.g., a 10.6 µm notch filter) may be arranged to protect the thermal imager from laser reflectance back into the camera detector elements. In some examples, any system described herein may comprise a frame-grabber configured to continuously capture the camera images at the frame rate of the IRSG.
[0055] In some examples, the thermal imager may be arranged at an angle relative to the screen and/or the laser scanner. For example, the thermal imager can be positioned at an angle relative to the screen to acquire a thermal image presented on the screen by the scanning from the laser scanner. In some examples, any system described herein may comprise a thermal imager in operable communication with the FGU and the screen. One or more filters may be disposed between the thermal image grabber and the screen to protect the thermal imager.
[0056] Any system described herein may comprise an image processing system (e.g., an image processor). An image processing system, as described herein, may function as an initial image processor configured to process an IRSG image and/or video input. For example, a desired target signature may be developed (e.g., using IRSG physics-based development algorithms) and submitted to the image processor. The image processor may refine the image. For example, pixels may be adjusted or corrected based on intensity errors. In some examples, the IRSG input image may then be transmitted to a scanner driver. For example, frame data associated with the IRSG input image after being corrected may be transmitted to a scanner driver. The scanner driver may comprise an AOM driver configured to receive the adjusted pixel data associated with the corrected IRSG input image. After the image data has been processed by the AOM driver and/or the scanner driver, the image data is transmitted to one or more controllers (e.g., a hardware controller) to influence and instruct operation of one or more hardware components of the system.
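For illustration only, one minimal sketch of the handoff described in the preceding paragraph is shown below: corrected IRSG frame data is packaged as per-pixel intensities and scan positions for the scanner and AOM drivers before being passed to a hardware controller. The class and function names (FramePacket, pixel_correction, build_device_control) and the placeholder correction are assumptions, not the disclosed implementation.

```python
# Hypothetical sketch: packaging corrected IRSG frame data for the scanner/AOM
# drivers and the downstream hardware controller. All names are illustrative.
from dataclasses import dataclass
import numpy as np

@dataclass
class FramePacket:
    intensities: np.ndarray  # corrected per-pixel intensities for the AOM driver
    positions: np.ndarray    # per-pixel (x, y) scan positions for the scanner driver

def pixel_correction(irsg_frame: np.ndarray) -> np.ndarray:
    """Placeholder intensity correction (clip plus mild gamma) on the IRSG input frame."""
    return np.clip(irsg_frame, 0.0, 1.0) ** 0.9

def build_device_control(irsg_frame: np.ndarray) -> FramePacket:
    """Build the device-control data handed to the hardware controller."""
    corrected = pixel_correction(irsg_frame)
    rows, cols = corrected.shape
    ys, xs = np.mgrid[0:rows, 0:cols]  # pixel grid used as scan positions
    positions = np.stack([xs.ravel(), ys.ravel()], axis=1)
    return FramePacket(intensities=corrected.ravel(), positions=positions)
```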
[0057] Any system described herein may comprise an image processing system (e.g., image processor) operably connected or in communication with a thermal imager (e.g., image acquisition system). In some examples, the image processing system may be the same as the initial image processing system. In some examples, the image processing system may be separate from or secondary to the initial image processing system. An image processing system in communication with the thermal imager may comprise a frame grabber in communication with the thermal imager and configured to obtain one or more frames acquired from the thermal imager. The frames may be transmitted to an image adjuster (e.g., image pixel correction component) and continue through the image processor components in a similar manner as described above (e.g., via the scanner driver and/or the AOM driver to the hardware controller).
[0058] In some examples, any system described herein may operate based on signature-producing (e.g., IRSG signal-products) technologies comprising thermal-reflective technology, thermal-emissive technology, thermal-fluorescent technology, and/or a combination thereof. For example, any system described herein may produce a target signature in a thermal-emissive manner. A screen, of any system described herein, may radiate heat after being contacted with energy from a light source (e.g., a laser). The emission of heat may be regulated by modulation
of the energy from the laser. For example, an intensity, duration, frequency, motion, and/or other configuration of the laser may impact the thermal emission from the screen (e.g., polyimide film) and can be dynamically adjusted to simulate target signature attributes.
[0059] In some examples, the thermal-emissive technology can provide a cost-effective and/or expendable target signature for use with live-fire training. Live-fire training may include the use of live ammunition deployed against a screen, as described herein. The live ammunition may destroy the screen on contact. Therefore, it may be a substantial benefit to employ thermal-emissive configurations of the screen for cost-effective replacement.
[0060] In some examples, the thermal-emissive design utilizes a laser to scan projected imagery onto the target screen (e.g., polyimide film). The screen’s polyimide film can heat up as the laser contacts it, and the heated film emits in the LWIR band. In some examples, any system described herein can be calibrated such that the laser can be scanned across the target screen to project a recognizable scene (e.g., a T-72 tank) in the LWIR band.
[0061] In some examples, any system described herein is an uncooled system. In some examples, any system described herein is substantially uncooled. In some examples, any system described herein may comprise one or more cooling systems configured to optimize the target signature presented on the polyimide film. In some examples, any system described herein is devoid of any element cooling systems or components (e.g., cryogenic or chiller coolers, other than fan-based cooling). In some examples, any system described herein is uncooled and configured to present dynamically-changing, highly spatially-resolved, high dynamic range target thermal signatures in complex backgrounds with very low-cost, expendable screens for live-fire exercises or other activities where the screen might be routinely damaged or destroyed and need to be replaced at practical cost.
[0062] In some examples, any system described herein may comprise a screen cooling system. For example, one or more cooling devices (e.g., fans) may be positioned to cool the polyimide film (e.g., Kapton® film). For example, a cooling system described herein may be configured to direct air flow (e.g., cool air) over or at the polyimide film to sufficiently increase a convection coefficient of the film. In some examples, a cooling system as described herein may be configured to selectively lower an intensity of the laser system within a scene (e.g., target signature) to improve thermally dynamic scene display (e.g., the dissipation of the barrel flare after firing), and/or spatially dynamic scenes.
[0063] FIG. 1 shows a schematic diagram of an example arrangement for a dynamic infrared scene generation and projection system. The process starts with the development of an IR scene 100. An IR scene may include an image, multiple images, and/or a video (e.g., a sequence of compiled image frames). In some examples, the IR scene may be generated using a physics-based
algorithm relating to the subject matter of the IR scene. For example, a vehicle may have a known engine configuration, weapon system, and other thermally active components. Based on a database or library of information about the scene subject, any system described herein may be configured to generate an IR scene input having an initial configuration that is dynamically adjustable based on pre-determined simulated changes in one or more thermal characteristics of the IR scene subject. Once the IR scene has been generated, the image and/or video is submitted to the image processing system 200. The image processing system has one or more components configured to process, correct, modify, enhance, adjust, optimize, or otherwise manipulate the IR scene file for processing. In some examples, the image processing system has an image correction engine that may be configured to adjust or enhance the IR scene image (e.g., adjust or correct pixel values). The data from the corrected image is then transmitted to a scanner driver 215. The scanner driver accepts the error corrected frame data from the IR scene generated image. Additional drivers, such as the AOM modulator driver 220, may receive adjusted pixel data (e.g., adjusted pixel intensity data). In some examples, the drivers may be configured to compile and package data for device control based on the processed image data through the drivers. Here, the device control data is then sent to the hardware controller 225. The hardware controller 225 can be configured to control one or more hardware components based on the device control data output from the image processing system 200. As shown in FIG. 1, the hardware controller is operably connected to the laser scanning system, and in particular to the AOM 310, laser scanner 315, and the IR laser source 320. The hardware controller 225 can control the AOM 310 based on pixel intensity power modulation control, or other relevant pixel data configured to determine the necessary modulation of the laser passing through the AOM. The hardware controller 225 may control power to the IR laser source 320 and pixel position control to the laser scanner 315. The hardware controller 225 is also shown operably connected to a convection fan in some examples of any system described herein.
[0064] The laser scanning system example of FIG. 1 illustrates the IR laser source emitting the laser beam towards the modulator, then through the optics 325, then through the laser scanner 315. The laser beam can be modulated by the AOM and collimated by the optics before passing through the laser scanner to the polyimide film screen 400. The laser scanner contacts the polyimide screen 400 with the scanner-adjusted collimated laser beam to generate and present the IR scene. In some examples, the IR scene is dynamically adjustable based on the characteristics of the contact between the laser beam and the polyimide screen. For example, longer duration or increased intensity may result in increased temperature at a position on the polyimide screen and can be used to depict an increase in thermal activity of the simulated IR scene generated thereon. Finally, an example of the feedback loop system is shown in FIG. 1 whereby a thermal imager
500 acquires a screen-emitted thermal image. The thermal imager may be coupled to one or more filters 510 (e.g., a notch filter) that can be configured to protect the thermal imager from the screen-emitted thermal image. The thermal imager is operably connected to the frame grabber as a secondary pathway for introduction of the acquired thermal image for validation against the initial IR scene. The acquired thermal image may result in additional processing and/or adjustments of the image data that can result in adjustment of laser scanning system components by the hardware controller based on the differences.
[0065] FIG. 2 illustrates a similar configuration to FIG. 1 with variations of the controller configuration. Here, in place of the hardware controller, the laser scanner system may be controlled by more than one controller. For example, in FIG. 2, the AOM 310 and the laser scanner 315 are controlled by an AOM/scanner controller, while the IR laser source is controlled by a power overdrive 230 component in the image processing system 200. For example, the power overdrive component may receive data from the pixel correction and initiate the IR laser source 320 upon receipt of the image data. Additionally, the image processing system 200 example in FIG. 2 shows a frame cache component 235 that can be configured to compare against previous frame data and store current frame data. The feedback loop in FIG. 2 illustrates an example of an image registration mechanism 240 that can be configured to register images acquired by the thermal imager prior to subjecting the images to the image processing system for adjustment. Both FIG. 1 and FIG. 2 also illustrate a beam dump component 325 in communication with the AOM and configured to receive waste laser energy after modulation by the AOM.
[0066] FIG. 3 illustrates another arrangement example of a dynamic infrared scene generation and projection system, as described herein. Of note, the laser scanner 315 is illustrated in an offset position relative to the thermal imager 500. Accordingly, the laser scanner 315 may adjust scanning of the image to compensate for the offset position to provide a functional and appropriate orientation of the IR scene to be generated on the polyimide screen 400. Also shown in FIG. 3 is the IR scene generator 100 operably connected directly to the AOM 310. The IR scene generator 100 is also coupled to a processor 110 that receives data from the IR scene generator and the frame grabber 115. The optics and beam dump 325 are positioned near one another, allowing for a parallel direction of the laser from the AOM 310.
[0067] Similar to FIG. 3, FIG. 4 illustrates an example arrangement of a dynamic infrared scene generation and projection system where the IR scene generator 100 is connected in a loop with the processor 110, whereby pixel correction data is fed back to the IR scene generator for modification. The IR scene generator is also operably connected directly to the IR laser source 320, the AOM 310, and the controller 226. The frame grabber 115 can be configured to transmit acquired thermal image data to the processor and then to the IR scene generator for modification of an initial output from the IR scene generator.
[0068] In some examples, the IRSG is based on physics or known attributes, activities, and/or operation of the intended target signature to be generated. For example, a T-72 tank may be the subject of an IRSG and operation of the tank may be input into one or more algorithms associated with the IRSG image generation. Firing sequences, movement, engine operation, users, and any other element that may impact thermal attributes of an intended target signature for IRSG may be considered in the generation of the IRSG image. In some examples, the IRSG is generated using one or more algorithms executable within a computer-based system. For example, parameters of IRSG input target signatures may be provided in a library associated with the target signature to be generated. Based on the parameters, any system described herein may scan an image of a dynamic target signature on the screen based on the thermal characteristics and real-time dynamic changes associated with the simulated target signature function. In some examples, any system described herein may comprise a dynamic physics-based IR scene generator (IRSG) that feeds resolved, high-dynamic-range (for example, >=60°C @ 12 bits +/- 50 bits), high-spatial-resolution 2D/3D radiance video imagery of a target with dynamically-changing thermal signatures in response to the scenario target active states (gun barrel firing, frictional heating of the treads, bogey wheels and drive wheel, engine exhaust, etc.) into the system at less than 30Hz, 30Hz, or greater than 30Hz.
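For illustration only, a state-dependent activation sequence of the kind described in this document (engine and exhaust at the start state, wheels after roughly 10 seconds, then the gun barrel) could be driven by a simple timetable in the scene-generation software. All region names, times and intensities below are assumptions, not disclosed values.

```python
# Illustrative timetable of thermally active regions for a simulated target.
from dataclasses import dataclass

@dataclass
class RegionEvent:
    start_s: float    # simulation time at which the region begins heating
    region: str       # named thermal region of the target signature
    intensity: float  # relative commanded laser intensity, 0.0-1.0

SCHEDULE = [
    RegionEvent(0.0, "engine", 0.9),
    RegionEvent(0.0, "exhaust", 0.8),
    RegionEvent(10.0, "wheels", 0.6),
    RegionEvent(20.0, "barrel", 1.0),
]

def active_regions(t_s: float) -> dict:
    """Return the regions (and intensities) that should be thermally active at time t_s."""
    return {e.region: e.intensity for e in SCHEDULE if t_s >= e.start_s}

# For example, active_regions(12.0) -> {"engine": 0.9, "exhaust": 0.8, "wheels": 0.6}
```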
[0069] FIGS. 5A and 5B illustrate examples of dynamic changes in the thermal attributes of a target signature. In this example, a tank is shown having various thermally active regions. For example, in FIG. 5A, the rear of the tank is simulated to be bright and would be associated with a thermally active engine. This may provide a strategically targetable region of the tank that can be more susceptible to attack. In FIG. 5B, the barrel of the tank can be seen bright to represent a simulated period during or after the firing sequence where a projectile travels through the barrel and the combustion of the munition combined with the friction have increased the thermal characteristics of the barrel. This capability can provide substantial intelligence on target identification and profiling. Similarly, FIGS. 6A to 6C illustrate another example of a firing sequence that can be generated by any of the systems described herein. The barrel is again simulated as having increased thermal characteristics in FIG. 6C compared to FIGS. 6A and 6B. The engine of the target signature tank can be seen as thermally active in all three images. The dynamic capabilities of any system described herein provide for a specific selectively adjustable dynamic simulation of operation for a realistic target signature that can be appreciated by weapon systems (e.g., heat-seeking weapons systems).
[0070] The range of dynamic simulation for an IR-generated scene by any system described herein is expansive. FIGS. 7A to 7D illustrate another example of a dynamic transition between periods of time for an IR scene generated by any system described herein. In particular, the target signature is again a tank, now shown having fired a projectile. Visually, the heat generated by a tank firing may be difficult to observe. However, any system described herein can be configured and capable of illustrating a firing sequence and residual thermal signatures (e.g., residual heat from an expelled projectile).
[0071] In some examples, any system described herein operates through a method comprising the function and interaction between one or more hardware components and one or more algorithms. In some embodiments, the step of initialization/image registration comprises the IRSG sending a test pattern image to an image processor. A FLIR sends the projected test pattern image to the image processor. In some embodiments the FLIR is uncooled. The image processor computes and stores an image transformation matrix which auto-registers the IRSG input image with its corresponding (off-axis) projected image, for unambiguous determination of a one-to-one pixel mapping between the two image spaces.
[0072] In some examples, a method of generating a dynamic infrared scene can include generation and projection of an associated target signature. The method may comprise first, generating a scene. Generating a scene may comprise determining the desired target signature to be generated and presented. The scene may be generated using one or more physics-based algorithms. Then, using the generated scene to develop corresponding scanner instructions. For example, the IRSG that was first developed may be submitted to the image processing system for data generation to instruct the operation of system hardware in the presentation of the target signature on the polyimide film. Then, the laser can be initiated and the laser scanning process can begin.
[0073] In some examples, a method of generating a dynamic infrared scene can include capturing lased (e.g., laser-scanned) imagery with a thermal imager, as described herein. The captured imagery can then be displayed and evaluated against the initial scene input. For example, the initial scene (e.g., target) may be compared against the captured image for appropriate homography between the input image and the captured image.
[0074] In some examples, the steps of capturing the lased image may be continuously repeated to compare changes in the dynamic presentation of the lased image on the screen. The comparison may support or require modification of one or more hardware components based on the image input from the thermal imager into the image processing system. For example, if a captured image is rejected, the image processing system may adjust instructions to the hardware controller to correct any deficiencies and/or support homography between the IRSG and the captured image. For example, the instructions from the image refinement/correction loop can be repeated until the program is halted.
[0075] In some examples, a method of generating a dynamic infrared scene can include image registration. For example, a method of generating a dynamic infrared scene can include generating a scene in OSV, then developing scene data based on the generated scene, where the data may correspond to scanner instructions. Then, activating the laser and initiating the scanning procedure. The method may comprise a feedback loop including capturing lased imagery with the thermal imager. Then, applying the registration transformation to the captured imagery. Then, displaying the registered imagery.
[0076] In some examples, adding registration capability to this run loop may require use of a homography (transformation matrix) that can be calculated in advance. In JRM’s existing image registration applications, the run loop for finding this homography can comprise generating an input image. Then, developing corresponding scanner instructions based on the generated input image. Then, activating the laser and initiating the scanning procedure. Then, capturing lased imagery with the thermal imager. Then, saving the captured imagery. Then, validating and/or registering the image by comparing the input image against the reference image. After a valid registration has been found, the homography from that registration can be stored and applied to any other image captured by the thermal imager from the same position.
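A minimal sketch of how such a keypoint-based registration run loop could be realized in software is shown below, assuming OpenCV and NumPy (the disclosure does not name a library); the function names, ORB detector, and ratio-test threshold are illustrative choices, not the patented implementation. Once found, the homography H can be cached and re-applied to later captures from the same position.

```python
# Illustrative registration sketch (assumptions: OpenCV, grayscale inputs).
import cv2
import numpy as np

def find_homography(input_img, reference_img, ratio=0.75):
    """Detect/describe keypoints, keep strong descriptor matches, and estimate the homography."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_in, des_in = orb.detectAndCompute(input_img, None)
    kp_ref, des_ref = orb.detectAndCompute(reference_img, None)

    # Compare descriptors; the ratio test stands in for eliminating keypoints
    # below a defined threshold of their descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des_in, des_ref, k=2)
    good = [p[0] for p in matches if len(p) == 2 and p[0].distance < ratio * p[1].distance]

    # Calculate the homography using corresponding points.
    src = np.float32([kp_in[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

def register(capture_img, H, reference_shape):
    """Apply a stored homography to align a captured frame with the reference image."""
    h, w = reference_shape[:2]
    return cv2.warpPerspective(capture_img, H, (w, h))
```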
[0077] In some examples, any system described herein may comprise operational methods associated with a continuous error feedback loop comprising: the IRSG generating dynamic 2D radiance imagery corresponding to the physics-based predicted at-object signature of the 3D target and corresponding background, for the user-specified scenario weather, atmosphere, target dynamic state (e.g., engine state, motion state/frictional heating, etc.), time of day, time of year, etc. These 2D radiance images are fed to the image processor for passing along to the AOM, which modulates the beam intensity passing through the optics to the scanner, on a pixel-by-pixel basis. The uncooled FLIR images received by the IR thermal imager are captured by the frame grabber, converted to radiance images of the same format as those coming from the IRSG, and sent to the image processor for comparison. The image processor automatically performs image registration using the transformation stored during initialization, and then compares the two images (original from IRSG and projected from frame grabber), generating a 2D radiance difference map. This difference map is then applied to the next frame of synthetic imagery from the IRSG as a correction, and the result passed to the AOM for beam modulation.
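One hedged sketch of the per-frame difference-map correction described in the preceding paragraph is shown below, assuming registered, same-shape radiance arrays; the function and parameter names (and the optional damping gain) are assumptions for illustration only.

```python
# Illustrative sketch: apply a 2D radiance difference map as a correction to
# the next synthetic frame (not the patented algorithm).
import numpy as np

def corrected_next_frame(requested, projected_registered, next_requested,
                         gain=1.0, max_radiance=1.0):
    """requested/projected_registered are the current frame pair; next_requested is the next IRSG frame."""
    # Difference between what was requested and what was actually projected.
    difference_map = requested.astype(np.float64) - projected_registered.astype(np.float64)
    # Fold the (optionally damped) correction into the next synthetic frame.
    corrected = next_requested.astype(np.float64) + gain * difference_map
    return np.clip(corrected, 0.0, max_radiance)  # keep commanded radiance in range
```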
[0078] In some examples, prior to the pixel-by-pixel raster scanning of each image frame, the laser scanner’s scan angle and the AOM’s modulation fraction can be coordinated within the scanner driver (i.e. software). This may ensure that the beam intensity corresponding to a
particular scan angle (screen position) spatially correlates with the proper input image pixel. Accordingly, the hardware controller can coordinate raster scanning.
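By way of illustration only, the pixel-to-scan-angle coordination could be sketched as below; the field-of-view angles and the row-major scan order are assumptions made for the example:

```python
# Each pixel of the input frame is mapped to a (x_angle, y_angle, modulation
# fraction) triple so that the beam intensity at a given screen position
# spatially correlates with the proper input image pixel.
import numpy as np

def raster_schedule(frame: np.ndarray, fov_x_deg=20.0, fov_y_deg=15.0):
    rows, cols = frame.shape
    x_angles = np.linspace(-fov_x_deg / 2, fov_x_deg / 2, cols)
    y_angles = np.linspace(-fov_y_deg / 2, fov_y_deg / 2, rows)
    for r in range(rows):                  # row-major raster scan
        for c in range(cols):
            # modulation fraction assumed normalized to [0, 1]
            yield x_angles[c], y_angles[r], float(frame[r, c])
```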
[0079] In some examples, a feedback loop for any system described herein may comprise driving per-pixel corrections to the IRSP image within a specified threshold, accommodating image responses corresponding to dynamic state changes of the simulated vehicle.
[0080] In some examples, environmental (e.g., field-based) influences may cause anomalies and complications in the IRSG target signature. Accordingly, any system described herein may comprise adjustment and refinement of any of the components and/or associated hardware configurations to account for such environmental influences. For example, a laser scanner and modulator (e.g., AOM) can be controlled iteratively to adjust the effective laser power and scanning pattern for desired production of the spatial radiometric thermal emission sent to it by a scene generation unit (e.g., a Chimaera Scene Generation Unit). Any adjustment in response to environmental influences on target signature production can be sufficient to address, and calibrate for, the negative impact of environmental influences on the target signature produced on the polyimide film by any system described herein.
[0081] In some examples, a closed-loop system as described herein may comprise a scene generation unit (SGU) (e.g., a Chimaera SGU), a laser scanning modulation system (LSMS), a LWIR FLIR camera, and a Frame Grabber Unit (FGU) (e.g., an image acquisition system such as a Chimaera FGU). In some examples, the FGU may comprise and/or operate with Radiometric Adjustment Feedback Software (RAFS).
[0082] In some examples, a method of generating a dynamic infrared scene can include generation and projection of an associated target signature using a closed-loop feedback system with steps comprising the SGU sending the desired synthetic target/background radiometric image to the LSMS and the FGU RAFS. Then, the LSMS accepting the SGU image and controlling the laser power and scan pattern to form the thermal emissive pattern. Then, the LWIR FLIR constantly sensing the thermal signature of the screen (e.g., at 30 Hz) and relaying it to a frame grabber card in the FGU. Then, the FGU capturing the FLIR images (e.g., at 30 Hz). Then, the FGU RAFS differencing the SGU-requested image and the FLIR/FGU-captured image to form a delta matrix (a per-pixel gain and level array). Then, the FGU sending the delta matrix to the SGU to apply to the desired target thermal emission image.
[0083] In some examples, the closed feedback loop method can then repeat at each frame update (e.g., at each 30Hz frame update), thereby iteratively maintaining the correct radiometric signature, even in the presence of environmental influences on the screen.
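By way of illustration only, the delta-matrix step could be sketched as follows; the exact gain/level formulation used by RAFS is not specified here, so both an additive and a multiplicative form are shown as illustrative choices:

```python
# Per-pixel delta matrix between the SGU-requested image and the FLIR/FGU-
# captured image, applied to the next requested frame at each frame update.
import numpy as np

def delta_matrix(requested: np.ndarray, captured: np.ndarray, eps: float = 1e-6):
    """Difference the requested and captured radiance images into per-pixel
    level (additive) and gain (multiplicative) correction arrays."""
    level = requested - captured
    gain = requested / np.maximum(captured, eps)
    return gain, level

def apply_delta(next_frame: np.ndarray, gain: np.ndarray, level: np.ndarray,
                use_gain: bool = False) -> np.ndarray:
    """Apply one of the two correction forms to the next requested frame."""
    corrected = next_frame * gain if use_gain else next_frame + level
    return np.clip(corrected, 0.0, 1.0)
```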
[0084] In some examples, a closed loop system described herein adjusts one or more system components and/or associated component configurations. For example, an image is generated and run through the closed loop system for comparison against the input image; then, an algorithm considers differences between the generated image and the input image. Based on the differences, either the elements may be physically adjusted in position relative to the Kapton® film, and/or the laser adjusts the image generation according to the results of the closed loop system comparison to adjust the image generation and correct for defects.
[0085] In some examples, the closed loop system may include identifying one or more regions or elements (e.g., key points) within an image (e.g., an input image/output image) to relate or align the output image and the input image. For example, to correctly determine the transformation that is used to register an image, there may be a need for one or more corresponding key points from each image (e.g., at least 4 key points). More key points may result in a better transformation.
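By way of illustration only, a key-point-based registration of this kind (anticipating the registration algorithm of paragraph [0088] below) might be sketched as follows; ORB features and the descriptor-distance threshold are illustrative choices, not requirements of the system:

```python
# Detect and describe key points, drop weak matches by descriptor distance,
# and compute a homography from the surviving correspondences (at least four
# corresponding key points are required).
import cv2
import numpy as np

def register(input_img, reference_img, max_descriptor_distance=40):
    orb = cv2.ORB_create(nfeatures=2000)
    kp_in, des_in = orb.detectAndCompute(input_img, None)
    kp_ref, des_ref = orb.detectAndCompute(reference_img, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = [m for m in matcher.match(des_in, des_ref)
               if m.distance < max_descriptor_distance]   # eliminate weak key points
    if len(matches) < 4:
        raise ValueError("need at least 4 corresponding key points")

    src = np.float32([kp_in[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    h, w = reference_img.shape[:2]
    return cv2.warpPerspective(input_img, H, (w, h)), H    # aligned image + homography
```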
[0086] For example, FIG. 8 illustrates an example of adjustment of an image generated by any system described herein. For example, the image on the left may be the image presented on the polyimide screen after initial IR scene generation and laser scanning. A thermal imager may be configured to acquire the thermal image (e.g., the image on the left) and input associated data into the image processing system for adjustment. According to the data from the processed acquired thermal image, the image processing system can be configured to adjust the controller, which in turn can adjust one or more components of the laser scanner system to correct and optimize the image, as illustrated by the corrected image on the right.
[0087] In some examples, the key point selection is optimized such that corresponding keypoints are correct mathematical best-matches based on their descriptors. In some examples, the key points are fiducial markers, which are classified as reference points in an imaging system that can be used for measurement (QR codes and barcodes are common examples).
[0088] In some examples, a method of generating and projecting a dynamic infrared scene can comprise a registration algorithm comprising detecting and describing one or more keypoints of one or more images. Then, eliminating one or more keypoints below a defined threshold of their descriptors. Then, comparing one or more remaining keypoints from each image. Then, calculating the homography using corresponding points. Then, applying the homography to the input image to align it with the reference image.

[0089] In some examples, any system described herein may perform a method including the dynamic heating or transition of thermal changes for a target signature simulation presented on the screen. For example, the OSV scene can be configured so that the only thermal regions activated at the start state are the tank’s engine and exhaust. Then, over a period of time, the scene can be updated in real-time to heat the wheels and then the artillery. For example, for the first 10 seconds, the only active thermal regions are the engine and exhaust; then the wheels are activated and allowed to heat up for another 10 seconds. Then, the artillery begins to heat up. This dynamic heating can provide a simulated function and strategic targeting opportunities for weapons systems being trained.
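By way of illustration only, the timed activation of thermal regions described above could be sketched as follows; the region masks, frame size, and 10-second linear heat-up ramp are placeholders rather than values from this disclosure:

```python
# Engine and exhaust are active from the start, the wheels begin heating at
# 10 s, and the artillery after that; each region ramps up over `ramp` seconds.
import numpy as np

def scene_frame(t_seconds: float, masks: dict, shape=(480, 640), ramp=10.0):
    """Return a radiance frame (0..1) for simulation time t_seconds.

    masks maps region name -> boolean array of `shape`.
    """
    start_times = {"engine": 0.0, "exhaust": 0.0, "wheels": 10.0, "artillery": 20.0}
    frame = np.zeros(shape, dtype=np.float64)
    for region, t0 in start_times.items():
        level = np.clip((t_seconds - t0) / ramp, 0.0, 1.0)   # linear heat-up after activation
        frame[masks[region]] = np.maximum(frame[masks[region]], level)
    return frame
```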
[0090] In some examples, any system described herein may comprise a housing configured to house one or more of any of the system components. For example, a housing may be configured to house the laser, laser scanner, one or more modulators, image acquisition system, image processing system components, optics, etc.
[0091] In some examples, any system described herein is configured to generate and present a dynamic IR scene (e.g., target signature) on a screen for use with live firing training or operations. Any system described herein may comprise a hit detection system associated with live firing weapons training. For example, a target signature may be presented on a screen (e.g., polyimide film) based on contact from a laser beam that has been modulated after a laser device has received instructions from an image processing system based on a generated or desired IR scene input. The target signature may include one or more dynamic elements (e.g., a heating barrel, engine and/or tires) that may be considered in the weapons training for strategic targeting. A weapons system may be configured to acquire the target signature in a manner similar to target acquisition of a real (e.g., non-simulated/non-generated target) and deploy ammunition to the screen in a similar manner to deploying ammunitions on a real target. The screen may be destroyed as a result of the live fire training and replacement of the screen can allow the system to re-present another target signature.
[0092] In some examples, any system described herein is configured to generate and present a dynamic IR scene (e.g., target signature) on a screen for use with non-live firing training or operations. Any system described herein may comprise a hit detection system associated with non-live firing weapons training. Non-live firing training or operations may comprise the use of weapons targeting systems and/or other target acquisition systems and methods that can provide for acquisition of a target and simulated deployment of munitions. For example, a target signature may be presented on a screen (e.g., polyimide film) based on contact from a laser beam that has been modulated after a laser device has received instructions from an image processing system based on a generated or desired IR scene input. The target signature may include one or more dynamic elements (e.g., a heating barrel, engine and/or tires) that may be considered in the weapons training for strategic targeting. A weapons system may be configured to acquire the target signature in a manner similar to target acquisition of a real (e.g., non-simulated/non- generated target) and deploy an acquisition element (e.g., a laser guidance system, radar, sonar, etc.) to the screen in a similar manner to acquiring a real target. The screen may respond to the weapon targeting system to indicate a hit or positive acquisition of the target. For example, one
or more radio frequency (RF) sensors, identification systems (e.g., an identification friend-or-foe (IFF) system), electromagnetic (EM) sensors, IR sensors, and/or other sensory systems may be in operable communication with any system described herein. The one or more sensory systems may be configured to register or otherwise identify activity of the target acquisition system relative to the target signature presented on the polyimide film. An alert or other form of indication relating to the positive acquisition of the target signature by the target acquisition system may be triggered. Alternatively, where the target acquisition system fails to acquire the target signature presented on the polyimide film, negative acquisition feedback may be provided by a system described herein.
[0093] FIG. 9 illustrates an example of a dynamic infrared scene generation and projection system having the hit detection system 800 with a plurality of sensors 801, 802, 803 as described herein in operable communication with the polyimide screen. The sensors can be configured to detect a hit or other engagement of the target signature on the screen. In some examples, the thermal imager can depict the hit, and the system may be configured to simulate damage based on the specific characteristics of the detected hit.
[0094] In some examples, any system described herein may include a hit detection system configured to detect and indicate or otherwise provide real-time information relating to the efficacy and/or damage delivered by the weapon system used on the target signature. For example, the image acquisition system may be configured to acquire an image and/or data related to damage and/or the impact of the weapons system deployed on the target signature (e.g., presented on the polyimide film). The image capture system may transmit acquired/observed results of the weapon system impact on the target signature to the image processing system, which can be configured to provide simulated details of an impact by the weapon system.
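By way of illustration only, the handling of a sensor-reported hit or acquisition event could be sketched as follows; all class and field names are hypothetical and not part of this disclosure:

```python
# A sensor in communication with the screen reports an acquisition/impact
# event; the system logs it and returns an indication that can drive a
# simulated damage effect or a negative-acquisition notice.
from dataclasses import dataclass

@dataclass
class HitEvent:
    sensor_id: str       # e.g., "RF-801", "EM-802", "IR-803"
    x: float             # screen coordinates of the detected hit
    y: float
    positive: bool       # True = target acquired/hit, False = missed acquisition

def handle_hit(event: HitEvent, damage_log: list) -> str:
    damage_log.append(event)
    if event.positive:
        return f"HIT at ({event.x:.1f}, {event.y:.1f}) reported by {event.sensor_id}"
    return f"NEGATIVE acquisition reported by {event.sensor_id}"
```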
[0095] It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein and may be used to achieve the benefits described herein.
[0096] The process parameters and sequence of steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various example methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
[0097] Any of the methods (including user interfaces) described herein may be implemented as software, hardware or firmware, and may be described as a non-transitory computer-readable storage medium storing a set of instructions capable of being executed by a processor (e.g., computer, tablet, smartphone, etc.), that when executed by the processor causes the processor to control or perform any of the steps, including but not limited to: displaying, communicating with the user, analyzing, modifying parameters (including timing, frequency, intensity, etc.), determining, alerting, or the like. For example, any of the methods described herein may be performed, at least in part, by an apparatus including one or more processors having a memory storing a non-transitory computer-readable storage medium storing a set of instructions for the process(es) of the method.
[0098] While various embodiments have been described and/or illustrated herein in the context of fully functional computing systems, one or more of these example embodiments may be distributed as a program product in a variety of forms, regardless of the particular type of computer-readable media used to actually carry out the distribution. The embodiments disclosed herein may also be implemented using software modules that perform certain tasks. These software modules may include script, batch, or other executable files that may be stored on a computer-readable storage medium or in a computing system. In some embodiments, these software modules may configure a computing system to perform one or more of the example embodiments disclosed herein.
[0099] As described herein, the computing devices and systems described and/or illustrated herein broadly represent any type or form of computing device or system capable of executing computer-readable instructions, such as those contained within the modules described herein. In their most basic configuration, these computing device(s) may each comprise at least one memory device and at least one physical processor.
[0100] The term “memory” or “memory device,” as used herein, generally represents any type or form of volatile or non-volatile storage device or medium capable of storing data and/or computer-readable instructions. In one example, a memory device may store, load, and/or maintain one or more of the modules described herein. Examples of memory devices comprise, without limitation, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, Hard Disk Drives (HDDs), Solid-State Drives (SSDs), optical disk drives, caches, variations or combinations of one or more of the same, or any other suitable storage memory. [0101] In addition, the term “processor” or “physical processor,” as used herein, generally refers to any type or form of hardware-implemented processing unit capable of interpreting and/or executing computer-readable instructions. In one example, a physical processor may access and/or modify one or more modules stored in the above-described memory device.
Examples of physical processors comprise, without limitation, microprocessors, microcontrollers, Central Processing Units (CPUs), Field-Programmable Gate Arrays (FPGAs) that implement softcore processors, Application-Specific Integrated Circuits (ASICs), portions of one or more of the same, variations or combinations of one or more of the same, or any other suitable physical processor.
[0102] Although illustrated as separate elements, the method steps described and/or illustrated herein may represent portions of a single application. In addition, in some embodiments one or more of these steps may represent or correspond to one or more software applications or programs that, when executed by a computing device, may cause the computing device to perform one or more tasks, such as the method step.
[0103] In addition, one or more of the devices described herein may transform data, physical devices, and/or representations of physical devices from one form to another. Additionally or alternatively, one or more of the modules recited herein may transform a processor, volatile memory, non-volatile memory, and/or any other portion of a physical computing device from one form of computing device to another form of computing device by executing on the computing device, storing data on the computing device, and/or otherwise interacting with the computing device.
[0104] The term “computer-readable medium,” as used herein, generally refers to any form of device, carrier, or medium capable of storing or carrying computer-readable instructions. Examples of computer-readable media comprise, without limitation, transmission-type media, such as carrier waves, and non-transitory-type media, such as magnetic-storage media (e.g., hard disk drives, tape drives, and floppy disks), optical-storage media (e.g., Compact Disks (CDs), Digital Video Disks (DVDs), and BLU-RAY disks), electronic- storage media (e.g., solid-state drives and flash media), and other distribution systems.
[0105] Any system described herein may comprise one or more power sources or power systems operably connected to one or more or any components requiring power for operation. Controllers, modulators, regulators, may be included as necessary or beneficial for the functional operation of the system and/or any of the system components.
[0106] A person of ordinary skill in the art will recognize that any process or method disclosed herein can be modified in many ways. The process parameters and sequence of the steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed.
[0107] The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or comprise additional steps in addition to those disclosed. Further, a step of any method as disclosed herein can be combined with any one or more steps of any other method as disclosed herein.
[0108] The processor as described herein can be configured to perform one or more steps of any method disclosed herein. Alternatively or in combination, the processor can be configured to combine one or more steps of one or more methods as disclosed herein.
[0109] When a feature or element is herein referred to as being "on" another feature or element, it can be directly on the other feature or element or intervening features and/or elements may also be present. In contrast, when a feature or element is referred to as being "directly on" another feature or element, there are no intervening features or elements present. It will also be understood that, when a feature or element is referred to as being "connected", "attached" or "coupled" to another feature or element, it can be directly connected, attached or coupled to the other feature or element or intervening features or elements may be present. In contrast, when a feature or element is referred to as being "directly connected", "directly attached" or "directly coupled" to another feature or element, there are no intervening features or elements present. Although described or shown with respect to one embodiment, the features and elements so described or shown can apply to other embodiments. It will also be appreciated by those of skill in the art that references to a structure or feature that is disposed "adjacent" another feature may have portions that overlap or underlie the adjacent feature.
[0110] Terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. For example, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items and may be abbreviated as "/" .
[0111] Spatially relative terms, such as "under", "below", "lower", "over", "upper" and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is inverted, elements described as "under" or "beneath" other elements or features would
then be oriented "over" the other elements or features. Thus, the exemplary term "under" can encompass both an orientation of over and under. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. Similarly, the terms "upwardly", "downwardly", "vertical", "horizontal" and the like are used herein for the purpose of explanation only unless specifically indicated otherwise. [0112] Although the terms “first” and “second” may be used herein to describe various features/elements (including steps), these features/elements should not be limited by these terms, unless the context indicates otherwise. These terms may be used to distinguish one feature/element from another feature/element. Thus, a first feature/element discussed below could be termed a second feature/element, and similarly, a second feature/element discussed below could be termed a first feature/element without departing from the teachings of the present invention.
[0113] Throughout this specification and the claims which follow, unless the context requires otherwise, the word “comprise”, and variations such as “comprises” and “comprising”, mean that various components can be co-jointly employed in the methods and articles (e.g., compositions and apparatuses including devices and methods). For example, the term “comprising” will be understood to imply the inclusion of any stated elements or steps but not the exclusion of any other elements or steps.
[0114] In general, any of the apparatuses and methods described herein should be understood to be inclusive, but all or a sub-set of the components and/or steps may alternatively be exclusive, and may be expressed as “consisting of” or alternatively “consisting essentially of” the various components, steps, sub-components or sub-steps.
[0115] As used herein in the specification and claims, including as used in the examples and unless otherwise expressly specified, all numbers may be read as if prefaced by the word "about" or “approximately,” even if the term does not expressly appear. The phrase “about” or “approximately” may be used when describing magnitude and/or position to indicate that the value and/or position described is within a reasonable expected range of values and/or positions. For example, a numeric value may have a value that is +/- 0.1% of the stated value (or range of values), +/- 1% of the stated value (or range of values), +/- 2% of the stated value (or range of values), +/- 5% of the stated value (or range of values), +/- 10% of the stated value (or range of values), etc. Any numerical values given herein should also be understood to include about or approximately that value, unless the context indicates otherwise. For example, if the value "10" is disclosed, then "about 10" is also disclosed. Any numerical range recited herein is intended to include all sub-ranges subsumed therein. It is also understood that when a value is disclosed that "less than or equal to" the value, "greater than or equal to the value" and possible ranges between
values are also disclosed, as appropriately understood by the skilled artisan. For example, if the value "X" is disclosed, then "less than or equal to X" as well as "greater than or equal to X" (e.g., where X is a numerical value) is also disclosed. It is also understood that throughout the application, data is provided in a number of different formats, and that this data represents endpoints and starting points, and ranges for any combination of the data points. For example, if a particular data point “10” and a particular data point “15” are disclosed, it is understood that greater than, greater than or equal to, less than, less than or equal to, and equal to 10 and 15 are considered disclosed, as well as between 10 and 15. It is also understood that each unit between two particular units is also disclosed. For example, if 10 and 15 are disclosed, then 11, 12, 13, and 14 are also disclosed.
[0116] Although various illustrative embodiments are described above, any of a number of changes may be made to various embodiments without departing from the scope of the invention as described by the claims. For example, the order in which various described method steps are performed may often be changed in alternative embodiments, and in other alternative embodiments one or more method steps may be skipped altogether. Optional features of various device and system embodiments may be included in some embodiments and not in others. Therefore, the foregoing description is provided primarily for exemplary purposes and should not be interpreted to limit the scope of the invention as it is set forth in the claims.
[0117] The examples and illustrations included herein show, by way of illustration and not of limitation, specific embodiments in which the subject matter may be practiced. As mentioned, other embodiments may be utilized and derived there from, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Such embodiments of the inventive subject matter may be referred to herein individually or collectively by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept, if more than one is, in fact, disclosed. Thus, although specific embodiments have been illustrated and described herein, any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
Claims
1. A dynamic infrared (IR) scene generation system comprising: a polyimide screen configured to react to contact by an IR laser; a laser scanning system configured to contact the polyimide screen with a laser beam, the laser scanning system comprising a laser scanner, one or more modulators, and an IR laser source; and an image processing system operably connected to a controller configured to control the laser scanning system based on data from the image processing system, wherein the laser scanning system is configured to contact the polyimide screen with the IR laser based on an IR scene input.
2. The system of claim 1, further comprising a closed feedback system configured to validate a thermal target signature acquired by a thermal imager, wherein the thermal target signature is a dynamic scene based on the polyimide screen reaction to contact by the IR laser.
3. The system of claim 1, wherein the image processing system is configured to correct one or more pixel intensities of an IR scene input, wherein the image processing system comprises one or more drivers configured to generate data based on the IR scene input, wherein the controller is configured to control the laser scanning system based on the data.
4. The system of claim 1, further comprising a thermal imager operably connected to the image processing system, the thermal imager configured to acquire one or more images from the polyimide screen, wherein the one or more images are configured to be adjusted by the controller based on adjustment data from the image processing system, wherein the adjustment data is associated with the one or more acquired images.
5. The system of claim 3, wherein the laser scanning system is configured to generate one or more thermally dynamic images on the polyimide screen, wherein each of the one or more thermally dynamic images is targetable by a weapon system.
6. The system of claim 1, wherein the polyimide screen is a multilayer screen comprising a polyimide film contactable by the IR laser, and a fabric backing configured to dissipate heat from the laser transmitted through the polyimide film.
7. The system of claim 2, further comprising a hit detection system in operable communication with the thermal imager, the hit detection system comprising one or more radio frequency sensors, one or more electromagnetic sensors, or one or more identification sensors.
8. The system of claim 1, wherein one or more physics-based algorithms are executed in a computing system, wherein the one or more physics-based algorithms are configured to generate the IR scene input, wherein the generated IR scene input is configured to have one or more thermally dynamic attributes based on the one or more physics-based algorithms.
9. The system of claim 1, wherein the laser scanning system is uncooled.
10. A method of dynamic infrared (IR) scene generation with an uncooled collimated laser, the method comprising: generating an IR scene input; an image processing system generating data based on the IR scene input; initiating a laser scanning system comprising a laser scanner, wherein a controller directs operations of the laser scanning system based on the generated data; contacting a polyimide screen with a laser beam directed by the laser scanner; and generating a target signature on the polyimide screen, wherein the laser scanner scans the laser beam on the polyimide screen based on the IR scene input.
11. The method of claim 10, further comprising: acquiring a thermal image emitted by the polyimide screen with an IR thermal imager; validating the acquired thermal image against the IR scene input, wherein if the acquired thermal image fails validation, the method further comprises processing the acquired thermal image with the image processing system and adjusting one or more system components based on adjustment data generated by the image processing system, the adjustment data based on the processed thermal image.
12. The method of claim 10, wherein the IR scene input comprises a predetermined sequence of one or more simulated dynamic adjustments to one or more regions of the IR scene input, wherein the one or more simulated dynamic adjustments are based on one or more changes in simulated thermal activity of the generated IR scene input.
13. The method of claim 12, further comprising repeating the method in a sequence based on one or more predetermined adjustments in the IR scene input.
14. The method of claim 10, further comprising modulating the laser beam with an acousto-optic modulator, and passing the modulated laser beam through one or more optics before contacting a polyimide screen with a laser beam directed by the laser scanner.
15. The method of claim 11, wherein the IR scene input is a video comprising sequential frames, wherein the thermal imager acquires the sequential frames from the video target signature.
16. The method of claim 15, wherein the acquired sequential frames are validated after all of the sequential frames are acquired, wherein the method further comprises the thermal imager storing sequential frames acquired from the IR scene input video target signature.
17. The method of claim 11, further comprising detecting a hit with one or more sensors in operable communication with the polyimide screen, wherein a hit is detected when a weapon system acquires the target signature.
18. The method of claim 11, further comprising the controller adjusting operation of the laser scanning system, wherein the target signature is validated against the IR scene input, wherein the operation of the laser scanning system is adjusted until the target signature passes validation.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163225929P | 2021-07-26 | 2021-07-26 | |
PCT/US2022/038395 WO2023048818A2 (en) | 2021-07-26 | 2022-07-26 | Dynamic infrared scene generation and projection system and methods |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4378156A2 true EP4378156A2 (en) | 2024-06-05 |
Family
ID=85721072
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP22873361.4A Pending EP4378156A2 (en) | 2021-07-26 | 2022-07-26 | Dynamic infrared scene generation and projection system and methods |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240212542A1 (en) |
EP (1) | EP4378156A2 (en) |
KR (1) | KR20240035599A (en) |
WO (1) | WO2023048818A2 (en) |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6611546B1 (en) * | 2001-08-15 | 2003-08-26 | Blueleaf, Inc. | Optical transmitter comprising a stepwise tunable laser |
US6765220B2 (en) * | 2001-01-10 | 2004-07-20 | Lockheed Martin Corporation | Infrared scene generator using fluorescent conversion material |
US6635892B2 (en) * | 2002-01-24 | 2003-10-21 | Pei Electronics, Inc. | Compact integrated infrared scene projector |
ATE487953T1 (en) * | 2003-06-04 | 2010-11-15 | Elop Electrooptics Ind Ltd | FIBER LASER-ASSISTED JAMming SYSTEM |
US8402819B2 (en) * | 2007-05-15 | 2013-03-26 | Anasys Instruments, Inc. | High frequency deflection measurement of IR absorption |
CA2788852C (en) * | 2009-02-02 | 2019-01-15 | R. J. Dwayne Miller | Soft ablative desorption method and system |
US8164543B2 (en) * | 2009-05-18 | 2012-04-24 | GM Global Technology Operations LLC | Night vision on full windshield head-up display |
US8899886B1 (en) * | 2009-11-25 | 2014-12-02 | The Boeing Company | Laser signature vision system |
US20120194550A1 (en) * | 2010-02-28 | 2012-08-02 | Osterhout Group, Inc. | Sensor-based command and control of external devices with feedback from the external device to the ar glasses |
US9223134B2 (en) * | 2010-02-28 | 2015-12-29 | Microsoft Technology Licensing, Llc | Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses |
US20130314303A1 (en) * | 2010-02-28 | 2013-11-28 | Osterhout Group, Inc. | Ar glasses with user action control of and between internal and external applications with feedback |
US20120212484A1 (en) * | 2010-02-28 | 2012-08-23 | Osterhout Group, Inc. | System and method for display content placement using distance and location information |
US9128281B2 (en) * | 2010-09-14 | 2015-09-08 | Microsoft Technology Licensing, Llc | Eyepiece with uniformly illuminated reflective display |
US9229227B2 (en) * | 2010-02-28 | 2016-01-05 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a light transmissive wedge shaped illumination system |
US9182596B2 (en) * | 2010-02-28 | 2015-11-10 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light |
US9052513B2 (en) * | 2011-05-06 | 2015-06-09 | Lexmark International, Inc. | Laser scan unit for an imaging device |
US8552381B2 (en) * | 2011-07-08 | 2013-10-08 | The Johns Hopkins University | Agile IR scene projector |
US8927935B1 (en) * | 2012-05-21 | 2015-01-06 | The Boeing Company | All electro optical based method for deconfliction of multiple, co-located directed energy, high energy laser platforms on multiple, near simultaneous threat targets in the same battle space |
US9671566B2 (en) * | 2012-06-11 | 2017-06-06 | Magic Leap, Inc. | Planar waveguide apparatus with diffraction element(s) and system employing same |
US10215553B2 (en) * | 2015-03-12 | 2019-02-26 | Apple Inc. | Thin PSD for laser-scanning systems |
CN108292093B (en) * | 2015-08-31 | 2023-01-31 | 利赛奥谱特科技责任有限公司 | Apparatus and method for using a scanned light beam for film or surface modification |
US10524664B2 (en) * | 2016-04-29 | 2020-01-07 | Northwestern University | Devices, methods, and systems of functional optical coherence tomography |
US10802120B1 (en) * | 2019-08-20 | 2020-10-13 | Luminar Technologies, Inc. | Coherent pulsed lidar system |
US20220043127A1 (en) * | 2020-08-10 | 2022-02-10 | Luminar, Llc | Lidar system with input optical element |
WO2022240554A2 (en) * | 2021-04-20 | 2022-11-17 | Luminar, Llc | Coherent pulsed lidar system with two-sided detector |
- 2022
  - 2022-07-26 EP EP22873361.4A patent/EP4378156A2/en active Pending
  - 2022-07-26 WO PCT/US2022/038395 patent/WO2023048818A2/en active Application Filing
  - 2022-07-26 KR KR1020247005940A patent/KR20240035599A/en unknown
- 2023
  - 2023-08-09 US US18/447,106 patent/US20240212542A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20240212542A1 (en) | 2024-06-27 |
WO2023048818A2 (en) | 2023-03-30 |
KR20240035599A (en) | 2024-03-15 |
WO2023048818A3 (en) | 2023-06-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8082832B1 (en) | Missile system using two-color missile-signature simulation using mid-infrared test source semiconductor lasers | |
US6984039B2 (en) | Laser projector having silhouette blanking for objects in the output light path | |
US20200068110A1 (en) | Image processing methods and apparatuses, computer readable storage media, and electronic devices | |
EP3292686B1 (en) | Thermal compensation in image projection | |
US7606640B2 (en) | Cooling apparatus and projection type display device | |
US8188434B2 (en) | Systems and methods for thermal spectral generation, projection and correlation | |
JP5340725B2 (en) | Method and apparatus for presenting images | |
US20190360779A1 (en) | Encoded signal detection and display | |
US7106435B2 (en) | Hyperspectral scene generator and method of use | |
US10310086B2 (en) | Method and device for local stabilization of a radiation spot on a remote target object | |
US20170133823A1 (en) | Laser system with reduced apparent speckle | |
US9898947B2 (en) | Image display device and control method of image display device | |
US8426820B2 (en) | Image sensor system | |
US20150070511A1 (en) | System and moving modulated target with unmodulated position references for characterization of imaging sensors | |
US20240212542A1 (en) | Dynamic infrared scene generation and projection system and methods | |
Dupuis et al. | Two-band DMD-based infrared scene simulator | |
LaVeigne et al. | A two-color 1024x1024 dynamic infrared scene projection system | |
KR102546720B1 (en) | Optical Testbed System for Jamming of Image Tracking Threats | |
US10288990B1 (en) | Calculation of beam speed and position in a laser projection system using a graphics processing unit | |
KR20240008112A (en) | Apparatus and method for simulating moving target for IR camera using laser | |
KR101916732B1 (en) | Infrared electric warfare signal generator | |
US20240364500A1 (en) | Systems and methods for authenticating video feeds | |
Williams | Dynamic infrared projection analysis: an overview | |
Dupuis et al. | Contrast analysis for DMD-based IR scene projector | |
CN106997141A (en) | The Light path correction method and optical projection system of a kind of optical projection system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
 | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
 | 17P | Request for examination filed | Effective date: 20240208 |
 | AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
 | DAV | Request for validation of the european patent (deleted) | |
 | RAX | Requested extension states of the european patent have changed | Extension state: ME; Payment date: 20240208 |