EP4321920A1 - Navigation in GPS-denied environments through enhanced nightvision with transparent optical device - Google Patents

Navigation in GPS-denied environments through enhanced nightvision with transparent optical device

Info

Publication number
EP4321920A1
Authority
EP
European Patent Office
Prior art keywords
optical device
transparent optical
light
nightvision
location
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP23188634.2A
Other languages
German (de)
English (en)
Inventor
Eric Ramsay
James A. Lebeau
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
L3Harris Technologies Inc
Original Assignee
L3Harris Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by L3Harris Technologies Inc filed Critical L3Harris Technologies Inc
Publication of EP4321920A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3833 Creation or updating of map data characterised by the source of data
    • G01C21/3848 Data obtained from both position sensors and additional sensors
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/12 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices with means for image conversion or intensification
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/02 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means
    • G01C21/025 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means with the use of startrackers
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/04 Adaptation of rangefinders for combination with telescopes or binoculars
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00 Optical objectives specially designed for the purposes specified below
    • G02B13/16 Optical objectives specially designed for the purposes specified below for use in conjunction with image converters or intensifiers, or for use with projectors, e.g. objectives for projection TV
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/30 Transforming light or analogous information into electric information
    • H04N5/33 Transforming infrared radiation

Definitions

  • Nightvision systems allow a user to see in low-light environments without external human visible illumination. This allows for covert vision in a low-light environment to prevent flooding the environment with human visible light.
  • Some nightvision systems function by receiving low levels of light reflected off of, or emitted from, objects and providing that light to an image intensifier (sometimes referred to as I²).
  • the image intensifier has a photocathode. When photons strike the photocathode, electrons are emitted into a vacuum tube, and directed towards a microchannel plate to amplify the electrons.
  • the amplified electrons strike a phosphor screen.
  • the phosphor screen is typically chosen such that it emits human visible light when the amplified electrons strike the phosphor screen.
  • the phosphor screen light emission is coupled, typically through an inverting fiber-optic, to an eyepiece where the user can directly view the illuminated phosphor screen, thus allowing the user to see the objects.
  • Modern nightvision systems include ancillary functionality.
  • some nightvision systems may include location hardware, such as global positioning satellite (GPS) radios that allow for determining the location of the nightvision system. This information can be displayed to a user to help the user know their geographical location as they view a local nightvision scene.
  • certain environments may have difficulties using GPS signals. This can be for one or more of a number of different reasons. For example, GPS signals may be blocked by environmental conditions, such as canyon walls, foliage, moisture, being indoors, etc. Further, cosmic radiation may disrupt GPS signals. In some situations, GPS signals may be jammed by adversaries to prevent users from using GPS. GPS signals may be spoofed by adversaries to cause GPS radios to estimate incorrect locations. Thus, there are a multitude of conditions and factors that can prevent proper functioning of GPS systems.
  • the nightvision system includes an underlying device that is configured to provide output light in a first spectrum from input light received at the underlying device.
  • a transparent optical device is optically coupled in an overlapping fashion to the underlying device.
  • the transparent optical device includes an active area of a semiconductor chip.
  • The active area includes active elements configured to detect light from the underlying device, and transparent regions formed in the active area which are transparent to the light in the first spectrum to allow light in the first spectrum to pass through from the underlying device to a user.
  • An image processor is configured to process images produced using light detected by the active elements to determine at least one of location, heading, elevation, or speed of the nightvision system or location of objects detected by the active elements.
  • Embodiments illustrated herein are directed to using a transparent optical device in conjunction with nightvision equipment, such as an image intensifier (or other nightvision system).
  • Embodiments are directed to a nightvision system which includes an underlying nightvision device and a transparent optical device.
  • the transparent optical device allows light produced by the underlying nightvision device to pass through the transparent optical device and be transmitted to a user.
  • the transparent optical device is an active device which is capable of detecting light produced by the underlying nightvision device.
  • Object and feature recognition can be performed by an image processor to identify various objects, edges, features and the like.
  • the nightvision system may further include functionality for matching the detected objects, edges, features and the like with known environmental characteristics.
  • embodiments can determine a navigational heading, the speed at which the user is moving, coordinates, and the like. For example, embodiments may be able to identify terrain characteristics, celestial bodies and/or constellations, vehicles operating in the environment where characteristics of the vehicles such as position and speed are known, etc. and use this identified information to establish position of the nightvision device, speed at which the nightvision device is moving, three-dimensional angles of orientation of the nightvision device, etc.
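  • For illustration only (this sketch is not taken from the present disclosure), one simplified way a heading could be derived from a single identified celestial body is shown below: the body's true azimuth is assumed to already be known from a catalog or ephemeris for the approximate time and position, and the field of view, image width, and all numeric values are hypothetical.

        import math

        def heading_from_star(star_azimuth_deg, star_pixel_x, image_width_px, horizontal_fov_deg):
            """Estimate camera heading from one identified star.

            star_azimuth_deg: true azimuth of the identified star (assumed known
                from a catalog or ephemeris for the approximate time and position).
            star_pixel_x: horizontal pixel coordinate of the star in the detected image.
            image_width_px, horizontal_fov_deg: detector geometry (assumed values).
            """
            # Angular offset of the star from the optical axis (small-angle, flat model).
            degrees_per_pixel = horizontal_fov_deg / image_width_px
            offset_deg = (star_pixel_x - image_width_px / 2.0) * degrees_per_pixel
            # The camera boresight points at the star's azimuth minus the measured offset.
            return (star_azimuth_deg - offset_deg) % 360.0

        # Hypothetical numbers: a star known to sit at azimuth 120.0 degrees is detected
        # 80 pixels right of center in an 800-pixel-wide frame with a 40 degree FOV.
        print(heading_from_star(120.0, 480.0, 800.0, 40.0))  # ~116 degrees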
  • Figure 1 illustrates the PVS-14 nightvision system 100.
  • the nightvision system 100 includes a housing 124.
  • the housing 124 houses an image intensifier, a transparent optical device (see e.g., Figures 3A , 3B and 3C and transparent optical device 118), and various other components.
  • the nightvision system 100 further includes an objective 102 which receives weak light reflected and/or generated in an environment.
  • the objective 102 includes optics such as lenses, waveguides, and/or other optical components for receiving and transmitting light to an image intensifier, discussed in more detail below.
  • the nightvision system 100 further includes an eyepiece 122.
  • The eyepiece 122 includes optics for focusing images created by the nightvision system 100, including images created by an image intensifier and images created by a transparent optical device, into the eye of the user.
  • Figure 2 illustrates an image 200 including a heads-up display displayed on a nightvision image output from an intensifier tube.
  • Some embodiments described herein are directed to implementing a heads-up display implemented by adding image overlay capabilities with a nightvision system, where the image overlay capabilities are added by using a transparent optical device including a display.
  • the heads-up display may display to the user, in or around the field-of-view of an environment, various pieces of information to create an augmented reality (AR) environment.
  • Such information may include, for example, a navigational heading, the speed at which the user is moving, coordinates, communication messages (such as email, SMS, etc.), time of day or other timing information, vital signs for the user such as heart rate or respiration rate, indicators indicating whether an object being viewed by the nightvision system is friendly or adversarial, battery charge level for the nightvision system or other devices, weather conditions, contact information, audio information (such as volume, playlist information, artist, etc.), etc.
  • the transparent optical device 118 includes photodetectors for detecting intensified light to determine brightness in a scene, the locations of various objects in the field of view, or other information. This information can be used for location functionality as described above and in more detail below, target indicators, or other images output by the transparent optical device 118.
  • the navigational heading, the speed at which the user is moving, coordinates and the like are derived from GPS data available at the nightvision system 100, such as from GPS radios included in the nightvision system 100.
  • GPS data can be derived from geolocation information using cellular tower triangulation.
  • The nightvision system 100 may be operated in an environment where GPS or cellular signals are partially or wholly denied. This prevents radios at the nightvision system 100 from using GPS and/or cellular data to geolocate the nightvision system 100, thus preventing that data from being used to display the navigational heading, the speed at which the user is moving, coordinates, and the like to the user as shown in Figure 2.
  • embodiments illustrated herein implement a transparent optical device that can be used to detect features of objects in a nightvision scene.
  • An image processor can process detected features to identify specific, particular objects. For example, specific landmarks can be identified. Alternatively, celestial bodies can be identified (such as the moon, particular stars, or combinations thereof, such as constellations). This information can be used by the image processor to determine navigational heading, speed at which the user is moving, coordinates, and the like (an illustrative feature-matching sketch follows below). Additional details are now illustrated.
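  • As an illustrative sketch of the landmark-identification step (assuming OpenCV is available; the file names and the match-count threshold are hypothetical, and this is not asserted to be the matching method used by the described embodiments):

        import cv2

        # Placeholder file names: in practice these would be a detector readout and a
        # stored reference image of a landmark with a known location.
        frame = cv2.imread("intensified_frame.png", cv2.IMREAD_GRAYSCALE)
        reference = cv2.imread("known_landmark.png", cv2.IMREAD_GRAYSCALE)

        orb = cv2.ORB_create(nfeatures=1000)
        kp_frame, des_frame = orb.detectAndCompute(frame, None)
        kp_ref, des_ref = orb.detectAndCompute(reference, None)

        if des_frame is not None and des_ref is not None:
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = sorted(matcher.match(des_frame, des_ref), key=lambda m: m.distance)
            # A (hypothetical) minimum number of good matches before declaring a match.
            if len(matches) > 50:
                print("Landmark recognized; its known location can seed the position estimate.")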
  • Figures 3A , 3B and 3C illustrate a block diagram, a cutaway and a block diagram respectively of embodiments of the invention.
  • a nightvision system typically includes an objective 102 to focus input light 101 into an underlying device 104.
  • the underlying device 104 may be one or more of a number of different types of nightvision devices, such as IR CCD cameras, CMOS cameras, image intensifiers, and the like.
  • the underlying device 104 is an image intensifier.
  • Input light 101 may be, for example, from ambient sources, such as light from heavenly bodies such as stars, the moon, or even faint light from the setting sun.
  • ambient sources could include light from buildings, automobiles, or other faint sources of light that cause reflection of light from an object being viewed in a nightvision environment into the objective.
  • a second source of light may be light being emitted from an external source towards an object, reflected off the object, and into the objective.
  • the source may be an infrared source that is not viewable in the viewable spectrum for human observers.
  • a third source of light may be light emitted by an object itself. For example, this may be related to visible light, infrared heat energy emitted by the object and directed into the objective, etc. Nonetheless, the nightvision system is able to convert the light emitted from the source into a viewable image for the user.
  • the objective directs input light 101 into the underlying device 104.
  • the underlying device 104 may include functionality for amplifying light received from the objective to create a sufficiently strong image that can be viewed by the user. This may be accomplished using various technologies.
  • a photocathode 106, a microchannel plate 110, and a phosphor screen 112 are used.
  • the photocathode 106 may be configured to generate photo electrons in response to incoming photons. Electrons from the photocathode 106 are emitted into the microchannel plate 110. Electrons are multiplied in the microchannel plate 110.
  • Electrons are emitted from the microchannel plate 110 to a phosphor screen 112 which glows as a result of electrons striking the phosphor screen 112. This creates a monochrome image from the input light 101.
  • a fiber-optic 113 carries this image as intensified light to the eyepiece (such as eyepiece 122 illustrated in Figure 1 ) of a nightvision system where it can be output to the user.
  • This fiber-optic 113 can be twisted 180 degrees to undo the inversion caused by the system objective to allow for convenient direct viewing of the phosphor screen 112.
  • FIGS 3A , 3B and 3C further illustrate the transparent optical device 118.
  • the transparent optical device 118 allows intensified light to pass through the transparent optical device 118, but also, in some embodiments, generates its own light, from LEDs or other light emitters, to transmit the generated light to a user.
  • Creating a transparent optical device may be accomplished, for example, using the teachings of United States Patent Application No. 16/868,306, filed on May 6, 2020 , titled "Backside Etch Process For Transparent Silicon Oxide Technology", which is incorporated herein by reference, in its entirety.
  • the transparent optical device 118 is typically implemented behind the fiber-optic 113 (i.e., closer to the eyepiece than the fiber-optic 113), but in other embodiments may be implemented in front of the fiber-optic 113.
  • the use of a fiber-optic within nightvision systems inverts and translates the focal plane allowing the transparent optical device 118 overlay to be presented on either side without impacting the ability for the eyepiece to focus on the image.
  • certain manufacturing or mechanical constraints may incentivize placement of the transparent optical device 118 behind the fiber-optic including the difficulty in inserting electronics within the vacuum package of the underlying device 104.
  • the transparent optical device 118 may include functionality for displaying information to a user. Such information may include graphical content, including text, images, and the like.
  • the transparent optical device 118 may display in shaded monochrome. Alternatively, or additionally, the transparent optical device 118 may display in multiple colors. Alternatively, or additionally, the transparent optical device 118 may display in 1-bit monochrome.
  • the transparent optical device 118 may display a navigational heading, the speed at which the user is moving, coordinates, etc.
  • transparent optical device 118 may be implemented in an optical path.
  • some transparent optical devices may emit light, while other transparent optical devices absorb and detect light.
  • Embodiments illustrated herein include at least one transparent optical device capable of absorbing and detecting light for use in detecting features of objects to identify navigational heading, speed at which a user is moving, coordinates, etc. when GPS or other geo-locating signals are denied or diminished.
  • the transparent optical device 118 outputs display light 128 which can be sent to the eyepiece (such as the eyepiece 122 illustrated in Figure 1 ).
  • the intensified light 130 is also provided to the eyepiece.
  • an image such as that illustrated in Figure 2 is presented to the user in the nightvision system.
  • the transparent optical device 118 is composed of active silicon elements, typically in a grid arrangement to implement various pixels.
  • the active elements cause the device to have certain optical performance capabilities. Such capabilities may be one or more of abilities to output color output, output monochrome output, detect light, etc.
  • The transparent optical device 118 is a digital detector, and potentially a display, having a certain pixel density for the detector and potentially for the display.
  • each pixel is implemented on a single active island, although in other embodiments, an island may have multiple pixels, or even only a single sub-pixel element.
  • Each pixel may have one or more transistors controlling one or more OLED emitters (or other light emitting devices). Pixels may additionally or alternatively include light detectors.
  • This detected light can be used to characterize an image intensifier (or other) image.
  • the detected light can be used for recording scene events.
  • the detected light can be used for improving placement of elements displayed on the heads-up display shown in Figure 2 .
  • For example, edge detection techniques may be applied to the detected light, and images generated and displayed by the transparent optical device 118 can be keyed off of these detected edges.
  • Similarly, navigational heading, speed at which the user is moving, coordinates, etc., can be keyed off of these detected edges, as explained in more detail below.
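  • A minimal sketch of one possible edge-detection pass, reducing a detected frame to a per-column skyline profile that a later stage could compare against terrain data (OpenCV assumed available; the synthetic frame and thresholds are hypothetical, not taken from the disclosure):

        import cv2
        import numpy as np

        # Synthetic stand-in frame: dark sky above an undulating horizon, brighter terrain below.
        height, width = 480, 640
        rows = np.arange(height).reshape(-1, 1)
        horizon_truth = (240 + 40 * np.sin(np.linspace(0.0, 3.0, width))).astype(np.int32)
        image = np.where(rows < horizon_truth, 30, 170).astype(np.uint8)

        edges = cv2.Canny(image, threshold1=50, threshold2=150)  # thresholds are hypothetical

        # Reduce the edge map to a per-column skyline height, a compact terrain signature.
        skyline = np.full(width, height, dtype=np.int32)
        for col in range(width):
            edge_rows = np.flatnonzero(edges[:, col])
            if edge_rows.size:
                skyline[col] = edge_rows[0]  # topmost edge approximates the horizon

        # 'skyline' can now be compared against profiles derived from topographic data.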
  • the transparent optical device 118 is representative of a stacked device formed in a semiconductor chip that overlaps an underlying device 104.
  • the underlying device 104 is an image intensifier.
  • the transparent optical device 118 is transparent to light in a first spectrum (according to some predefined transmission efficiency), which in this case is the visible spectrum of light output by the phosphor screen 112. That is, the transparent optical device 118 is not fully transparent due to the blocking of the active devices, but transparency referred to herein refers to at least partial transparency according to some transmission efficiency.
  • overlapping as used herein means that elements are in the same optical path. This can be accomplished by having elements be in coaxial alignment when the optical path is straight. Alternatively, this can be accomplished by using various waveguides or other elements to align optical paths thus not requiring physical coaxial alignment.
  • a photodetector implemented in the transparent optical device absorbs a portion of the intensified light converting it to electrical signals.
  • The photodetector can be a two-dimensional array of light detectors, such as photodiodes, that generates charge, current, or any other form of digital data level proportional to the intensity of the intensified light as a function of position.
  • the photodetector may generate a two-dimensional array of electrical charge that represents the intensified image.
  • This two-dimensional array of electrical charge can be periodically read from the photodetector (e.g., the detected signal can be read from the photodetector in a fashion similar to a charge-coupled device (CCD) camera).
  • the two-dimensional array of electrical signals from the photodetector is processed (such as by the image processor 119) and/or used locally, e.g., within the transparent optical device 118, at the readout or pixel levels, to modulate in real time the amplitude of the display light 128 emitted from the transparent optical device 118.
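  • A hedged, numpy-only illustration of how a read-out frame might modulate overlay amplitude; the modulation law and all numbers are hypothetical and are not taken from the disclosure.

        import numpy as np

        def modulate_display(detected_frame, overlay, max_code=255):
            """Scale overlay brightness by local scene brightness (illustration only).

            detected_frame: 2-D array read out from the photodetector (arbitrary units).
            overlay: 2-D array of desired display content, same shape.
            Returns the amplitude-modulated display frame.
            """
            # Normalize the detected frame to 0..1 and dim the overlay where the scene
            # is already bright, so symbology stays readable without washing out detail.
            scene = detected_frame.astype(np.float64)
            scene /= max(scene.max(), 1.0)
            gain = 1.0 - 0.5 * scene          # hypothetical modulation law
            return np.clip(overlay * gain, 0, max_code).astype(np.uint8)

        # Example with synthetic 4x4 data standing in for an 800x800 readout.
        detected = np.random.randint(0, 256, (4, 4))
        overlay = np.full((4, 4), 200)
        print(modulate_display(detected, overlay))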
  • the transparent regions shown in the preceding figures can be created in a number of particular ways.
  • the transparent regions can be created by using the processes described in United States Patent Application serial number 16/686,306 titled "Backside Etch Process For Transparent Silicon Oxide Technology", which is incorporated herein by reference in its entirety. Briefly, that application describes a process for creating transparent regions in otherwise opaque portions of semiconductor materials.
  • Figure 5 illustrates a transparent optical device 518 including active silicon islands (which may be native silicon islands) such as active silicon island 550.
  • active silicon islands include transistors such as transistor 552 which control detectors and potentially OLED emitters in an OLED stack 554.
  • each of the active silicon islands represents a pixel or sub-pixel of the transparent optical device 518.
  • an image can be detected by active elements in the active region.
  • an image can be created and output to a user, such as by outputting display light such as the display light 128 illustrated in Figure 3C .
  • intensified light is transmitted through the transparent optical device 118 to the eyepiece of the nightvision system, and then to the user. Note, however, that the intensified light is transmitted to the user through the transparent optical device 118, meaning that the intensified light will be affected by characteristics of the transparent optical device 118.
  • light 128 represents the light output by the light emitting portions of the transparent optical device 118.
  • Light 130 represents intensified light from the phosphor screen 112 transmitted through the transparent optical device 118. That is, light 130 may include or may be, in its entirety, light in the first spectrum.
  • light 526 represents a portion of light that is transmitted through transparent regions, illustrated by transparent region 556, of the transparent optical device 518
  • light 532 represents a portion of light that is blocked by active portions of the transparent optical device 518.
  • Light 532 may be detected by the transparent optical device 518 (or 118) and used as described herein.
  • the light detected by the transparent optical device can be used for generating: a navigational heading, the speed at which the user is moving, coordinates, etc.
  • the transparent region 556 is backfilled with a transparent back fill material.
  • transmission of light through the transparent optical device is nonetheless increased by removing portions of silicon that are not needed for implementing active electrical components or for supporting metal traces.
  • Anode size for the sub pixels is 8 µm × 5.1 µm.
  • Pixel area is 10.1 µm × 12.4 µm.
  • Pixel pitch is 22.5 µm × 22.5 µm. In one example, this provides a resolution of 800 × 800.
  • transparency of the transparent optical device is about 33%. In contrast, transparency is about 61% if the non-active silicon islands are removed such as in the structure illustrated in Figure 5 .
  • transparency of a transparent optical device is increased by more than 80% by removing silicon and/or oxide trenches.
  • A transparent optical device with a 36 µm pitch can obtain a transparency of 81%, while a transparent optical device of 22.5 µm pitch can obtain a transparency of 67%, and a transparent optical device having a 17.5 µm pitch will have about 55% transparency when non-active silicon islands are removed from the transparent optical device in each of the illustrated examples.
  • Some embodiments may be able to create a transparent optical device with at least a 36 µm pitch with at least a transparency of 75%, or a transparent optical device of at least a 22.5 µm pitch with at least a transparency of 60%, or a transparent optical device having at least a 17.5 µm pitch with at least a 50% transparency when silicon is removed between active silicon islands.
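  • Using the example dimensions above, a quick upper-bound estimate of transparency can be computed as the fraction of each pitch cell not occupied by the active pixel area; this ignores metal traces, anodes, and oxide trenches, so it overestimates the measured figures quoted above.

        def transparency_upper_bound(pixel_area_um2, pitch_um):
            """Fraction of each pitch cell not covered by the active pixel area.

            Ignores metal traces, anodes, and oxide trenches, so it only gives an
            optimistic upper bound rather than the measured values quoted above.
            """
            cell_area = pitch_um * pitch_um
            return 1.0 - pixel_area_um2 / cell_area

        # Using the example numbers above: 10.1 um x 12.4 um pixels on a 22.5 um pitch.
        print(round(transparency_upper_bound(10.1 * 12.4, 22.5), 3))  # ~0.753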
  • Pitch and transparency values may be specific to a given semiconductor manufacturing process (also known as the technology or process node) and will of course vary as the node changes.
  • the technology node will dictate the area of required active silicon for the display CMOS based on the transistor size.
  • As the node minimum feature size decreases, whether through alternate foundries or improvements in technology, the same need for maximizing transparency applies. Indeed, the benefit of removing non-active silicon islands improves as the ratio of inactive to active silicon increases with smaller transistors.
  • As illustrated in Figure 3C, light 101 is input into the objective 102, where it is transmitted to an underlying device 104, in this case an image intensifier.
  • Figures 3A , 3B and 3C further illustrate the transparent optical device 118.
  • the transparent optical device 118 includes light detectors that are able to detect light produced by the underlying device 104 to produce a feature map 132.
  • the feature map 132 indicates features identified in different portions of a scene 133. In particular, analysis of the feature map 132 can show features detected in a scene 133 detected by the underlying device 104.
  • the feature map 132 shows a constellation feature 134 representing the constellation 136 in the scene 133.
  • the feature map 132 further shows a terrain feature 138 generated from edge detection processes which detect the edge of the terrain 114 in the scene 133. Note that the feature map 132 can maintain relative position of the various objects in the scene 133. This can be used to determine navigational heading, speed at which the user is moving, coordinates, etc. as explained in more detail below.
  • Figures 3A , 3B and 3C further illustrate an image processor 119.
  • the image processor 119 is able to create and/or process feature maps from features detected by the detectors in the transparent optical device 118.
  • the image processor 119 can analyze and identify points, edges, shapes, etc. in the feature map.
  • geolocation functionality may be aided by using features such as object edge detection, object recognition, identifying regions of interest, etc. in a feature map. This may be accomplished in some embodiments by the image processor including certain artificial intelligence and/or machine learning functionality.
  • embodiments include a nightvision system 100.
  • the nightvision system 100 includes an underlying device 104.
  • an image intensifier is an example of an underlying device.
  • other underlying devices may be used in addition or alternatively.
  • Some embodiments may include an infrared CCD-based or other low-light-level digital sensor system.
  • the underlying device 104 is configured to provide output light in a first spectrum from input light received at the underlying device 104.
  • the first spectrum may be light produced by the phosphor screen 112.
  • the nightvision system 100 includes a transparent optical device, such as transparent optical device 118, optically coupled in an overlapping fashion to the underlying device 104.
  • the transparent optical device 118 is configured to transmit light in the first spectrum from the underlying device 104 through the transparent optical device 118.
  • the transparent optical device 118 includes: an active area of a semiconductor chip.
  • The active area includes a plurality of active elements configured to detect light from the underlying device 104.
  • light detectors integrated into a photodetector may be used to detect light from the underlying device 104.
  • the transparent optical device 118 further includes a plurality of transparent regions formed in the active area which are transparent to the light in the first spectrum to allow light in the first spectrum to pass through from the underlying device 104 to a user.
  • the transparent regions are configured in size and shape to cause the transparent optical device 118 to have a particular transmission efficiency for light in the first spectrum.
  • the nightvision system further includes an image processor, such as image processor 119 coupled to the transparent optical device 118.
  • the image processor is configured to process feature maps, such as feature map 132, produced using light detected by the plurality of active elements. This processing can be used to generate a navigational heading, the speed at which the user is moving, coordinates, etc.
  • the image processor 119 receives input from the transparent optical device 118.
  • the image processor 119 is further shown as receiving input from an integrated inertial measurement unit (IMU) sensor 120.
  • the IMU sensor 120 may be integrated into the nightvision system 100.
  • Typical IMU sensors include accelerometers, gyroscopes, and potentially magnetometers that can be used to indicate force, angular rate, orientation, etc. This information can be used in combination with features detected from information captured by the transparent optical device 118 to generate a navigational heading, the speed at which the user is moving, coordinates, etc. (an illustrative fusion sketch follows below).
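  • A minimal sketch of one way IMU-propagated and vision-derived headings could be blended (a complementary filter); the trust factor and all numbers are hypothetical, and this is not asserted to be the disclosed fusion method.

        def fuse_heading(gyro_rate_dps, dt_s, vision_heading_deg, prev_heading_deg, alpha=0.98):
            """Complementary-filter blend of gyro-propagated and vision-derived heading.

            gyro_rate_dps: yaw rate from the IMU gyroscope (degrees per second).
            vision_heading_deg: heading estimated from detected features (e.g., a star
                or landmark), available only intermittently; None when unavailable.
            alpha: trust placed in short-term gyro propagation (hypothetical value).
            """
            propagated = (prev_heading_deg + gyro_rate_dps * dt_s) % 360.0
            if vision_heading_deg is None:
                return propagated  # no visual fix this cycle; dead reckon on the IMU
            # Blend on the circle so 359 deg and 1 deg average near 0 deg, not 180 deg.
            diff = ((vision_heading_deg - propagated + 180.0) % 360.0) - 180.0
            return (propagated + (1.0 - alpha) * diff) % 360.0

        print(fuse_heading(0.5, 0.1, 118.0, 116.0))  # nudges the IMU heading toward the fix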
  • An inclinometer or similar device can provide additional data for determining a navigational heading, the speed at which the user is moving, coordinates, and the like.
  • Figure 3A further illustrates a clock 121 providing data to the image processor 119.
  • This may be useful when making determinations having a temporal factor.
  • Because the transparent optical device 118 and the image processor 119 are able to detect features associated with lunar, celestial, and/or other events that change over time, timing may be important for establishing a navigational heading, the speed at which the user is moving, coordinates, etc.
  • a present time can be used to pinpoint a navigational heading, the speed at which the user is moving, coordinates, etc.
  • Embodiments may utilize an angle of inclination to obtain a celestial position fix at a known date and time (a simplified illustration follows below).
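  • As a hedged illustration of the celestial idea, the classic Polaris relation gives an approximate northern-hemisphere latitude directly from a measured altitude or inclination angle; a real implementation would apply ephemeris and refraction corrections, and all values here are illustrative.

        def latitude_from_polaris(measured_altitude_deg, refraction_correction_deg=0.0):
            """Classic simplification: the altitude of Polaris approximates the
            observer's northern latitude (to within roughly a degree, before ephemeris
            corrections). Inclinometer and clock inputs would normally refine this.
            """
            return measured_altitude_deg - refraction_correction_deg

        # An inclination/altitude reading of 38.6 degrees implies a latitude near 38.6 N.
        print(latitude_from_polaris(38.6))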
  • Speed can be determined by using a plurality of captured images at various times where changes between captured images can be used to determine speed.
  • Determination of position, heading, and/or speed can be accomplished using terrain or landmark features combined with lunar and/or celestial tracking. Examples of this are illustrated in Figure 3B, where a feature map 132 includes a terrain feature correlated with a constellation feature 134. In this example, the IMU and/or inclinometer does not need to be used for the determination.
  • A range finder 125 may additionally be used for added terrain, topography, and landmark tracking precision.
  • When the transparent optical device 118 is able to detect a particular topographical feature, landmark feature, or terrain feature that is a known feature having a known location, and the range finder 125 is able to provide range information to the image processor 119, user position can be determined by using the known location of the known feature and simply adding the distance determined by the range finder 125.
  • the database 126 may include sufficient information that a perspective can be determined from the features detected by the transparent optical device 118 to allow for proper computation of position using the range finder 125 information and the known location of detected features.
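  • A simple sketch of that geometry, assuming a local east/north coordinate frame, a recognized landmark with a known position, a range-finder distance, and a measured bearing (all values illustrative and not taken from the disclosure):

        import math

        def observer_position(landmark_east_m, landmark_north_m, range_m, bearing_deg):
            """Back out the observer's local east/north position from a recognized
            landmark with a known location, the range-finder distance, and the bearing
            at which the landmark appears (bearing measured clockwise from north).
            """
            bearing = math.radians(bearing_deg)
            east = landmark_east_m - range_m * math.sin(bearing)
            north = landmark_north_m - range_m * math.cos(bearing)
            return east, north

        # Landmark at local (1200 m E, 3400 m N), seen 850 m away on a bearing of 045.
        print(observer_position(1200.0, 3400.0, 850.0, 45.0))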
  • some embodiments may further include functionality for connecting to online databases such that various on-line videos, commercial maps, image search information, satellite images, etc., can be used to match detected features with known features.
  • information may be stored locally in the database 126. This is particularly relevant when it is known beforehand in what area a user will be using the nightvision system 100.
  • the database 126 may store a topographical map including contour lines and the like.
  • the database 126 may store star maps that are particularly relevant to the time and place where a user will be using the nightvision system 100.
  • the database 126 may store images for particular landmarks or terrain that is expected to be encountered by the user.
  • the database 126 may store satellite images for a general location.
  • Some embodiments may include functionality for using currently available information regarding cloud formations or other weather-related information to determine location.
  • If Doppler radar or other weather information is available to the nightvision system 100, such as from on-line databases, and weather features can be detected, then this information can be used to determine a location, heading, and/or speed in a fashion similar to other methods of determining location, heading, and/or speed using detected features.
  • Some embodiments may use aircraft whose position is known to determine location, heading, and/or speed. For example, embodiments could have information about a drone with a known flight pattern sending light beacons and/or having other features detectable by the transparent optical device. Knowing the position of the aircraft both in actual space as well as in an image captured by the transparent optical device 118 can allow the image processor 119 to determine location, heading and/or speed of the nightvision system 100. Note that embodiments may use commercially available flight trackers providing flight tracking information on-line which can be compared with features detected by the transparent optical device 118.
  • speed may be determined by the image processor 119. In some embodiments, this may be accomplished by using a sequential series of images captured by the transparent optical device. Knowing time between images and differences in location of the same features in each of the images can be used to estimate speed of travel.
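  • A minimal sketch of that speed estimate, assuming feature positions have already been converted from pixels to metres by some ground-sample-distance model (all values illustrative):

        def speed_from_frames(feature_pos_t0_m, feature_pos_t1_m, dt_s):
            """Estimate ground speed from the apparent motion of a fixed feature
            between two frames, with positions given in a local metric frame.
            """
            de = feature_pos_t1_m[0] - feature_pos_t0_m[0]
            dn = feature_pos_t1_m[1] - feature_pos_t0_m[1]
            return (de * de + dn * dn) ** 0.5 / dt_s

        # A stationary landmark appears to shift ~3.2 m over 2.0 s -> ~1.6 m/s of travel.
        print(speed_from_frames((0.0, 0.0), (3.0, 1.1), 2.0))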
  • Some embodiments may use a nightvision-based "cairn." For example, an object with known features, detectable by nightvision systems, can be left in a known location.
  • the cairn can be detected by the transparent optical device 118, and analyzed by the image processor 119 to estimate a location of the nightvision system 100 based on the known location.
  • the cairn may be implemented as a beacon.
  • the cairn can output an IR signal that is detectable by the nightvision system, but imperceptible by a human's naked eye.
  • the IR signal may have a particular pattern.
  • an IR signal pattern may be a series of light pulses representing different alphanumeric characters.
  • the IR signal pattern may simply provide a unique identifier for the cairn where the database 126 includes a correlation (such as a table) correlating unique cairns with locations.
  • the IR signal pattern may be configured to specifically identify the location of the cairn, such as by providing a longitude/latitude or other location identification.
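  • A hedged sketch of decoding such a pulsed IR cairn; the bit encoding, brightness samples, and database contents below are all hypothetical.

        # Threshold the beacon's sampled brightness into bits, pack them into an
        # identifier, and look the identifier up in a local database of cairn locations.
        CAIRN_LOCATIONS = {0b1011: (34.0522, -118.2437)}  # id -> (latitude, longitude)

        def decode_cairn(samples, threshold=0.5):
            bits = [1 if s > threshold else 0 for s in samples]
            cairn_id = 0
            for b in bits:
                cairn_id = (cairn_id << 1) | b
            return CAIRN_LOCATIONS.get(cairn_id)

        # Four brightness samples taken at the beacon's known pulse rate.
        print(decode_cairn([0.9, 0.1, 0.8, 0.7]))  # -> (34.0522, -118.2437)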
  • Figure 4 illustrates a first nightvision system 100-1 and a second nightvision system 100-2.
  • Each of the nightvision systems is able to detect an object 140.
  • transparent optical devices and image processors in each of the respective nightvision systems 100-1 and 100-2 are able to detect and characterize the object 140.
  • the nightvision systems 100-1 and 100-2 are able to communicate with each other through communication channels, such as channel 142, which may be, for example, a wireless communication channel using wireless communication hardware.
  • The nightvision system 100-1 can obtain more precise location and/or heading information by using information from a transparent optical device and image processor at the nightvision system 100-1 as well as by using information provided over the communication channel 142 obtained at the nightvision system 100-2 using a different transparent optical device and image processor at the nightvision system 100-2. That is, headsets looking at the same item can determine the location of one of the headsets and/or the location of the item.
  • one nightvision system can determine a location of a different nightvision system when both systems characterize the same object from different perspectives by sharing the characterization of the object between the nightvision systems.
  • nightvision system 100-1 and 100-2 may both characterize and create feature maps of the object 140 from different perspectives.
  • When the nightvision system 100-2 shares feature map information over the communication channel with the nightvision system 100-1, the nightvision system 100-1 can determine the location of the nightvision system 100-2.
  • This information can be used to determine or refresh heading and/or inclination.
  • Embodiments can use a known GPS position to determine or refresh heading or inclination by looking at the moon, horizon, or other landmarks and then back-calculating from the known GPS location.
  • Embodiments may include functionality for determining a position of an object detected by the transparent optical device 118 and the image processor 119 once the position of a nightvision system is known.
  • such systems work best by using passive means.
  • such systems may function best when the image processor 119 computes the position of a detected object rather than measuring the distance of the detected object from the nightvision system 100.
  • These computations may be performed using depth computed from captured images from a plurality of different nightvision systems. For example, stereoscopic computations can be used when transparent optical devices from two different nightvision systems capture images (see the sketch below).
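  • The stereoscopic case reduces to the classic two-view relation depth = f × B / d, where the baseline B between the two nightvision systems is assumed to be known; all numbers below are illustrative.

        def stereo_depth_m(disparity_px, baseline_m, focal_length_px):
            """Classic two-view relation: depth = f * B / d. Here the 'cameras' are the
            transparent optical devices of two nightvision systems whose separation
            (baseline) is known.
            """
            return focal_length_px * baseline_m / disparity_px

        # Two headsets 0.6 m apart see the same object 24 px apart with f ~= 1200 px.
        print(stereo_depth_m(24.0, 0.6, 1200.0))  # ~30 m to the object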
  • a single image processor can be used to process the images to compute distance and to further compute position using known position of a nightvision system.
  • Communication can occur between image processors, such as is illustrated in Figure 4, allowing distance computations to be performed.
  • a size of a detected object in an image captured by a transparent optical device can be used to compute distance from a nightvision system with a known location.
  • The detected size provides information that can be used to compute the distance of the object from the nightvision system.
  • For example, expected sizes of classes of objects, such as human objects, vehicle objects, or building objects, may be known.
  • Alternatively, particular object sizes may be known.
  • systems may be able to identify a specific human and have specific information on the size of that specific human.
  • systems may know the size of particular specific vehicles or buildings and may be able to use that information to compute distances from a known location of a nightvision system.
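  • A sketch of the size-based estimate using the pinhole relation distance = f × H / h, where the expected object height H is assumed (for example, an average standing person); all values are illustrative.

        def distance_from_size_m(known_height_m, apparent_height_px, focal_length_px):
            """Pinhole relation: distance ~= f * H / h, using an expected object size."""
            return focal_length_px * known_height_m / apparent_height_px

        # A person spanning 72 px with f ~= 1200 px is roughly 30 m away.
        print(distance_from_size_m(1.8, 72.0, 1200.0))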
  • determined location, heading, and/or speed information using the methods illustrated herein can be displayed to the user using optional display functionality of the transparent optical device 118 or other devices.
  • location, heading, and/or speed information can be transmitted to other entities. This may be done irrespective of whether or not the information is displayed to the user.
  • the nightvision system may be implemented where the underlying device 104 comprises an image intensifier.
  • The nightvision system 100 may be implemented where the image processor is configured to process images produced using light detected by the first plurality of active elements of the transparent optical device to identify celestial features to determine at least one of location, heading, elevation, or speed of the nightvision system.
  • The nightvision system 100 may be implemented where the image processor is configured to process images produced using light detected by the first plurality of active elements of the transparent optical device to identify topographical features to determine at least one of location, heading, elevation, or speed of the nightvision system.
  • The nightvision system 100 may be implemented where the image processor is configured to process images produced using light detected by the first plurality of active elements of the transparent optical device to identify landmark features to determine at least one of location, heading, elevation, or speed of the nightvision system.
  • the nightvision system 100 may further include a clock.
  • the clock is used to process images produced using light detected by the first plurality of active elements of the transparent optical device to determine at least one of location, heading, elevation, or speed of the nightvision system or location of objects detected by the first plurality of active elements.
  • the nightvision system 100 may further include an IMU.
  • the IMU is used to process images produced using light detected by the first plurality of active elements of the transparent optical device to determine at least one of location, heading, elevation, or speed of the nightvision system or location of objects detected by the first plurality of active elements.
  • the nightvision system 100 may further include a database.
  • the database includes at least one of maps, topographic maps, satellite imagery, celestial motion catalog, or landmarks. At least one of the maps, topographic maps, satellite imagery, celestial motion catalog, or landmarks is used to process images produced using light detected by the first plurality of active elements of the transparent optical device to determine at least one of location, heading, elevation, or speed of the nightvision system or location of objects detected by the first plurality of active elements.
  • The nightvision system 100 may be implemented where the image processor is configured to process images produced using light detected by the first plurality of active elements of the transparent optical device to identify a cairn to determine at least one of location, heading, elevation, or speed of the nightvision system.
  • The nightvision system 100 may be implemented where the image processor is configured to process images produced using light detected by the first plurality of active elements of the transparent optical device to identify an IR beacon to determine at least one of location, heading, elevation, or speed of the nightvision system.
  • the method 600 includes detecting light using a transparent photodetector (act 610).
  • Output light is provided from an underlying device in a first spectrum, generated from input light received at the underlying device.
  • the light in the first spectrum is transmitted through a transparent optical device optically coupled in an overlapping fashion to the underlying device, through an active area of a semiconductor chip, through a first plurality of transparent regions formed in the active area which are transparent to the light in the first spectrum to allow light in the first spectrum to pass through from the underlying device to a user.
  • the first plurality of transparent regions are configured in size and shape to cause the transparent optical device to have a first transmission efficiency for the light in the first spectrum.
  • the transparent optical device comprises the photodetector.
  • Act 610 may be performed by detecting light from the underlying device using a first plurality of active elements (implemented as a photodetector, in this example) configured in the active area.
  • The method 600 further includes processing feature maps produced using the detected light (act 620). This may be performed at an image processor.
  • The method 600 further includes determining at least one of location, heading, elevation, or speed of the nightvision system or location of objects detected (act 630).
  • the method 600 may be practiced where processing feature maps comprises identifying celestial features to determine at least one of location, heading, elevation, or speed of the nightvision system.
  • processing feature maps comprises identifying topographical features to determine at least one of location, heading, elevation, or speed of the nightvision system or location of objects detected by the first plurality of active elements.
  • processing feature maps comprises identifying landmark features to determine at least one of location, heading, elevation, or speed of the nightvision system or location of objects detected by the first plurality of active elements.
  • processing feature maps comprises using a clock input to determine at least one of location, heading, elevation, or speed of the nightvision system or location of objects detected by the first plurality of active elements.
  • processing feature maps comprises using an IMU input to determine at least one of location, heading, elevation, or speed of the nightvision system or location of objects detected by the first plurality of active elements.
  • processing feature maps comprises using at least one of maps, topographic maps, satellite imagery, celestial motion catalog, or landmarks in a database at the nightvision system to determine at least one of location, heading, elevation, or speed of the nightvision system or location of objects detected by the first plurality of active elements.
  • The method 600 may be practiced where processing feature maps comprises using feature maps from a plurality of different nightvision systems.
  • the methods may be practiced by a computer system including one or more processors and computer-readable media such as computer memory.
  • the computer memory may store computer-executable instructions that when executed by one or more processors cause various functions to be performed, such as the acts recited in the embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Optics & Photonics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Studio Devices (AREA)
  • Optical Communication System (AREA)
EP23188634.2A 2022-08-01 2023-07-31 Navigation in GPS-denied environments through enhanced nightvision with transparent optical device Pending EP4321920A1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US17/878,817 US20240035848A1 (en) 2022-08-01 2022-08-01 Navigation in gps denied environments through enhanced nightvision with transparent optical device

Publications (1)

Publication Number Publication Date
EP4321920A1 (fr)

Family

ID=87550985

Family Applications (1)

Application Number Title Priority Date Filing Date
EP23188634.2A Pending EP4321920A1 (fr) 2022-08-01 2023-07-31 Navigation dans des environnements independants du gps par vision nocturne amelioree avec dispositif optique transparent

Country Status (2)

Country Link
US (1) US20240035848A1 (fr)
EP (1) EP4321920A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200400944A1 (en) * 2016-07-21 2020-12-24 Eotech, Llc Enhanced vision systems and methods
US20210293546A1 (en) * 2016-03-11 2021-09-23 Kaarta, Inc. Aligning measured signal data with slam localization data and uses thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8471906B2 (en) * 2006-11-24 2013-06-25 Trex Enterprises Corp Miniature celestial direction detection system
US20140293266A1 (en) * 2011-08-04 2014-10-02 Ying Hsu Local Alignment and Positioning Device and Method
US10158427B2 (en) * 2017-03-13 2018-12-18 Bae Systems Information And Electronic Systems Integration Inc. Celestial navigation using laser communication system
US12078454B2 (en) * 2019-08-14 2024-09-03 Cubic Defense Applications, Inc. Universal laserless training architecture

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210293546A1 (en) * 2016-03-11 2021-09-23 Kaarta, Inc. Aligning measured signal data with slam localization data and uses thereof
US20200400944A1 (en) * 2016-07-21 2020-12-24 Eotech, Llc Enhanced vision systems and methods

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MELZER JAMES E ET AL: "Location and head orientation tracking in GPS-denied environments", PROCEEDINGS OF SPIE; [PROCEEDINGS OF SPIE ISSN 0277-786X VOLUME 10524], SPIE, US, vol. 10642, 2 May 2018 (2018-05-02), pages 106420P - 106420P, XP060107428, ISBN: 978-1-5106-1533-5, DOI: 10.1117/12.2304013 *

Also Published As

Publication number Publication date
US20240035848A1 (en) 2024-02-01

Similar Documents

Publication Publication Date Title
US10180327B1 (en) Methods and apparatus for navigational aiding using celestial object tracking
US10613231B2 (en) Portable GNSS survey system
US20140293266A1 (en) Local Alignment and Positioning Device and Method
US11796682B2 (en) Methods for geospatial positioning and portable positioning devices thereof
CN102575960A Navigation device and method
US20090008554A1 (en) Method for infrared imaging of living or non-living objects including terrains that are either natural or manmade
CN106468547A Global positioning system ("GPS")-independent navigation system for a self-guided aircraft utilizing multiple optical sensors
US8363928B1 (en) General orientation positioning system
US20150130950A1 (en) Method and system for integrated optical systems
EP4261591A1 Intelligent illumination for neighborhood delivery using a semi-transparent detector array
CA3171345C Semi-transparent detector array for auto-focusing nightvision systems
US8942421B1 (en) Geolocation of remotely sensed pixels by introspective landmarking
CN108253942B Method for improving aerial triangulation quality in oblique photogrammetry
US11460302B2 (en) Terrestrial observation device having location determination functionality
JP6707378B2 Self-position estimation device and self-position estimation method
EP4321920A1 Navigation in GPS-denied environments through enhanced nightvision with transparent optical device
CN115235386A Aerial-photography-based method for detecting glacier area change
Degnan et al. Second generation airborne 3D imaging lidars based on photon counting
JP3353571B2 Earth shape measuring device
KR100679864B1 Mobile communication terminal capable of displaying regional information and method for displaying the regional information
US11762096B2 (en) Methods and apparatuses for determining rotation parameters for conversion between coordinate systems
EP4414653A1 Nightvision system with laser target scan decoding, acquisition, tracking and PRF
JP2006153772A Surveying device
Meguro et al. Development of positioning technique using omni-directional IR camera and aerial survey data
EP4167584A1 Multi-sensor overlay auto-alignment using a bi-directional detector/display

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230731

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR