EP2778745A2 - Night vision display overlaid with sensor data - Google Patents
- Publication number
- EP2778745A2 EP2778745A2 EP14159939.9A EP14159939A EP2778745A2 EP 2778745 A2 EP2778745 A2 EP 2778745A2 EP 14159939 A EP14159939 A EP 14159939A EP 2778745 A2 EP2778745 A2 EP 2778745A2
- Authority
- EP
- European Patent Office
- Prior art keywords
- night vision
- user
- sensor data
- environment
- video signal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
- G09G5/377—Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/12—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices with means for image conversion or intensification
- G02B23/125—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices with means for image conversion or intensification head-mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0189—Sight systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0132—Head-up displays characterised by optical features comprising binocular systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Engineering & Computer Science (AREA)
- Astronomy & Astrophysics (AREA)
- Computer Hardware Design (AREA)
- Theoretical Computer Science (AREA)
- Controls And Circuits For Display Device (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
Description
- The present disclosure relates to night vision display devices, and in particular, user wearable night vision display devices such as night vision goggles.
- Technology is becoming more and more integrated into the equipment used by modern soldiers. For example, global positioning system ("GPS") receivers, night vision devices, and gunfire detectors are becoming standard equipment for today's modern soldier.
- GPS receivers allow users, such as soldiers, to navigate in unfamiliar territory, track potential targets, and provide guidance to "smart" bombs and missiles.
- A night vision device, such as a pair of night vision goggles, provides enhanced images of low light environments. Night vision is made possible by a combination of two approaches: increasing spectral range, and increasing intensity range. For example, some night vision devices operate by collecting tiny amounts of visible light, converting the photons of light into electrons, amplifying the number of electrons in a microchannel plate, and converting the amplified electrons back to a visible image. The enhanced images allow soldiers to operate effectively while remaining safe under the cover of darkness.
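- As a rough sense of scale for this amplification chain, the sketch below multiplies out example values for photocathode efficiency, microchannel-plate gain, and phosphor conversion; every number is an assumption chosen for illustration and none is taken from the patent.

```python
# Back-of-the-envelope illustration of image-intensifier amplification. Every number
# below is an assumed example value, not a specification from the patent.
photons_in = 1_000                    # photons hitting one photocathode spot in one frame
photocathode_qe = 0.20                # fraction converted to electrons (assumed)
mcp_gain = 10_000                     # microchannel-plate electron multiplication (assumed)
phosphor_photons_per_electron = 50    # photons emitted at the phosphor screen (assumed)

electrons = photons_in * photocathode_qe
amplified_electrons = electrons * mcp_gain
photons_out = amplified_electrons * phosphor_photons_per_electron
print(f"{photons_in} photons in -> {photons_out:.2e} photons out "
      f"(overall gain ~{photons_out / photons_in:.0f}x)")
```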
- A gunfire detector is a system that detects and conveys the location of gunfire or other weapon fire using acoustic and/or optical sensors or arrays of sensors. The detectors are used by law enforcement, security, military and businesses to identify the source and, in some cases, the direction of gunfire and/or the type of weapon fired. Most systems possess three main components: a microphone or sensors, a processing unit, and a user-interface that displays gunfire alerts.
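- The patent does not say how a gunfire detector localizes a shot, but a common acoustic approach is to compare arrival times of the muzzle blast across a small microphone array. The sketch below is purely illustrative: the array geometry, sample values, and function names are assumptions, and it estimates only a bearing from time differences of arrival.

```python
# Illustrative sketch only: a minimal far-field time-difference-of-arrival (TDOA)
# bearing estimate for an acoustic gunshot detector. The array geometry, sample values,
# and function names are hypothetical and are not taken from the patent.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, assumed

def bearing_from_tdoa(mic_positions, tdoas, c=SPEED_OF_SOUND):
    """Estimate the bearing (degrees clockwise from north) of a far-field source.

    mic_positions: (N, 2) array of microphone x/y offsets in metres (x = east, y = north).
    tdoas: (N,) arrival times relative to microphone 0, in seconds.
    """
    d = mic_positions - mic_positions[0]            # offsets relative to the reference mic
    # Plane-wave model: tdoa_i = -(d_i . u) / c, where u is the unit vector toward the source.
    u, *_ = np.linalg.lstsq(d, -c * np.asarray(tdoas), rcond=None)
    u = u / np.linalg.norm(u)
    return np.degrees(np.arctan2(u[0], u[1])) % 360.0

if __name__ == "__main__":
    mics = np.array([[0.0, 0.0], [0.3, 0.0], [0.0, 0.3], [0.3, 0.3]])   # 30 cm square array
    true_u = np.array([np.sin(np.radians(50)), np.cos(np.radians(50))]) # source at 50 degrees
    tdoas = -(mics - mics[0]) @ true_u / SPEED_OF_SOUND                 # synthetic arrival times
    print(f"estimated bearing: {bearing_from_tdoa(mics, tdoas):.1f} deg")  # ~50.0 deg
```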
- FIG. 1 is an example night vision display device in which a night vision image is overlaid with gunshot detector sensor data.
- FIG. 2 is a flowchart illustrating a method of overlaying gunshot detector sensor data on a night vision display.
- FIG. 3 is a first example of overlaying gunshot detector sensor data on a night vision display.
- FIG. 4 is a second example of overlaying gunshot detector sensor data on a night vision display.
- FIG. 5 is a third example of overlaying gunshot detector sensor data on a night vision display.
- FIG. 6 illustrates a user wearing a pair of night vision goggles configured to display a night vision image overlaid with gunshot detector sensor data.
- Sensor data indicative of a user's environment is received from a sensor. A video signal is generated which comprises a visual representation of the sensor data. The video signal is combined with a night vision view of the user's environment to overlay the visual representation of the sensor data over the night vision view of the user's environment. The overlaid night vision view of the user's environment is displayed to the user.
- Depicted in FIG. 1 is a block diagram of an apparatus 100 configured to overlay a visual representation of sensor data over a night vision view of a user's environment. The apparatus includes a sensor 110 embodied in a gunshot detector, a video generator 120 and a display device 130. As depicted in FIG. 1, display device 130 is embodied in a night vision goggle which is configured to display an enhanced image of a low-light environment. In other words, night vision goggle 130 displays to a user a night vision view of the user's environment. Gunshot detector 110 provides sensor data 140 to video generator 120, which provides a video signal 150 to display device 130.
- Unlike the device of FIG. 1, using a night vision display, such as a night vision goggle, in conjunction with a sensor that incorporates its own display device can be difficult. For example, a GPS device with a backlit screen, when viewed through a pair of night vision goggles, will oversaturate the night vision device, "whiting out" the display in the night vision goggles. Therefore, a user may have to remove their night vision goggles in order to view the display of a GPS device. Similarly, a backlit display of a gunshot detector may "white out" the display of a pair of night vision goggles. Of course, removing night vision goggles to view a backlit screen comes with numerous drawbacks. For example, removing and replacing the night vision goggles can be cumbersome and time consuming, particularly if the user is attempting to move quickly through difficult terrain. Additionally, if a user's eyes have adapted to low light conditions, looking into a backlit screen will destroy the user's night-adapted vision. Furthermore, the light emitted from the backlit screen may alert other individuals, such as enemy combatants, to the user's location.
- Other sensors rely on audio cues to communicate their data to a user. For example, some gunshot detectors include an ear piece, with a computer generated voice providing an auditory indication of the location of a detected gunshot. Unfortunately, these auditory indications may be heard by unintended parties, the shooter of the detected gunshot for example. Furthermore, individuals may find auditory indications of sensor data to be less convenient, descriptive, and accurate than visual indications.
- In order to provide a different sensor/display solution, the device of FIG. 1 overlays video signal 150, which incorporates a visual representation of sensor data 140, over the display of night vision goggle 130, allowing a user to view the sensor data 140 without removing night vision goggles 130, and without relying on auditory representations of the data.
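- Read as a processing chain, FIG. 1 amounts to: sensor 110 produces sensor data 140, video generator 120 turns that data into video signal 150, and display device 130 presents the signal over the intensified scene. The sketch below is only an illustration of that chain under the assumption that each block can be modelled in software; the class names, fields, and sample values are hypothetical and are not taken from the patent.

```python
# Minimal sketch of the FIG. 1 pipeline, under the assumption that each block can be
# modelled as a software component. Names (GunshotDetector, VideoGenerator, ...) are
# hypothetical and chosen only to mirror reference numerals 110/120/130/140/150.
from dataclasses import dataclass

@dataclass
class SensorData:                 # sensor data 140
    range_m: float                # range to the detected gunshot
    bearing_deg: float            # bearing to the detected gunshot, relative to the user

class GunshotDetector:            # sensor 110
    def read(self) -> SensorData:
        # A real detector would fuse acoustic/optical measurements; here we fake one fix.
        return SensorData(range_m=220.0, bearing_deg=47.5)

class VideoGenerator:             # video generator 120
    def render(self, data: SensorData) -> str:
        # video signal 150: here just the text to be rasterised into the overlay plane.
        return f"SHOT {data.bearing_deg:05.1f}\u00b0  {data.range_m:.0f} m"

class NightVisionGoggle:          # display device 130
    def display(self, night_vision_view: str, overlay: str) -> str:
        # The goggle presents the overlay on top of the intensified scene.
        return f"{night_vision_view} [overlay: {overlay}]"

detector, generator, goggle = GunshotDetector(), VideoGenerator(), NightVisionGoggle()
print(goggle.display("<intensified scene>", generator.render(detector.read())))
```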
- Gunshot detector 110, through the detection of the sound of a gunshot and/or through the light emitted by a muzzle flash, is able to provide sensor data about the location of the gunshot. For example, if gunshot detector 110 is worn by the user, gunshot detector 110 may determine the location of the gunshot relative to the user, and provide sensor data such as the range and direction of the gunshot. Similarly, if gunshot detector 110 is located remotely from the user, the sensor data 140 provided by gunshot detector 110 can be combined with location information for the user to determine the location of the gunshot relative to the user.
- According to other examples, sensor 110 may also include a global positioning system ("GPS") receiver. Accordingly, sensor 110 may receive global positioning data from a global positioning satellite. For example, the sensor 110 may receive global positioning data for the user, other individuals in the area, or the location of other items of interest, such as the location of a gunshot. The global positioning data may be provided to video generator 120 through sensor data 140. In other examples, sensor 110 may be embodied in a vehicle diagnostic sensor configured to provide diagnostic data for a vehicle, such as the vehicle in which the user is travelling. The diagnostic data may be provided to video generator 120 through sensor data 140, and displayed to the user through night vision device 130. In other examples, sensor 110 may be embodied in a light detection and ranging ("LIDAR") device.
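- When the gunshot detector is remote from the user, its range-and-bearing fix only becomes useful to the user after it is combined with position information, as described above. A minimal sketch of that combination is shown below, assuming a flat local east/north frame and hypothetical function names; neither is specified by the patent.

```python
# Illustrative only: convert a remote detector's (range, bearing) gunshot fix into an
# absolute position in a local east/north frame, so it can later be related to the user.
# The local tangent-plane approximation and all names here are assumptions.
import math

def gunshot_position(detector_east_m, detector_north_m, range_m, bearing_deg):
    """Return (east, north) of the gunshot given the detector position and its fix.

    bearing_deg is measured clockwise from north, as reported by the detector.
    """
    b = math.radians(bearing_deg)
    return (detector_east_m + range_m * math.sin(b),
            detector_north_m + range_m * math.cos(b))

# Example: a detector 500 m east of the local origin hears a shot 200 m to its north-east.
east, north = gunshot_position(500.0, 0.0, 200.0, 45.0)
print(f"gunshot at ({east:.1f} m E, {north:.1f} m N)")   # (641.4 m E, 141.4 m N)
```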
- Video generator 120 may be included in a multipurpose computing device, such as a laptop, a tablet computer, a smart phone, or other multipurpose computing device. Accordingly, the video generator 120 may be embodied in a microcontroller or microprocessor in order to generate video signal 150. In other examples, the video generator 120 may be a purpose-built processor, such as an application specific integrated circuit ("ASIC") or a field programmable gate array ("FPGA"). Whether the video generator 120 utilizes a multipurpose processor or a purpose-built processor, the video generator may be incorporated into the night vision goggle 130, or be arranged external to the night vision goggle 130 and configured to communicate video signal 150 through a wired or wireless connection. Similarly, video generator 120 may receive the sensor data through a wired or wireless connection.
- With reference now made to FIG. 2, depicted therein is a flowchart 200 illustrating a process of overlaying a visual representation of sensor data onto a night vision image. The process begins in step 210, where sensor data indicative of a user's environment is received. This sensor data may be the output from a gunshot detecting sensor, and therefore, the data may be indicative of the location of a gunshot relative to the sensor. The sensor data may also include a combination of data streams. For example, the data may include data from a gunshot detector indicating the location of a gunshot relative to a detector external to the user, as well as GPS data for the location of the detector and for the location of the user, thereby allowing a determination of the relative positions of the gunshot and the user. According to other examples, the data indicative of the user's environment may be GPS data indicating the location of the user and/or the location of other items of interest to the user. For example, if the user is travelling to a particular destination, the sensor data may indicate the direction of the destination relative to the user's current location. Of course, other kinds of sensor data may also be received.
- In step 220, a video signal comprising a visual representation of the sensor data is generated. For example, if the sensor data comprises gunshot location data as well as GPS coordinates for the user, generating the video signal may comprise generating an alphanumeric representation of the gunshot's location. According to other examples, the location of the user in the environment and the orientation of the user may be known. Therefore, the generated video signal may be a visual representation, such as an arrow or crosshairs, indicating where in the user's night vision view of the environment a gunshot originated. Similarly, if the sensor data includes GPS coordinates for a user's desired destination, the video signal may include a visual representation of the direction the user needs to travel to reach the destination.
- In step 230, the video signal is combined with a night vision view of the user's environment, thereby overlaying the visual representation of the sensor data over the night vision view of the user's environment. Specific examples of overlaying the video signal with the night vision view of the user's environment are described in greater detail in reference to FIGs. 3-5.
- Finally, in step 240 the night vision view of the user's environment overlaid with the video signal is displayed to the user.
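- Read as software, steps 210 through 240 form a simple acquire/render/combine/present sequence. The skeleton below is a sketch of that reading only; the function names, the dictionary-based "video signal", and the stub sensor/display objects are assumptions and not part of the patent.

```python
# Sketch of the FIG. 2 flow (steps 210-240) as one pass through the chain. Each step
# would be backed by real hardware in an actual device; all names here are hypothetical.
def receive_sensor_data(sensor):                                   # step 210
    return sensor.read()

def generate_video_signal(sensor_data):                            # step 220
    return {"kind": "text", "payload": str(sensor_data)}

def combine_with_night_vision(video_signal, night_vision_view):    # step 230
    return {"base": night_vision_view, "overlay": video_signal}

def display_to_user(overlaid_view, display):                       # step 240
    display.show(overlaid_view)

class FakeGunshotSensor:
    def read(self):
        return "gunshot: bearing 047 deg, range 220 m"

class FakeGoggleDisplay:
    def show(self, view):
        print(view)

data = receive_sensor_data(FakeGunshotSensor())
signal = generate_video_signal(data)
display_to_user(combine_with_night_vision(signal, "<intensified scene>"), FakeGoggleDisplay())
```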
- With reference now made to FIG. 3, depicted therein is an example of overlaying a visual representation of sensor data 140 on a night vision image 320. Specifically, unenhanced image 310 illustrates an example of how a user may view their environment absent any night vision enhancement. On the other hand, night vision display 320 illustrates how a user may view their environment after enhancement from, for example, a monocular night vision device which may be incorporated into the gun sight of a firearm, or a monocular night vision device configured to be wearable, such as a head-mounted night vision display. According to other examples, night vision display 320 may be provided by night vision goggles. Image 330 shows how the overlay 340 of the video signal 150 is displayed with the enhanced night vision view of image 320.
- In order to provide overlaid image 330, video generator 120 receives sensor data 140 from sensor 110. The sensor data 140 may be received in the form of a serial stream, such as a serial stream of binary characters. While the serial stream may be encoded with coordinate information for a user's location, the location of a user's destination, the location of a gunshot, the user's desired direction, or the location of another item of interest to the user, the sensor data itself is non-visual data. Accordingly, video generator 120 converts this serial stream into a video signal 150 to provide a visual representation of the sensor data 140 that can be overlaid on a night vision image. As depicted in FIG. 3, the video generator 120 generates video signal 150, which includes an alphanumeric representation of sensor data 140. Specifically, video signal 150 includes decimal latitude and longitude coordinates for the location of a gunshot, in this case "51.51, -0.15". Of course, video signal 150 could instead have included the equivalent degrees, minutes and seconds coordinates (e.g. 51° 30' 36", -0° 9' 0"), or another visual representation of the sensor data.
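- The overlay text in FIG. 3 can therefore carry either decimal degrees or the equivalent degrees/minutes/seconds form. The conversion itself is straightforward; the sketch below is illustrative only, since the patent does not specify how the video generator formats coordinates.

```python
# Illustrative conversion between the two overlay formats mentioned for FIG. 3:
# decimal degrees (e.g. 51.51, -0.15) and degrees/minutes/seconds.
def to_dms(decimal_degrees: float) -> str:
    sign = "-" if decimal_degrees < 0 else ""
    total_seconds = round(abs(decimal_degrees) * 3600.0, 1)   # round first to avoid 59.9... carries
    degrees, remainder = divmod(total_seconds, 3600.0)
    minutes, seconds = divmod(remainder, 60.0)
    return f"{sign}{int(degrees)}\u00b0 {int(minutes)}' {seconds:.1f}\""

latitude, longitude = 51.51, -0.15           # the example gunshot coordinates from FIG. 3
print(to_dms(latitude), to_dms(longitude))   # 51° 30' 36.0"  -0° 9' 0.0"
```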
- In order to overlay the video signal with the night vision view, a first display and a second display may be used. Turning briefly to FIG. 6, depicted therein is night vision goggle 130 worn by a user 610, which includes a first display 616 and a second display 620. Night vision goggle 130 operates by receiving external light which is incident on photocathode 612. Photocathode 612 converts the incident light into electrons. The electrons are multiplied by microchannel plate 614, and subsequently converted back into light at phosphor screen 616, thereby displaying an enhanced night vision view of the light originally incident on photocathode 612. If the night vision goggle 130 displays the night vision image 320 of FIG. 3 with phosphor screen 616, video signal 150 of FIG. 3 may be displayed to the user with transparent screen 620, which is arranged between phosphor screen 616 and the user. Accordingly, video signal 150 from FIG. 3 will appear overlaid over the night vision image displayed by phosphor screen 616 due to the arrangement of transparent display 620. Transparent screen 620 may be embodied in any known transparent display, such as an organic light emitting diode ("OLED") display.
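- Because transparent display 620 sits between phosphor screen 616 and the user's eye, the overlay is added optically on top of the intensified image rather than mixed into it electronically. A crude way to picture this is additive blending of the two emitted images, sketched below with assumed 8-bit grayscale frames; real phosphor and OLED radiometry is more involved.

```python
# Crude illustration of the two-display arrangement of FIG. 6: light from the phosphor
# screen (616) and light emitted by the transparent overlay display (620) simply add at
# the eye. Frame sizes and pixel values are assumed for the example.
import numpy as np

def optical_stack(phosphor_frame: np.ndarray, overlay_frame: np.ndarray) -> np.ndarray:
    """Approximate the perceived image as the clipped sum of the two emitted images."""
    combined = phosphor_frame.astype(np.uint16) + overlay_frame.astype(np.uint16)
    return np.clip(combined, 0, 255).astype(np.uint8)

phosphor = np.full((4, 8), 90, dtype=np.uint8)   # dim intensified scene
overlay = np.zeros((4, 8), dtype=np.uint8)
overlay[1, 2:6] = 200                            # a bright symbol drawn by display 620
print(optical_stack(phosphor, overlay))
```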
- Returning to FIG. 3, other night vision devices may produce a video signal which includes night vision image 320. In such devices, video signal 150 may be combined with the night vision image 320 through the use of a video processing device. Specifically, a video processing device may combine video signal 150 with the output of a microchannel plate to generate a second video signal that displays overlaid image 330 when projected from a display device. According to other examples, a first video signal comprising the night vision image 320 may be combined with video signal 150 to generate a third video signal that displays overlaid image 330 when displayed from a display device.
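- When the night vision view exists as a video signal, one common way to combine it with an overlay (assumed here for illustration; the patent does not mandate a particular mixing method) is to key the overlay's non-background pixels over the base frame.

```python
# Illustrative electronic combination of a night vision frame with video signal 150:
# wherever the overlay frame has non-zero (non-background) pixels, they replace the
# corresponding pixels of the intensified image. Shapes and values are assumed.
import numpy as np

def key_overlay(night_vision_frame: np.ndarray, overlay_frame: np.ndarray) -> np.ndarray:
    """Return a new frame with overlay pixels keyed over the night vision image."""
    out = night_vision_frame.copy()
    mask = overlay_frame > 0          # treat zero as "transparent" background
    out[mask] = overlay_frame[mask]
    return out

scene = np.random.default_rng(0).integers(20, 120, size=(4, 8), dtype=np.uint8)
overlay = np.zeros((4, 8), dtype=np.uint8)
overlay[2, 1:5] = 255                 # e.g. part of an arrow or text glyph
print(key_overlay(scene, overlay))
```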
- With reference now made to FIG. 4, depicted therein is a bi-ocular night vision display device, in which a single image is separately displayed to each eye of the user. As with the example of FIG. 3, the night vision image is overlaid with a video representation of sensor data 140. Also similar to image 310 of FIG. 3, unenhanced image 410 of FIG. 4 illustrates an example of how a user may view their environment absent any night vision enhancement. Because the device of FIG. 4 is bi-ocular, a single night vision enhanced image 420 is generated by the night vision device, and this same image is displayed to each of the user's eyes through left eye image 430a and right eye image 430b. Accordingly, video signal 150 from video generator 120 may be overlaid over both left eye image 430a and right eye image 430b, as indicated by overlays 440a and 440b, respectively. When viewed by the user, left eye image 430a and right eye image 430b are interpreted by the user as a single viewed image 450.
- As depicted in FIG. 4, overlays 440a and 440b, as well as video signal 150, are not simple alphanumeric representations of sensor data 140. Instead, overlays 440a, 440b and video signal 150 depict an arrow pointing to the location of a gunshot or another item of interest to the user. In order to have the arrows accurately point to the correct location, GPS data 460 is also provided by GPS receiver 470.
- If sensor 110 is not located on the user's person, the relative positions of the sensor 110 and the user must be known to accurately orient the arrow in video signal 150. Accordingly, GPS data 460 may provide the location of the user and the sensor 110. Additional data may provide the orientation of the user and/or the sensor. For example, a magnetic or gyroscopic sensor may be included in sensor 110 and the GPS receiver 470, thereby allowing orientation data to be included in sensor data 140 and GPS data 460, respectively. Similarly, a motion vector may be calculated for the user from GPS data 460, which can also be used to determine the orientation of the user. Of course, while the GPS receiver 470 and the sensor 110 are depicted as two separate devices, the functions of the GPS receiver 470 and the sensor 110 may be embodied in more or fewer devices. For example, the sensor 110 may be embodied as a GPS receiver, and therefore the functions of the GPS receiver 470 would be provided by sensor 110. Similarly, the locations for the user and the sensor 110 may be sent by separate GPS receivers. On the other hand, if sensor 110 is embodied in a gunshot detector located on the person of the user, only the orientation data and the gunshot detector data may be necessary to accurately orient the arrow in video signal 150, and the GPS data may be omitted.
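- To orient such an arrow, the quantity that matters is the bearing of the gunshot relative to the direction the user is facing. The sketch below computes that bearing from GPS fixes for the user and the gunshot plus a compass heading, using a flat-earth approximation that is only reasonable over short ranges; it is an illustration, not the patent's algorithm.

```python
# Illustrative only: compute the direction an on-screen arrow should point, given GPS
# fixes for the user and the gunshot plus the user's compass heading. The flat-earth
# approximation and the metres-per-degree constant are assumptions for short ranges.
import math

def relative_bearing_deg(user_lat, user_lon, target_lat, target_lon, user_heading_deg):
    """Bearing of the target relative to the user's line of sight, in [-180, 180)."""
    # Convert the latitude/longitude difference into local east/north metres.
    north_m = (target_lat - user_lat) * 111_320.0
    east_m = (target_lon - user_lon) * 111_320.0 * math.cos(math.radians(user_lat))
    absolute_bearing = math.degrees(math.atan2(east_m, north_m)) % 360.0
    return (absolute_bearing - user_heading_deg + 180.0) % 360.0 - 180.0

# User at 51.500, -0.150 facing due north; gunshot at 51.510, -0.150 (about 1.1 km north).
print(relative_bearing_deg(51.500, -0.150, 51.510, -0.150, 0.0))    # ~0.0 (straight ahead)
print(relative_bearing_deg(51.500, -0.150, 51.500, -0.134, 90.0))   # ~0.0 (ahead when facing east)
```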
- Turning now to FIG. 5, depicted therein is a binocular display of a night vision image overlaid with a video representation of sensor data. Unlike the bi-ocular display of FIG. 4, a binocular display provides a separate image to each eye of the user. Accordingly, there are two separate optical channels in the night vision device of FIG. 5. Specifically, left unenhanced image 510a is enhanced by the night vision device to generate left night vision image 520a. The left enhanced image is overlaid with a visual representation of sensor data to provide left eye image 530a. Similarly, right unenhanced image 510b is enhanced by the night vision device to generate right night vision image 520b. The right enhanced image is overlaid with a visual representation of sensor data to provide right eye image 530b. When these two images are viewed by the left and right eyes of the user, the user's brain interprets them as a single stereoscopic image 550.
- Because overlaid images 530a and 530b are combined to form stereoscopic image 550, the user may be provided with a 3-dimensional image. Yet, the use of two optical channels may add additional complexity to the overlaying of sensor data onto night vision images. For example, left enhanced image 520a will be slightly different than right enhanced image 520b. Accordingly, left overlay 540a may need to be positioned in a different location in left enhanced image 520a than where right overlay 540b is positioned in right enhanced image 520b. The difference in positioning of left overlay 540a and right overlay 540b is depicted in left overlaid image 530a and right overlaid image 530b, though the positioning has been exaggerated to better illustrate the point.
- In order to accurately position overlays 540a and 540b, video generator 120 is provided with night vision device data 560 from night vision device 570. Specifically, night vision device 570 may provide data indicative of, for example, where and how night vision device 570 is arranged and focused. Furthermore, the night vision device data 560 may include information indicating the relative positions of the left optical channel used to create the left enhanced image 520a and the right optical channel used to create the right enhanced image 520b. By considering the night vision device data 560, the video generator may generate two separate video signals, left video signal 150a and right video signal 150b. Left video signal 150a will be overlaid on left enhanced image 520a to generate left eye image 530a. Similarly, right video signal 150b is overlaid on right enhanced image 520b to generate right eye image 530b. Because video generator 120 has taken the night vision device data 560 into consideration when generating left video signal 150a and right video signal 150b, when the user views stereoscopic overlay 580 in stereoscopic image 550, stereoscopic overlay 580 may accurately indicate the location of the item of interest to the user.
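- One way to reason about the left/right placement difference is ordinary stereo disparity: a target at distance Z appears shifted between the two optical channels by roughly the focal length times the channel baseline divided by Z. The sketch below illustrates this with assumed channel parameters; in the apparatus described above those values would come from night vision device data 560.

```python
# Illustrative disparity calculation for placing left/right overlays 540a/540b. The
# focal length, channel baseline, and image centre are assumed example values; real
# values would come from night vision device data 560.
def overlay_disparity_px(target_distance_m: float,
                         baseline_m: float = 0.065,        # assumed channel separation
                         focal_length_px: float = 1200.0) -> float:
    """Horizontal shift (in pixels) between left and right overlay positions."""
    return focal_length_px * baseline_m / target_distance_m

def left_right_positions(center_x_px: float, target_distance_m: float):
    """Split a nominal overlay x-position into per-channel positions."""
    half = overlay_disparity_px(target_distance_m) / 2.0
    return center_x_px + half, center_x_px - half   # (left channel x, right channel x)

for distance in (10.0, 50.0, 200.0):
    left_x, right_x = left_right_positions(640.0, distance)
    print(f"{distance:5.0f} m -> left x = {left_x:.1f} px, right x = {right_x:.1f} px")
```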
- According to other examples, night vision device 570 may receive a single video signal, similar to the signal provided in FIGs. 3 and 4. Night vision device 570, being in possession of night vision device data 560, can modify the video signal to appropriately position overlay 540a in overlaid image 530a and overlay 540b in overlaid image 530b, respectively.
- With reference now made to FIG. 6, depicted therein are some of the features of night vision displays that can be advantageously leveraged in example display devices utilizing the techniques described herein. Specifically, depicted in FIG. 6 is user 610 wearing night vision goggles 130. Unlike the external backlit screen of a GPS receiver or a gunshot detector, the display screen 620 of night vision goggle 130 is set within the external casing of night vision goggle 130. Accordingly, light from the display 620 is unlikely to leak into the environment surrounding the user, thereby preserving the user's light security. Night vision goggle 130 may also be configured to include features that conform to the face of user 610, such as eyepiece 630. Eyepiece 630 may further prevent light emitted by display 620 from leaking into the user's environment, further maintaining the user's light security.
- The above description is intended by way of example only.
Claims (15)
- An apparatus (100) comprising: a display device (130) configured to provide a night vision view of a user's environment; a sensor (110) configured to generate sensor data (140) indicative of the user's environment; and a video generator (120) configured to receive sensor data (140) from the sensor (110) and provide a video signal (150) comprising a visual representation of the sensor data (140) to the night vision display device, wherein the night vision display device is configured to display an image corresponding to the video signal overlaid with the night vision view of the user's environment.
- The apparatus of claim 1, wherein the sensor comprises a gunshot detector configured to generate sensor data indicative of the location of a gunshot.
- The apparatus of claim 2, further comprising a global positioning system receiver providing global positioning information to the video generator, wherein the video generator is configured to provide the video signal in response to the location information for the gunshot and the global positioning information.
- The apparatus of claim 1, wherein the night vision display device comprises a night vision goggle.
- The apparatus of claim 4, wherein the night vision goggle is configured to provide a binocular image to the user.
- The apparatus of claim 4, wherein the night vision goggle is configured to provide a bi-ocular image to the user.
- The apparatus of claim 1, wherein an eye piece of the display is configured to block light emitted from the display from leaking into the user's environment.
- The apparatus of claim 1, wherein the video generator is embodied in a smartphone.
- The apparatus of claim 1, wherein the video generator is embodied in at least one of an application specific integrated circuit, a field programmable gate array, a microcontroller, or a microprocessor.
- The apparatus of claim 1, wherein the sensor data comprises non-visual data.
- The apparatus of claim 1, wherein the night vision display device comprises a phosphor screen and a transparent display, wherein: the night vision view is displayed from the phosphor screen and the video signal is displayed from the transparent display, and the transparent display is arranged on a user-side of the phosphor screen.
- A method comprising: receiving sensor data from a sensor indicative of a user's environment; generating a video signal comprising a visual representation of the sensor data; combining the video signal with a night vision view of the user's environment to overlay the visual representation of the sensor data over the night vision view of the user's environment; and displaying the overlaid night vision view of the user's environment.
- The method of claim 12, wherein receiving sensor data comprises receiving sensor data from a gunshot detector indicative of a location of a gunshot.
- The method of claim 13, further comprising: receiving global positioning information, wherein generating the video signal comprises overlaying a location of the gunshot over the night vision view of the user's environment in response to the location information for the gunshot and the global positioning information.
- An apparatus comprising: a display device configured to provide a night vision view of a user's environment; a gunshot detector configured to generate sensor data indicative of a location of a gunshot; and a video generator configured to receive sensor data from the gunshot detector and provide a video signal comprising a visual representation of the sensor data to the night vision display device, wherein the night vision display device is configured to display an image corresponding to the video signal overlaid with the night vision view of the user's environment.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/826,675 US20140267389A1 (en) | 2013-03-14 | 2013-03-14 | Night Vision Display Overlaid with Sensor Data |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2778745A2 true EP2778745A2 (en) | 2014-09-17 |
EP2778745A3 EP2778745A3 (en) | 2014-10-15 |
Family
ID=50280229
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14159939.9A Withdrawn EP2778745A3 (en) | 2013-03-14 | 2014-03-14 | Night vision display overlaid with sensor data |
Country Status (6)
Country | Link |
---|---|
US (1) | US20140267389A1 (en) |
EP (1) | EP2778745A3 (en) |
JP (1) | JP2014179990A (en) |
AU (1) | AU2014201487B2 (en) |
CA (1) | CA2846554A1 (en) |
IL (1) | IL231530A0 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105812730A (en) * | 2016-03-11 | 2016-07-27 | 上海良相智能化工程有限公司 | Hand-held night-vision-type photographing system |
EP4170410A1 (en) * | 2021-10-22 | 2023-04-26 | L3Harris Technologies, Inc. | Flash and streak detection and persistent identification using bi-directional detector/display overlay |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7250150B2 (en) * | 2018-10-03 | 2023-03-31 | エイチアイアイ・ミッション・テクノロジーズ・コーポレーション | Integrated power system |
US10937622B2 (en) | 2018-12-19 | 2021-03-02 | Elbit Systems Of America, Llc | Programmable performance configurations for night vision device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2003060590A2 (en) * | 2001-12-21 | 2003-07-24 | Itt Manufacturing Enterprises, Inc. | Video enhanced night vision goggle |
US20060114749A1 (en) * | 2004-01-22 | 2006-06-01 | Baxter Kevin C | Gunshot detection sensor with display |
US20100007580A1 (en) * | 2008-07-14 | 2010-01-14 | Science Applications International Corporation | Computer Control with Heads-Up Display |
US20120154920A1 (en) * | 2010-12-16 | 2012-06-21 | Lockheed Martin Corporation | Collimating display with pixel lenses |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7129464B2 (en) * | 2004-10-19 | 2006-10-31 | Buchin Michael P | Low-photon flux image-intensified electronic camera |
US7787012B2 (en) * | 2004-12-02 | 2010-08-31 | Science Applications International Corporation | System and method for video image registration in a heads up display |
FR2916863B1 (en) * | 2007-05-29 | 2009-08-14 | Sagem Defense Securite | BIOCULAR TWIN OF NIGHT VISION |
US8400510B2 (en) * | 2008-10-27 | 2013-03-19 | Devcar, Llc | Night vision system |
US9366862B2 (en) * | 2010-02-28 | 2016-06-14 | Microsoft Technology Licensing, Llc | System and method for delivering content to a group of see-through near eye display eyepieces |
US8964298B2 (en) * | 2010-02-28 | 2015-02-24 | Microsoft Corporation | Video display modification based on sensor input for a see-through near-to-eye display |
US20120194420A1 (en) * | 2010-02-28 | 2012-08-02 | Osterhout Group, Inc. | Ar glasses with event triggered user action control of ar eyepiece facility |
US20140063055A1 (en) * | 2010-02-28 | 2014-03-06 | Osterhout Group, Inc. | Ar glasses specific user interface and control interface based on a connected external device type |
US9389425B2 (en) * | 2012-04-18 | 2016-07-12 | Kopin Corporation | Viewer with display overlay |
- 2013
- 2013-03-14 US US13/826,675 patent/US20140267389A1/en not_active Abandoned
- 2014
- 2014-03-13 CA CA2846554A patent/CA2846554A1/en not_active Abandoned
- 2014-03-13 JP JP2014050363A patent/JP2014179990A/en active Pending
- 2014-03-13 AU AU2014201487A patent/AU2014201487B2/en not_active Ceased
- 2014-03-13 IL IL231530A patent/IL231530A0/en unknown
- 2014-03-14 EP EP14159939.9A patent/EP2778745A3/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
AU2014201487B2 (en) | 2015-09-17 |
IL231530A0 (en) | 2014-08-31 |
EP2778745A3 (en) | 2014-10-15 |
CA2846554A1 (en) | 2014-09-14 |
US20140267389A1 (en) | 2014-09-18 |
JP2014179990A (en) | 2014-09-25 |
AU2014201487A1 (en) | 2014-10-02 |
Similar Documents
Publication | Title | |
---|---|---|
JP7545971B2 (en) | Display system for viewing optical instruments | |
CN113614483B (en) | Viewing optic with bullet counter system | |
JP7118982B2 (en) | Observation optics with built-in display system | |
CN115885152A (en) | Viewing optic with enabler interface | |
US20130333266A1 (en) | Augmented Sight and Sensing System | |
US10408574B2 (en) | Compact laser and geolocating targeting system | |
RU2730466C2 (en) | Improved awareness of the environment using the enlarged picture in the picture inside the wide area view optical image | |
US20160252325A1 (en) | Compositions, methods and systems for external and internal environmental sensing | |
AU2014201487B2 (en) | Night vision display overlaid with sensor data | |
US9068798B2 (en) | Integrated multifunction scope for optical combat identification and other uses | |
Gans et al. | Augmented reality technology for day/night situational awareness for the dismounted soldier | |
US9476676B1 (en) | Weapon-sight system with wireless target acquisition | |
US9851177B2 (en) | Coating for light security | |
US20240102773A1 (en) | Imaging enabler for a viewing optic | |
US11314090B2 (en) | Covert target acquisition with coded short-wave infrared glasses | |
US20240068776A1 (en) | Systems and controls for an enabler of a viewing optic | |
US20240167790A1 (en) | Elevation Adders for a Viewing Optic with an Integrated Display System | |
US20130129254A1 (en) | Apparatus for projecting secondary information into an optical system |
Legal Events
Code | Title | Description
---|---|---
PUAL | Search report despatched | Free format text: ORIGINAL CODE: 0009013
17P | Request for examination filed | Effective date: 20140314
AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
AX | Request for extension of the european patent | Extension state: BA ME
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012
AK | Designated contracting states | Kind code of ref document: A3; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
AX | Request for extension of the european patent | Extension state: BA ME
RIC1 | Information provided on ipc code assigned before grant | Ipc: G02B 27/01 20060101AFI20140911BHEP; Ipc: G02B 23/12 20060101ALI20140911BHEP
R17P | Request for examination filed (corrected) | Effective date: 20150218
RBV | Designated contracting states (corrected) | Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: EXELIS INC.
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN
18D | Application deemed to be withdrawn | Effective date: 20171013