IL284600A - A conformal display system and a method thereof - Google Patents

A conformal display system and a method thereof

Info

Publication number
IL284600A
Authority
IL
Israel
Prior art keywords
display
hmd
platform
processor
images
Prior art date
Application number
IL284600A
Other languages
Hebrew (he)
Other versions
IL284600B2 (en)
IL284600B1 (en)
Inventor
Marinov Ofer
Mencel Guy
Gabizon Ofir
Assaf Yuval
Original Assignee
Elbit Systems Ltd
Marinov Ofer
Mencel Guy
Gabizon Ofir
Assaf Yuval
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Elbit Systems Ltd, Marinov Ofer, Mencel Guy, Gabizon Ofir, Assaf Yuval filed Critical Elbit Systems Ltd
Priority to IL284600A priority Critical patent/IL284600B2/en
Priority to EP22837159.7A priority patent/EP4367547A1/en
Priority to PCT/IL2022/050535 priority patent/WO2023281488A1/en
Publication of IL284600A publication Critical patent/IL284600A/en
Publication of IL284600B1 publication Critical patent/IL284600B1/en
Publication of IL284600B2 publication Critical patent/IL284600B2/en
Priority to US18/402,769 priority patent/US20240160278A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0149Head-up displays characterised by mechanical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0149Head-up displays characterised by mechanical features
    • G02B2027/0154Head-up displays characterised by mechanical features with movable elements

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Description

A CONFORMAL DISPLAY SYSTEM AND A METHOD THEREOF

TECHNICAL FIELD
[0001] The present disclosed subject matter relates to conformal displays. More particularly, the present disclosed subject matter relates to conforming between displays and tracking systems.
BACKGROUND
[0002] A head-worn display coupled with a head-tracking system enables users to view synthetically generated images in a way that they appear as part of the real-world scenery. This technology of displaying elements, synthetic or real, as parts of the outside world has also been adopted by disciplines such as contact analog, linked scene, augmented reality, and outside conformal. The Highway In The Sky (HITS) Display System was one of the first applications in avionics; over the years, more and more conformal counterparts have been devised for aircraft-related instruments. Among them are routing information, navigation aids, specialized landing displays, obstacle warnings, drift indicators, and many more.
[0003] With the mounting interest in optical see-through head-mounted displays across military, medical, and gaming settings, many systems having different capabilities are rapidly entering the market. Despite such variety, all these systems require display calibration to create a proper mixed-reality environment. With the aid of tracking systems, it is possible to register rendered graphics with tracked objects in the real world.
[0004] Military-grade solutions usually require large integration and alignment efforts in an ever-changing environment, especially where components such as the magnetic/optical head-tracker unit have to be recalibrated in case of changes to cockpit geometry, electromagnetic conditions, and different users.
BRIEF SUMMARY
[0005] According to a first aspect of the present disclosed subject matter, there is provided a conformal display system for a Head-Mounted Display (HMD) and a display coupled to a support-structure having at least one degree of freedom, wherein the display is adapted to display images rendered by a display-processor, and wherein the support-structure is monitored by a tracking system configured to provide information indicating the support-structure's position and/or orientation with respect to a frame of reference. The system comprises: at least one first inertial sensor attached to the support-structure and configured to acquire support-structure inertial readings information indicative of the support-structure's movements over time; at least one second inertial sensor attached to the display and configured to acquire display inertial readings information indicative of the display's movements over time; and a processor configured to: obtain the support-structure inertial readings information, the display inertial readings information, and the information indicating the support-structure's position and/or orientation with respect to the frame of reference; continuously analyze the HMD movement information and the display movement information to determine the relative orientation between the support-structure and the display; and cause the display-processor to adjust the images to conform with respect to the frame of reference, based on the information indicating the HMD's position and/or orientation and the relative movements.
[0006] In some exemplary embodiments, adjusting the images to conform with respect to the frame of reference provides accuracy enhancement of a line-of-sight designation.
[0007] In some exemplary embodiments, the HMD is worn by a user, and the display is selected from the group consisting of: a see-through display; an opaque display; and any combination thereof.
[0008] In some exemplary embodiments, the user is operating a platform, which constitutes the frame of reference.
[0009] In some exemplary embodiments, the system further comprises at least one platform-sensor attached to the platform and configured to acquire platform movements information indicative of the platform's movements over time with respect to a fixed coordinate system established in space, wherein the processor causes the display-processor to adjust the images also to compensate for the platform's movements over time.
[0010] In some exemplary embodiments, the at least one inertial HMD-sensor, the at least one inertial display-sensor, and the at least one platform-sensor are inertial measurement units.
[0011] In some exemplary embodiments, the display displays to the user an augmented-reality view comprised of scenes external to the platform, wherein the images are conformal to the external scenes.
[0012] In some exemplary embodiments, the display displays to the user a virtual-reality view comprised of scenes external to the platform, rendered by a video camera mounted on the HMD, wherein the images are conformal to the external scenes.
[0013] In some exemplary embodiments, the images are selected from the group consisting of: graphical symbology; thermal images; text; video; synthetically generated images; and any combination thereof.
[0014] In some exemplary embodiments, the see-through head-worn display can be mechanically adjusted by the user along the at least one degree of freedom.
[0015] In some exemplary embodiments, the tracking system is selected from the group consisting of: an electro-optical system; an electromagnetic system; and a combination thereof.
[0016] In some exemplary embodiments, the HMD movement information, the display movement information, and the information indicating the HMD's position and/or orientation are given with respect to the frame of reference, wherein the frame of reference is selected from the group consisting of: platform coordinates; a fixed coordinate system established in space; an earth coordinate system; and any combination thereof.
[0017] According to another aspect of the present disclosed subject matter, there is provided a transfer-alignment system between coordinates of a platform, having an inertial-sensor, and a tracking system comprising a tracking-reference-unit (TRU) and a tracking module coupled to a head-mounted-device, the system comprising: at least one TRU inertial-sensor attached to the TRU; and a processor configured to receive, from the tracking system, information indicative of the helmet's orientation relative to the TRU, and inertial information from the inertial-sensor and the at least one TRU inertial-sensor, wherein the processor utilizes the angular rates to dynamically calculate a transfer-alignment between the TRU and the platform's coordinates, and wherein the tracking system is configured to update the head-mounted-device's orientation relative to the platform based on an updated TRU alignment and the information.
[0018] In some exemplary embodiments, the head-mounted-device comprises at least one element selected from the group consisting of: a display; at least one sensor; and any combination thereof, wherein the head-mounted-device is worn by a user operating the platform, and wherein the at least one element is conformal to coordinates selected from the group consisting of: the platform's coordinates; earth coordinates; and any combination thereof.
[0019] In some exemplary embodiments, the at least one TRU inertial-sensor is an inertial measurement unit selected from the group consisting of: at least one accelerometer; angular rate sensors; gyroscopes; and any combination thereof.
[0020] In some exemplary embodiments, the HMD comprises a see-through display displaying to the user an augmented-reality or virtual-reality view comprised of scenes external to the platform and images that are conformal to at least one external scene and to the platform.
[0021] In some exemplary embodiments, the images are selected from the group consisting of: graphical symbology; thermal images; text; video; synthetically generated images; and any combination thereof.
[0022] In some exemplary embodiments, the tracking system is selected from the group consisting of: an electro-optical system; an electromagnetic system; and a combination thereof.
[0023] In some exemplary embodiments, the system further comprises a plurality of inertial measurement units coupled to a plurality of modules of the platform, wherein the processor further acquires angular rates from the plurality of inertial measurement units for enhancing the transfer-alignment accuracy.
[0024] According to yet another aspect of the present disclosed subject matter, there is provided a tracking method using the system described above, the method comprising: receiving movements information from the at least one inertial HMD-sensor; receiving movements information from the at least one inertial display-sensor; determining a Line of Sight (LOS) of the HMD with reference to coordinates of the platform; determining a LOS of the display based on a relative orientation between the HMD and the display, derived from the movements information of the at least one inertial HMD-sensor and the at least one inertial display-sensor; and adjusting the images on the display so that the LOS of the HMD and the LOS of the display overlap, to yield a conformal display.
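By way of non-limiting illustration only, the method above may be sketched in Python (NumPy/SciPy); the object names (hmd_imu, display_imu, tracker, renderer) are hypothetical stand-ins for the claimed components, and the small-angle rate-difference update is merely one possible realization, not the claimed algorithm itself:

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def conformal_display_step(hmd_imu, display_imu, tracker, renderer, R_rel, dt):
        # Receive movements information from the inertial HMD- and display-sensors.
        w_hmd = np.asarray(hmd_imu.angular_rates())       # rad/s, HMD frame
        w_disp = np.asarray(display_imu.angular_rates())  # rad/s, display frame
        # Determine the LOS of the HMD with reference to platform coordinates.
        R_hmd = tracker.hmd_orientation()                 # scipy Rotation, platform frame
        # Update the HMD-to-display relative orientation from the rate difference
        # (first-order approximation, valid for small misalignments).
        R_rel = R_rel * R.from_rotvec((w_disp - w_hmd) * dt)
        # The LOS of the display follows from the relative orientation.
        R_display = R_hmd * R_rel
        # Adjust the images so that the two lines of sight overlap (conformal display).
        renderer.apply_correction(R_display.inv() * R_hmd)
        return R_rel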
[0025] In some exemplary embodiments, the method comprises: receiving angular rates from the at least one inertial TRU-sensor and the inertial-sensor; receiving information indicating the orientation and position of the HMD from the tracking system; and utilizing a transfer-alignment algorithm for dynamically calculating a relative orientation between the TRU and the platform's coordinates and continually compensating the information indicating the orientation and position of the HMD.

BRIEF DESCRIPTION OF THE DRAWINGS
[0026] Some embodiments of the disclosed subject matter are described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present disclosed subject matter only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the disclosed subject matter. In this regard, no attempt is made to show structural details of the disclosed subject matter in more detail than is necessary for a fundamental understanding of the disclosed subject matter, the description taken with the drawings making apparent to those skilled in the art how the several forms of the disclosed subject matter may be embodied in practice.
In the drawings:
[0027] Figure 1 shows a schematic block diagram of a conformal display system, in accordance with some exemplary embodiments of the disclosed subject matter;
[0028] Figure 2 shows a schematic block diagram of a transfer-alignment system, in accordance with some exemplary embodiments of the disclosed subject matter;
[0029] Figures 3A and 3B show screenshots of real-time augmentation of a generated image and a real-world view, in accordance with some exemplary embodiments of the disclosed subject matter;
[0030] Figure 4 shows a flowchart diagram of a conformal display method, in accordance with some exemplary embodiments of the disclosed subject matter; and
[0031] Figure 5 shows a flowchart diagram of a transfer-alignment method, in accordance with some exemplary embodiments of the disclosed subject matter.
DETAILED DESCRIPTION
[0032] In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the presently disclosed subject matter. However, it will be understood by those skilled in the art that the presently disclosed subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the presently disclosed subject matter.
[0033] In the drawings and descriptions set forth, identical reference numerals indicate those components that are common to different embodiments or configurations.
[0034] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "Processor", "Display-processor", "Computer of a Platform", "Platform's Computer" or the like include actions and/or processes of a computer that manipulate and/or transform data into other data, said data represented as physical quantities, e.g. electronic quantities, and/or said data representing physical objects. The terms "computer", "processor", "processing resource", "processing circuitry", and "controller" should be expansively construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, a personal desktop/laptop computer, a server, a computing system, a communication device, a smartphone, a tablet computer, a smart television, a processor (e.g. digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc.), a group of multiple physical machines sharing performance of various tasks, virtual servers co-residing on a single physical machine, any other electronic computing device, and/or any combination thereof.
[0035] The operations in accordance with the teachings herein may be performed by a computer specially constructed for the desired purposes or by a general-purpose computer specially configured for the desired purpose by a computer program stored in a non-transitory computer readable storage medium. The term "non-transitory" is used herein to exclude transitory, propagating signals, but to otherwise include any volatile or non-volatile computer memory technology suitable to the application.
[0036] As used herein, the phrases "for example," "such as", "for instance" and variants thereof describe non-limiting embodiments of the presently disclosed subject matter. Reference in the specification to "one case", "some cases", "other cases" or variants thereof means that a particular feature, structure or characteristic described in connection with the embodiment(s) is included in at least one embodiment of the presently disclosed subject matter. Thus, the appearance of the phrase "one case", "some cases", "other cases" or variants thereof does not necessarily refer to the same embodiment(s).
[0037] It is appreciated that, unless specifically stated otherwise, certain features of the presently disclosed subject matter, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the presently disclosed subject matter, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination.
[0038] In embodiments of the presently disclosed subject matter, fewer, more and/or different stages than those shown in Figures 4 and 5 may be executed. In embodiments of the presently disclosed subject matter, one or more stages illustrated in Figures 4 and 5 may be executed in a different order and/or one or more groups of stages may be executed simultaneously. Figures 1 and 2 illustrate a general schematic of the system architecture in accordance with an embodiment of the presently disclosed subject matter. Each module in Figures 1 and 2 can be made up of any combination of software, hardware and/or firmware that performs the functions as defined and explained herein. The modules in Figures 1 and 2 may be centralized in one location or dispersed over more than one location. In other embodiments of the presently disclosed subject matter, the system may comprise fewer, more, and/or different modules than those shown in Figures 1 and 2.
[0039] Any reference in the specification to a method should be applied mutatis mutandis to a system capable of executing the method and should be applied mutatis mutandis to a non-transitory computer readable medium that stores instructions that, once executed by a computer, result in the execution of the method.
[0040] Any reference in the specification to a system should be applied mutatis mutandis to a method that may be executed by the system and should be applied mutatis mutandis to a non-transitory computer readable medium that stores instructions that may be executed by the system.
[0041] Any reference in the specification to a non-transitory computer readable medium should be applied mutatis mutandis to a system capable of executing the instructions stored in the non-transitory computer readable medium and should be applied mutatis mutandis to a method that may be executed by a computer that reads the instructions stored in the non-transitory computer readable medium.
[0042] One technical problem dealt with by the disclosed subject matter is a potential misalignment between a helmet and a wearable display coupled to it.
[0043] It should be noted that such misalignments may result from using wearable displays, which comprise an adjustment mechanism that makes up for different interpupillary distances and other physiological variables. Additionally, the coupling (connection) between the display and the helmet may degrade under operational conditions, such as vibrations, high-G maneuvers, temperature changes, distortions caused by material aging, any combination thereof, or the like. All of the above can cause misalignments.
[0044] Wearable displays having a see-through capability, such as a Head Wearable Display (HWD) or a Helmet Mounted Display (HMD), have to be conformed to known reference coordinates in order to enable real-time display augmentation. Thus, the platform's tracking system that monitors the helmet, and the wearable display, must be aligned with one another.
[0045] It should be noted that the term "platform" may refer in the present disclosure to any type of vehicle, such as an airplane, a ship, or a boat, or to a person, that incorporates a tracking system.
[0046] One technical solution is to continually compute the relative geometric orientation between the wearable display and the helmet, which is used to compensate for any potential misalignment during operational use. In some exemplary embodiments, a transfer-alignment algorithm may be invoked to determine the relative geometric orientation, which is used by a display-processor, rendering computerized images to the display, to compensate for such misalignment by aligning its images with real-world scenes in order to attain a conformal display.
[0047] One technical effect of utilizing the disclosed subject matter is facilitating a real-time, continuous, automatic alignment process between the display and the tracker modules, thereby enhancing the HMD/HWD accuracy throughout the operation, in addition to alleviating lengthy and expensive manual calibration.
[0048] Another technical problem dealt with by the disclosed subject matter is a potential misalignment between the coordinates of the platform's sensors and the coordinates of the tracking system, and consequently of the display as well. Such misalignment may result in deflecting the computerized images derived from the sensors with respect to a real-world scene on the display.
[0049] It should be noted that the sensors and the tracking system are mechanically installed on the platform with limited accuracy due to unavoidable mechanical tolerances, temperature, and stress factors, or the like. Harmonizing these elements to reference coordinates is a cumbersome, lengthy, and expensive process, which may have to be repeated time and again.
[0050] It should also be noted that the platform comprises sensors, such as an Inertial Navigation System (INS), Forward-Looking Infrared (FLIR), Light Detection and Ranging (LiDAR), Radar, an Enhanced Vision System (EVS), and the like, that are aligned with the platform's coordinates and possibly a fixed coordinate system established in space. It will be appreciated that the platform and its sensors may be configured to be guided by an HMD/HWD-guided tracking system. Thus, one of the present disclosure's objectives is providing coordinate congruence between the tracking system and the platform.
[0051] Another technical solution is to dynamically determine a relative orientation between a TRU of the tracking system and the INS of the platform, and thereupon to compensate for any potential misalignment between coordinates during operational use.
[0052] In some exemplary embodiments, a transfer-alignment process may be utilized to correct misalignment between the TRU of the tracking system and the INS of the platform, thereby conforming the wearable see-through display to reference coordinates of the platform.
[0053] Another technical effect of utilizing the disclosed subject matter is facilitating a real-time, continuous, automatic alignment process between the platform and the tracker modules, thereby enhancing the HMD/HWD accuracy throughout the operation, in addition to alleviating lengthy and expensive manual calibration.
[0054] Referring now to Figure 1, showing a schematic block diagram of a conformal display system, in accordance with some exemplary embodiments of the disclosed subject matter. A conformal display system (System) 100 may be a computerized system configured to dynamically conform between images derived from sensors, which obey a LOS of the tracking system, and the LOS along which the user sees, on the display, scenery external to Platform 10.
[0055] In some exemplary embodiments, System 100 may be situated (but not limited to) in a Compartment 11 of Platform 10 that also accommodates a user wearing an HMD 130 that has a Display Unit 133. In some exemplary embodiments, System 100 may be comprised of a Processor 120, a TRU 140 attached to Compartment 11, a Tracking Module 131, at least one inertial HMD-sensor (Sensor) 132 attached to HMD 130, and an inertial display-sensor (Sensor) 134 attached to Display Unit 133.
[0056] In some exemplary embodiments, Processor 120, TRU 140, and Tracking Module 131 form together an apparatus configured to sense elevation, azimuth, and roll, i.e., orientation and position, of HMD 130 relative to Platform 10. Different technologies, such as inertial, optical, electromagnetic, sonic, any combination thereof, or the like, can be utilized for implementing such a tracking system.
[0057] For example, an optical-based tracking apparatus may utilize one or more electro-optical emitters connected to either Tracking Module 131 or TRU 140, and one or more electro-optical detectors connected to either TRU 140 or Tracking Module 131. The one or more electro-optical detectors constantly observe the one or more electro-optical emitters for sensing the orientation and position of HMD 130. Additionally, or alternatively, an electromagnetic-based tracking apparatus may be utilized. This type of apparatus employs coils connected to Tracking Module 131 of HMD 130, which is present within an alternating field generated in multiple axes within Compartment 11 and controlled by TRU 140. These coils are adapted to produce for Tracking Module 131 an alternating electrical-current signal, indicative of the movement of HMD 130 in Compartment 11.
[0058] In some exemplary embodiments, Processor 120 acquires from TRU 140 and Tracking Module 131 information, such as signals, electrical current, or the like, indicative of the orientation and position of HMD 130, in order to determine a LOS at which the user wearing HMD 130 is pointing (looking). Processor 120 simultaneously indicates the LOS to a computer (not shown) of Platform 10, which instructs the platform's sensors, such as INS, FLIR, LiDAR, Radar, or the like, to aim in line with the LOS.
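A minimal sketch of such a LOS determination follows; the rotation order and the choice of the forward axis are assumptions made for illustration only, not taken from the disclosure:

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def los_vector(azimuth_deg, elevation_deg, roll_deg):
        # Orientation of HMD 130 relative to Platform 10, as sensed by TRU 140
        # and Tracking Module 131: yaw (azimuth), pitch (elevation), roll.
        rot = R.from_euler("ZYX", [azimuth_deg, elevation_deg, roll_deg], degrees=True)
        # The LOS is the helmet's forward axis expressed in platform coordinates;
        # the platform's computer can slave FLIR, LiDAR, Radar, etc. to it.
        return rot.apply(np.array([1.0, 0.0, 0.0]))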
[0059] Display-processor 110 is configured to generate and render images to a display device, such as an Eyepiece 135 of Display Unit 133, a visor (not shown), any combination thereof, or the like, based on information obtained from the sensors and/or computerized information. In some exemplary embodiments, the images can be synthetically generated digital images, textual information, graphical symbology, processed and/or unprocessed video originating from electro-optical sensors, any combination thereof, or the like.
[0060] In some exemplary embodiments, Eyepiece 135, or the like, may be a see-through transparent screen display that allows the user to view on the screen a real-time augmentation of the rendered images while still being able to see the real-world scenery through it.
[0061] In some exemplary embodiments, HMD 130 may comprise an attached video camera (not shown) adapted to render real-world scenery, i.e., scenes external to Platform 10, to the screen of Eyepiece 135, thereby providing a real-time virtual-reality display that combines computerized images with the real-world scenery provided by the video camera.
[0062] It will be appreciated that misalignment between HMD 130 and Display Unit 133, and thereby Eyepiece 135 coupled to it, can result in a mis-congruence between the actual LOS that the user sees and the LOS calculated by the tracking system, which the platform's sensors adhere to.
[0063] It should be noted that the misalignment may result from Display Unit 133 being connected by a support-structure that has an adjustment mechanism used to make up for different interpupillary distances and other physiological variables. Additionally, the support-structure connecting Display Unit 133 and HMD 130 may degrade under operational conditions, such as vibrations, high-G maneuvers, temperature changes, distortions caused by material aging, any combination thereof, or the like.
[0064] Referring now to Figure 3A, showing a screenshot of real-time augmentation of generated (computerized) images and a real-world view, in accordance with some exemplary embodiments of the disclosed subject matter. The screenshot depicts an urban area 320 having powerlines 321, i.e., a real-world view, as viewed by the user through Eyepiece 135 of Figure 1. A Symbol 310 shows the level of Platform 10 with respect to the horizon, and a Symbol 311 shows a flight safety line. Both Symbols 310 and 311 are synthetic images generated by the platform computer based on the platform's sensors, which are superimposed on the real view of urban area 320, i.e., augmented reality.
[0065] As appreciated from the screenshot of Figure 3A, there is a lack of correspondence between the terrain and the obstacles (building and powerline 321), as seen on the screen of Eyepiece 135, and the calculated Symbol 311 that shows the flight safety line. It will be understood that the lack of correspondence between the real-world view and the generated images results from the mis-congruence between the actual LOS that the user sees and the LOS calculated by the tracking system.
[0066] Referring back to Figure 1. In some exemplary embodiments, Sensor 132, attached to HMD 130, and Sensor 134, attached to Display Unit 133, may each be an Inertial Measurement Unit (IMU) configured to measure and report to Processor 120 acceleration, orientation, angular rates, and other gravitational forces of the item to which it is attached. The IMU may be based on technologies such as a Fiber Optic Gyroscope (FOG), a Ring Laser Gyroscope (RLG), Micro Electro-Mechanical Systems (MEMS), any combination thereof, or the like.
[0067] In some exemplary embodiments, at least one Sensor 132 embedded into HMD 130 is configured to produce information, e.g., an angular-rate signal, to Processor 120, which is indicative of HMD 130's movements over time, and at least one Sensor 134 embedded into Display Unit 133 is configured to produce information, e.g., an angular-rate signal, to Processor 120, which is indicative of Display Unit 133's movements over time.
[0068] In some exemplary embodiments of the disclosed subject matter, misalignment between Display Unit 133 and HMD 130 may be determined by continuously analyzing the angular rates of Sensors 132 and 134, in order to determine, using Processor 120, their relative movements and thereby their relative orientation.
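This continuous analysis may, for example, take the form of the simplified integration sketched below; sensor biases and noise, which the transfer-alignment algorithm of paragraph [0077] addresses, are deliberately ignored here:

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def relative_orientation(rates_hmd, rates_display, dt):
        # rates_hmd, rates_display: (N, 3) arrays of angular-rate samples (rad/s)
        # from Sensors 132 and 134, assumed time-synchronized.
        R_rel = R.identity()
        for w_h, w_d in zip(rates_hmd, rates_display):
            # The rate difference approximates the relative rotation rate when
            # the misalignment between the two frames is small.
            R_rel = R_rel * R.from_rotvec((np.asarray(w_d) - np.asarray(w_h)) * dt)
        return R_rel  # estimated orientation of Display Unit 133 w.r.t. HMD 130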
[0069] Additionally, or alternatively, System 100 may further comprise at least one inertial platform-sensor (Sensor) 112 attached to Platform 10. Sensor 112 may be a sensor, similar to Sensors 132 and 134, configured to provide Platform 10 movements information indicative of the platform's movements over time with respect to a fixed coordinate system established in space. In some exemplary embodiments, Sensor 112 produces and provides an angular-rate signal, based on Platform 10 movements information, to Processor 120. In some exemplary embodiments, Sensor 112 may be used by Processor 120 to determine an orientation reference while continuously analyzing the angular rates of Sensors 132 and 134 in order to determine their relative movements and thereby their relative orientation reference.
[0070] In some exemplary embodiments, System 100 may utilize Processor 120 to perform methods such as depicted in Figure 4. Processor 120 may also be utilized to perform computations required by System 100 or any of its subcomponents. It should be noted that Processor 120 may be deployed in any location of Platform 10 and may comprise a collection of assisting processing devices and services, such as Display-processor 110, one or more computers of Platform 10, any combination thereof, or the like.
[0071] According to an aspect of the present disclosed subject matter, Processor 120 is configured to obtain and continuously analyze HMD 130's movement information, Display Unit 133's movement information, and information indicating the position and orientation of HMD 130 with respect to Platform 10. Processor 120 is also configured to determine the relative orientation between HMD 130 and Display Unit 133 and cause the display-processor to adjust the images to compensate for the relative orientation changes, and to conform with the information indicating HMD 130's position and orientation with respect to Platform 10.
[0072] In some exemplary embodiments of the disclosed subject matter, Processor 120 may comprise an I/O module (not shown). The I/O module may be utilized as an interface to transmit and/or receive information and instructions between Processor 120 and components of System 100.
[0073] In some exemplary embodiments, Processor 120 may comprise a memory module (not shown). The memory module may be comprised of volatile and/or non-volatile memories, based on technologies such as semiconductor, magnetic, optical, flash, a combination thereof, or the like. The memory module (not shown) may retain program code operative to cause Processor 120 to perform acts associated with any of the steps shown in Figure 4. In some exemplary embodiments, the memory module (not shown) of Processor 120 may be used to retain information obtained from Sensors 132 and 134 as well as information indicating the position and orientation of HMD 130 with respect to Platform 10.
[0074] In some exemplary embodiments, Processor 120 dynamically calculates a relative orientation between Display Unit 133 and HMD 130 in order to detect misalignments of Display Unit 133 relative to HMD 130, and consequently misalignments with respect to Platform 10's coordinates. The dynamic calculation may also involve causing Display-processor 110 to dynamically adjust its rendered images in order to compensate for the misalignment based on the relative orientation between Display Unit 133 and HMD 130. Such compensation results in a conformal display of the rendered images (e.g., Symbols 310 and 311' of Figure 3B) with the user's scene (e.g., urban area 320 of Figure 3B).
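As a hedged illustration of the image-side compensation, a rendered symbol may be re-projected through the estimated misalignment; the pinhole projection model, focal length, and image center below are assumptions for the sketch only:

    import numpy as np

    def compensate_symbol(pixel_xy, R_misalign, focal_px=1500.0, center=(960, 540)):
        # R_misalign: estimated Display Unit 133 misalignment (scipy Rotation).
        # Back-project the symbol's pixel to a viewing direction in the display frame.
        d = np.array([(pixel_xy[0] - center[0]) / focal_px,
                      (pixel_xy[1] - center[1]) / focal_px,
                      1.0])
        # Rotate the direction by the inverse of the estimated misalignment.
        d = R_misalign.inv().apply(d)
        # Re-project to pixel coordinates; the symbol then stays conformal with
        # the real-world scene seen through Eyepiece 135.
        return (center[0] + focal_px * d[0] / d[2],
                center[1] + focal_px * d[1] / d[2])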
[0075] It should be noted that the misalignment described above can be understood as a shift of the viewpoint, and therefore the LOS, of the user with respect to the viewpoint, and therefore the LOS, of the tracking system of HMD 130, i.e., the LOS of Platform 10 and its sensors.
[0076] In some exemplary embodiments, Processor 120 may utilize a transfer-alignment algorithm to compute the relative orientation between Display Unit 133 and HMD 130, i.e., the misalignment of Display Unit 133. The transfer-alignment algorithm resolves an alignment matrix based on real-time measurements of angular rates produced by Sensors 132 and 134.
[0077] In some exemplary embodiments, the transfer-alignment algorithm comprises: analyzing the angular rates; synchronizing the data streams of the angular rates; determining the relative biases of Sensors 132 and 134; extracting the relative orientation (i.e., alignment) of Sensors 132 and 134; and any combination thereof, or the like.
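One possible, non-limiting reduction of these steps to code is sketched below: the relative bias is approximated by de-meaning the synchronized streams, and the alignment is solved as a Wahba/Kabsch-type least-squares fit; this is an illustrative realization, not the claimed algorithm itself:

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def transfer_alignment(rates_132, rates_134):
        # rates_132, rates_134: time-synchronized (N, 3) angular-rate samples
        # (rad/s) from Sensors 132 and 134.
        a = rates_132 - rates_132.mean(axis=0)   # first-order relative-bias removal
        b = rates_134 - rates_134.mean(axis=0)
        # Rotation minimizing sum ||a_i - R b_i||^2 (Kabsch solution).
        alignment, rssd = R.align_vectors(a, b)
        # Residual mean difference after alignment approximates the relative bias.
        bias = rates_134.mean(axis=0) - alignment.inv().apply(rates_132.mean(axis=0))
        return alignment, bias, rssd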
[0078] In some exemplary embodiments, the transfer-alignment algorithm runs continuously to dynamically compensate and improve the alignment functionality. Additionally, or alternatively, the algorithm may be utilized for monitoring and testing the alignment integrity and functionality, and possibly alerting the operator in case of failures, as defined for the intended use.
[0079] In some exemplary embodiments, the extracted orientation may be utilized by the computer of Platform 10, the tracking system, and Display-processor 110 to facilitate Processor 120 in performing acts associated with any of the steps shown in Figures 4 and 5.
[0080] Referring now to Figure 3B, showing a screenshot of real-time augmentation of a generated image and a real-world view, in accordance with some exemplary embodiments of the disclosed subject matter. The screenshot depicts an urban area 320 having powerlines 321, i.e., a real-world view, as viewed by the user through Eyepiece 135 of Figure 1. Symbol 310 shows the level of Platform 10 with respect to the horizon, and Symbol 311' shows a flight safety line. Both Symbols 310 and 311' are synthetic images generated by the platform computer based on the platform's sensors, which are superimposed on the real view of urban area 320, i.e., augmented reality.
[0081] As appreciated from the screenshot of Figure 3B, the terrain and the obstacles (building and powerline 321), as seen on the screen of Eyepiece 135, correspond to the calculated Symbol 311' that shows the flight safety line. It will be understood that the correspondence between the real-world view and the generated images results from a dynamic alignment between the actual LOS that the user sees and the LOS calculated by the tracking system.
[0082] Referring now to Figure 2, showing a schematic block diagram of a transfer-alignment system, in accordance with some exemplary embodiments of the disclosed subject matter. A transfer-alignment system (System) 200 may be a computerized system configured to conform between the coordinates of the tracking system and the coordinates of a platform, and to perform methods such as depicted in Figure 5.
[0083] In some exemplary embodiments, System 200 may be situated in Compartment 11 and/or Platform 10, which also accommodates a user wearing HMD 130 comprising Display Unit 133 and Tracking Module 131. In some exemplary embodiments, System 200 may be comprised of a Processor 120, a TRU 140 attached to Compartment 11 of Platform 10, at least one inertial TRU-sensor (Sensor) 241 attached to TRU 140, and an INS 211 attached to Platform 10.
[0084] In some exemplary embodiments, System 200 of the present disclosure may be utilized for transfer-alignment between the coordinates of Platform 10, through INS 211 connected to Platform 10, and the tracking system, through Sensor 241 embedded into TRU 140 and connected to Compartment 11 of Platform 10.
[0085] It will be recalled that the tracking system is used for tracking the movements, position, and orientation of HMD 130, and thereby the LOS of the user with respect to the platform. Thus, one of the objectives of the present disclosure is maintaining congruence between the coordinates of TRU 140 and Platform 10.
[0086] In some exemplary embodiments, Processor 120, TRU 140, and Tracking Module 131 form together a tracking system configured to sense elevation, azimuth, and roll, i.e., orientation and position, of HMD 130 relative to Platform 10. It will be appreciated that the tracking system of Figure 2 may be identical or similar to the tracking system described in Figure 1, and may therefore perform the same activities as depicted in Figure 1 above.
[0087] In some exemplary embodiments, Processor 120 acquires from TRU 140 and Tracking Module 131 information, such as signals, electrical current, or the like, indicative of the orientation and position of HMD 130, in order to determine a LOS at which the user wearing HMD 130 is pointing (looking). Processor 120 simultaneously indicates the LOS to a computer (not shown) of Platform 10, which instructs the platform's sensors, such as INS, FLIR, LiDAR, Radar, or the like, to aim in line with the LOS.
[0088] Display-processor 110 is configured to generate and render images to a display device, such as an Eyepiece 135 of Display Unit 133, a visor (not shown), any combination thereof, or the like, based on information obtained from the sensors and/or computerized information. In some exemplary embodiments, the images can be synthetically generated digital images, textual information, graphical symbology, processed and/or unprocessed video originating from electro-optical sensors, any combination thereof, or the like.
[0089] It will be appreciated that misalignment between the coordinates of TRU 140 and INS 211, representing Platform 10 (i.e., misalignment between TRU 140 and Platform 10), can result in a mis-congruence between the LOS of the tracking system and the LOS of Platform 10, and thereby the LOS that the platform's sensors adhere to.
[0090] It should be noted that the alignment between the coordinates of the mechanically assembled TRU 140 and INS 211 may degrade under operational conditions, such as vibrations, high-G maneuvers, temperature changes, distortions caused by material aging, any combination thereof, or the like. Additionally, mechanically installed components have inherently limited accuracy due to unavoidable mechanical tolerances. Moreover, a maintenance process for harmonizing such components to reference coordinates is cumbersome, lengthy, and expensive, adds human-error factors, and may have to be repeated time and again.
[0091] In some exemplary embodiments, Sensor 241, attached to TRU 140, may be an Inertial Measurement Unit (IMU) similar to Sensors 132 and 134 of Figure 1, and may be configured to perform activities similar to those of Sensors 132 and 134 of Figure 1.
[0092] In some exemplary embodiments, INS 211 may be a self-contained inertial navigation system in which measurements provided by accelerometers and gyroscopes are used to track the position and orientation of Platform 10 relative to a known starting point, orientation, and velocity. INS 211 contains three orthogonal rate-gyroscopes and three orthogonal accelerometers, measuring angular velocity and linear acceleration respectively, which may be represented by information in the form of one or more angular-rate signals.
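A schematic strapdown-integration step of the kind such a system performs internally is sketched below; Earth-rate, transport-rate, and gravity-model corrections applied by a real INS are omitted for brevity:

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def ins_step(orientation, velocity, position, gyro, accel, dt,
                 gravity=np.array([0.0, 0.0, -9.81])):
        # Attitude: integrate the three orthogonal rate-gyroscope outputs (rad/s).
        orientation = orientation * R.from_rotvec(gyro * dt)
        # Velocity: rotate the specific force measured by the three orthogonal
        # accelerometers into the navigation frame, restore gravity, and integrate.
        velocity = velocity + (orientation.apply(accel) + gravity) * dt
        # Position: dead reckoning from the known starting point.
        position = position + velocity * dt
        return orientation, velocity, position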
[0093] In some exemplary embodiments, at least one Sensor 241 embedded into TRU 140 is configured to provide information, in the form of an angular-rate signal, to Processor 120, which is indicative of TRU 140's movements over time, and at least one INS 211 connected to Platform 10 is configured to provide information, in the form of an angular-rate signal, to Processor 120, which is indicative of Platform 10's movements over time.
[0094] In some exemplary embodiments of the disclosed subject matter, misalignment of TRU 140 with respect to Platform 10 may be determined by continuously analyzing the angular rates of Sensor 241 and INS 211, in order to determine, using Processor 120, their relative movements and thereby their relative orientation.
[0095] In some exemplary embodiments, System 200 may utilize Processor 120 to perform methods such as depicted in Figure 5. Processor 120 may also be utilized to perform computations required by System 200 or any of its subcomponents. It should be noted that Processor 120 may be deployed in any location of Platform 10 and may comprise a collection of assisting processing devices and services, such as Display-processor 110, one or more computers of Platform 10, any combination thereof, or the like.
[0096] According to another aspect of the present disclosure, Processor 120 is configured to receive information from the tracking system indicating HMD 130's orientation relative to TRU 140, in addition to the angular rates of INS 211. Processor 120 utilizes the information and the angular rates to dynamically calculate a transfer-alignment between TRU 140 and Platform 10's coordinates, to be also utilized by the tracking system for alignment compensation between HMD 130 and the platform.
[0097] In some exemplary embodiments of the disclosed subject matter, Processor 120 may comprise an I/O module (not shown). The I/O module may be utilized as an interface to transmit and/or receive information and instructions between Processor 120 and components of System 200.
[0098] In some exemplary embodiments, Processor 120 may comprise a memory module (not shown). The memory module may comprise volatile and/or non-volatile memories, based on technologies such as semiconductor, magnetic, optical, flash, a combination thereof, or the like. The memory module may retain program code operative to cause Processor 120 to perform acts associated with any of the steps shown in Figures 4 and 5. In some exemplary embodiments, the memory module of Processor 120 may be used to retain information obtained from Sensor 241 and from INS 211, and thereby information indicating the position and orientation of HMD 130 with respect to Platform 10.
[0099] In some exemplary embodiments, Processor 120 dynamically calculates a relative orientation between TRU 140 and INS 211 in order to detect misalignments of TRU 140 relative to Platform 10, and consequently misalignments of HMD 130 with respect to the platform’s sensors. The dynamic calculation may also involve causing Display-processor 110 to dynamically adjust its rendered images in order to compensate for the misalignment, based on the relative orientation between TRU 140 and INS 211. Such compensation results in a conformal display, in which rendered images (e.g., Symbols 310 and 311’ of Figure 3B) conform to the user’s scene (e.g., Urban area 320 of Figure 3B).
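A minimal sketch of such compensation, assuming the misalignment is available as a rotation matrix from Platform 10 (INS 211) axes to tracker (TRU 140) axes; the function name and interface are hypothetical, not taken from the disclosure:

```python
import numpy as np

def compensate_direction(dir_platform, R_misalign):
    """Re-express a symbol's unit line-of-sight vector, given in Platform 10
    coordinates, in the tracker's (TRU 140) coordinates, so that the rendered
    symbol stays conformal with the outside scene despite TRU misalignment.

    dir_platform -- unit 3-vector in platform coordinates
    R_misalign   -- 3x3 rotation from platform axes to tracker axes,
                    as estimated by the transfer-alignment algorithm
    """
    return R_misalign @ np.asarray(dir_platform)
```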
[0100] It should be noted that the misalignment described above can be understood as a shift of the viewpoint, and therefore of the LOS, of the tracking system of HMD 130 with respect to the viewpoint, and therefore the LOS and sensors (such as the INS, FLIR, LiDAR, radar, or the like), of Platform 10.
[0101] In some exemplary embodiments, Processor 120 may utilize a transfer-alignment algorithm to compute the relative orientation between TRU 140 and INS 211, i.e., the misalignment of TRU 140. The transfer alignment algorithm resolves an alignment matrix based on real-time measurements of angular rates produced by Sensor 241 and INS 211.
[0102] In some exemplary embodiments, the transfer alignment algorithm comprises: analyzing the angular rates; synchronizing the data streams of the angular rates; determining the relative bias of Sensor 241 and INS 211; extracting the relative orientation (i.e., alignment) of Sensor 241 and INS 211; any combination thereof; or the like.
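By way of non-limiting illustration only, the following Python sketch shows one possible realization of the steps listed above. The function name estimate_alignment, the per-axis-mean bias removal, and the closed-form SVD (Kabsch) solution of Wahba's problem are assumptions of this sketch, not taken from the disclosure; a fielded estimator (e.g., a Kalman filter) would typically estimate bias and rotation jointly and honor the validity flags and control parameters tabulated further below.

```python
import numpy as np

def estimate_alignment(w_tru, t_tru, w_ins, t_ins):
    """Sketch: estimate the rotation R such that w_tru ~= R @ w_ins.

    w_tru, w_ins -- (N, 3) and (M, 3) angular-rate streams [deg/sec]
                    (e.g., Sensor 241 and INS 211)
    t_tru, t_ins -- their time tags [sec]
    """
    # Step 1 -- synchronize: resample the INS stream onto the TRU time tags.
    w_ins_sync = np.column_stack(
        [np.interp(t_tru, t_ins, w_ins[:, k]) for k in range(3)])

    # Step 2 -- relative bias: a constant per-axis bias is removed exactly by
    # mean-centering each stream before the rotation fit.
    a = w_tru - w_tru.mean(axis=0)            # TRU rates, bias-reduced
    b = w_ins_sync - w_ins_sync.mean(axis=0)  # INS rates, bias-reduced

    # Step 3 -- relative orientation: least-squares rotation (Wahba's problem)
    # solved in closed form via SVD (the Kabsch algorithm).
    H = b.T @ a                               # 3x3 cross-covariance, INS -> TRU
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
```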
[0103] In some exemplary embodiments, the transfer alignment algorithm runs continuously to dynamically compensate for misalignment and improve the alignment functionality. Additionally, or alternatively, the algorithm may be utilized to monitor and test the alignment integrity and functionality, and possibly to alert the operator in case of failures, as defined for the intended use.
[0104] In some exemplary embodiments, the extracted orientation may be utilized by the computer of Platform 10, the tracking system, and Display-processor 110 to facilitate Processor 120 in performing acts associated with any of the steps shown in Figure 5.
[0105] Referring now to Figure 4, showing a flowchart diagram of a conformal display method, in accordance with some exemplary embodiments of the disclosed subject matter. The conformal display method may be performed by System 100 of Figure 1. In some exemplary embodiments, the conformal display method continuously analyzes the relative orientation between HMD 130 and Display Unit 133 in order to dynamically sustain a conformal display. Additionally, the method is operative to cause Display-processor 110 to adjust rendered images, which are aligned to the tracking system’s LOS, based on the relative orientation.
[0106] In step 401, movement information of HMD 130 may be received from Sensor 132. In some exemplary embodiments, an angular-rates signal generated by Sensor 132 continuously provides Processor 120 with information indicative of HMD 130 movements.
[0107] In step 402, movement information of Display Unit 133 may be received from Sensor 134. In some exemplary embodiments, an angular-rates signal generated by Sensor 134 continuously provides Processor 120 with information indicative of Display Unit 133 movements.
[0108] In step 403, a LOS of HMD 130 may be determined. In some exemplary embodiments, Processor 120 utilizes information generated by the tracking system to determine the LOS at which HMD 130 is aiming. Through the LOS, the coordinates of HMD 130 and of Platform 10, including its sensors, are aligned with one another. Additionally, or alternatively, Platform 10 may also be aligned with a coordinate system established in space.
[0109] In step 404, a relative orientation between HMD 130 and Display Unit 133 may be calculated. In some exemplary embodiments, Processor 120 may execute a transfer alignment algorithm (described in detail further below) in order to dynamically calculate the relative orientation. The relative orientation may be derived, by the transfer alignment algorithm, from movement information provided by Sensors 132 and 134.
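For illustration only, the estimator sketched after paragraph [0102] could equally be applied to the Sensor 132/Sensor 134 pair; the synthetic data and variable names below are hypothetical.

```python
# Hypothetical demonstration: reuse estimate_alignment() from the earlier
# sketch, with Sensor 134 (display) in place of Sensor 241 and Sensor 132
# (HMD) in place of INS 211.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 10.0, 0.01)            # 100 Hz time tags [sec]
w_hmd = rng.normal(size=(t.size, 3))      # stand-in Sensor 132 rates
true_R = np.array([[0.0, -1.0, 0.0],      # 90-degree yaw misalignment
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
w_display = w_hmd @ true_R.T              # stand-in Sensor 134 rates

R_display_from_hmd = estimate_alignment(w_display, t, w_hmd, t)
assert np.allclose(R_display_from_hmd, true_R, atol=1e-6)
```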
[0110] In step 405, a LOS of Display Unit 133 may be determined. In some exemplary embodiments, the LOS of Display Unit 133 can be derived from the relative orientation calculation of step 404. Using the transfer alignment algorithm yields the position and orientation of Display Unit 133 with respect to the known coordinates of HMD 130, and thereby of Platform 10, which thereafter allows extracting the LOS of Display Unit 133.
[0111] In step 406, the images on the display may be adjusted. In some exemplary embodiments, Processor 120 (by way of example and not limitation) instructs Display-processor 110 to adjust its rendered images on Eyepiece 135 in order to bring the images into congruence with the scenes viewed by the user on Eyepiece 135. That is to say, the images are aligned such that the LOS of HMD 130 (i.e., of Platform 10) and the LOS of Display Unit 133 overlap, yielding a conformal display.
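By way of illustration only, a symbol whose direction is known in platform coordinates might be re-projected onto Eyepiece 135 as in the following sketch; the pinhole intrinsics f, cx, cy and all names are assumptions of this sketch, not taken from the disclosure.

```python
import numpy as np

def project_symbol(dir_world, R_display_from_world, f, cx, cy):
    """Project a unit direction vector onto display pixel coordinates.

    R_display_from_world -- rotation built from the step-404 relative
                            orientation, re-expressing the direction in
                            display (Eyepiece 135) axes
    f, cx, cy            -- hypothetical pinhole intrinsics of the display
    Returns (u, v) pixel coordinates, or None if the symbol is behind
    the viewer.
    """
    v = R_display_from_world @ np.asarray(dir_world)
    if v[2] <= 0.0:                        # symbol is behind the eyepiece
        return None
    return (cx + f * v[0] / v[2], cy + f * v[1] / v[2])
```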
[0112] It is to be still further noted that, with reference to Figure 4, some of the blocks can be integrated into a consolidated block or can be broken down into several blocks, and/or other blocks may be added. Furthermore, in some cases, the blocks can be performed in a different order than described herein. It is to be further noted that some of the blocks are optional. It should also be noted that, whilst the flow diagram is described with reference to the system elements that realize the blocks, this is by no means binding, and the blocks can be performed by elements other than those described herein.
[0113] Referring now to Figure 5, showing a flowchart diagram of a transfer-alignment method, in accordance with some exemplary embodiments of the disclosed subject matter. In some exemplary embodiments, the transfer-alignment method may be performed by System 200, of Figure 2, to continuously analyze the relative orientation between TRU 140 and INS 211, which represents the coordinates of Platform 10. Additionally, or alternatively, the transfer-alignment method is configured to dynamically compensate for misalignments of HMD 130 tracking with respect to Platform 10.
[0114] In step 501, angular rates from Sensor 241 and INS 211 may be received. In some exemplary embodiments, an angular-rates signal generated by Sensor 241 and an angular-rates signal generated by INS 211 continuously provide Processor 120 with information indicating TRU 140 (i.e., tracking system) movements and INS 211 (i.e., Platform 10) movements.
[0115] In step 502, information indicating the orientation and position of HMD 130 may be received from the tracking system.
[0116] In step 503, a relative orientation between TRU 140 and INS 211 may be calculated. In some exemplary embodiments, Processor 120 may execute a transfer alignment algorithm (described in detail further below) in order to dynamically calculate the relative orientation. The relative orientation may be derived, by the transfer alignment algorithm, from movement information provided by Sensor 241 and INS 211.
[0117] In step 504, misaligned coordinates of HMD 130 may be corrected. In some exemplary embodiments, information provided by the tracking system that indicates the orientation and position of HMD 130 may be continually compensated in order to match the tracking system to Platform 10 coordinates.
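One possible form of this correction, sketched under the assumption that both the tracker output and the transfer-alignment estimate are available as rotation matrices (all names hypothetical):

```python
import numpy as np

def correct_hmd_orientation(R_hmd_in_tru, R_tru_in_platform):
    """Chain the transfer-alignment estimate of step 503 with the tracker's
    reported HMD orientation, so that the corrected result is expressed in
    Platform 10 coordinates rather than in the (misaligned) TRU 140 axes."""
    return np.asarray(R_tru_in_platform) @ np.asarray(R_hmd_in_tru)
```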
[0118] It is to be still further noted that, with reference to Figure 5, some of the blocks can be integrated into a consolidated block or can be broken down into several blocks, and/or other blocks may be added. Furthermore, in some cases, the blocks can be performed in a different order than described herein. It is to be further noted that some of the blocks are optional. It should also be noted that, whilst the flow diagram is described with reference to the system elements that realize the blocks, this is by no means binding, and the blocks can be performed by elements other than those described herein.
Description of an exemplary transfer alignment algorithm
[0119] Exemplary parameters of the exemplary transfer alignment algorithm are listed in the following tables.

Input variables:

Variable  | Format     | Description                                  | Units
wgyro1    | float[3xN] | Angular velocity of gyro1 (e.g., Sensor 132) | deg/sec
wgyro2    | float[3xN] | Angular velocity of gyro2 (e.g., Sensor 134) | deg/sec
tgyro1    | float[N]   | Time tags of gyro1                           | sec
tgyro2    | float[N]   | Time tags of gyro2                           | sec
Isvalid1  | bool[N]    | Validity of measurement of gyro1             | none
Isvalid2  | bool[N]    | Validity of measurement of gyro2             | none

Control parameters:

Parameter                             | Format | Description                                                    | Units
Spacing between time points           | float  | Spacing between the common time points                         | sec
Threshold for interpolation           | float  | Threshold beyond which no interpolation is made                | sec
r̂_max                                 | float  | Limit for an acceptable residual error                         | deg
r̂_delay_max                           | float  | Limit for an acceptable residual error for time delay          | sec
Harmonization uncertainty upper limit | float  | Limit for an acceptable uncertainty for harmonization          | deg
Interpolation mode                    | enum   | Chooses the common times at which the interpolators are evaluated | none
fcut                                  | float  | Filter cutoff frequency                                        | none
fs                                    | float  | Length of averaging window                                     | none
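For illustration only, the control parameters above could be grouped into a configuration record such as the following Python sketch; all field names are hypothetical renderings of the table rows, not identifiers from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class TransferAlignmentConfig:
    """Control parameters of the exemplary algorithm (illustrative names)."""
    time_spacing_s: float           # spacing between time points [sec]
    interp_threshold_s: float       # threshold beyond which no interpolation is made [sec]
    max_residual_deg: float         # limit for an acceptable residual error [deg]
    max_delay_residual_s: float     # limit for an acceptable residual error for time delay [sec]
    harmonization_limit_deg: float  # upper limit for acceptable harmonization uncertainty [deg]
    interpolation_mode: str         # how the common evaluation times of the interpolators are chosen
    f_cut: float                    # filter cutoff frequency
    averaging_window: float         # length of averaging window
```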

Claims (11)

CLAIMS

What is claimed is:
1. A transfer-alignment system for a Head-Mounted Display (HMD), and a display coupled to the HMD, wherein the display is (a) adjustable by a user along at least one degree of freedom, and (b) adapted to display images rendered by a display-processor, and wherein the HMD is monitored by a tracking system configured to provide information indicating position and/or orientation of the HMD with respect to a frame of reference, the system comprising:
at least one first inertial sensor attached to the HMD and configured to acquire the HMD’s inertial readings information indicative of movements of the HMD over time;
at least one second inertial sensor attached to the display and configured to acquire the display’s inertial readings information indicative of movements of the display over time; and
a processor configured to:
obtain the HMD’s inertial readings information, the display’s inertial readings information, and the information indicating the HMD’s position and/or orientation with respect to the frame of reference;
continuously analyze movement information of the HMD and movement information of the display to determine relative orientation between the HMD and the display; and
cause the display-processor to adjust the images to conform with respect to the frame of reference based on the information indicating the position and/or orientation and the relative movements of the HMD,
wherein the frame of reference is selected from the group consisting of platform coordinates, a fixed coordinate system established in space, an earth coordinate system, and any combination thereof.
2. The system of Claim 1, whereby adjusting the images to conform with respect to the frame of reference provides accuracy enhancement of a line-of-sight designation.
3. The system of Claim 1, wherein the HMD is worn by the user, and wherein the display is selected from the group consisting of: a see-through display; an opaque display; and any combination thereof.
4. The system of Claim 3, wherein the user is operating a platform with respect to the frame of reference.
5. The system of Claim 4, further comprising at least one platform-sensor attached to the platform and configured to acquire platform-information indicative of the platform's movements over time with respect to a fixed coordinate system established in space, wherein the processor causes the display-processor to adjust the images also to compensate for the platform's movements over time.
6. The system of Claim 5, wherein said at least one first inertial sensor and said at least one second inertial sensor and said at least one platform-sensor are inertial measurement units.
7. The system of Claim 4, wherein the display displays to the user an augmented reality view comprised of scenes external to the platform and wherein the images are conformal to the external scenes.
8. The system of Claim 4, wherein the display displays to the user a virtual-reality view comprised of scenes external to the platform rendered by a video camera mounted on the HMD, and wherein the images are conformal to the external scenes.
9. The system of Claim 1, wherein the images are selected from the group consisting of: graphical symbology; thermal images; text; video; synthetically generated images; and any combination thereof.
10. The system of Claim 1, wherein the tracking system is selected from the group consisting of an electro-optical system; an electromagnetic system; and a combination thereof.
11. The system of Claim 1, wherein the movement information of the HMD, the movement information of the display, and the information indicating the HMD's position and/or orientation with respect to the frame of reference.
IL284600A 2021-07-04 2021-07-04 A connformal display system and a method thereof IL284600B2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
IL284600A IL284600B2 (en) 2021-07-04 2021-07-04 A connformal display system and a method thereof
EP22837159.7A EP4367547A1 (en) 2021-07-04 2022-05-23 A conformal display system and a method thereof
PCT/IL2022/050535 WO2023281488A1 (en) 2021-07-04 2022-05-23 A conformal display system and a method thereof
US18/402,769 US20240160278A1 (en) 2021-07-04 2024-01-03 Conformal display system and a method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
IL284600A IL284600B2 (en) 2021-07-04 2021-07-04 A connformal display system and a method thereof

Publications (3)

Publication Number Publication Date
IL284600A true IL284600A (en) 2023-02-01
IL284600B1 IL284600B1 (en) 2023-06-01
IL284600B2 IL284600B2 (en) 2023-10-01

Family

ID=86691811

Family Applications (1)

Application Number Title Priority Date Filing Date
IL284600A IL284600B2 (en) 2021-07-04 2021-07-04 A connformal display system and a method thereof

Country Status (1)

Country Link
IL (1) IL284600B2 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170045736A1 (en) * 2015-08-12 2017-02-16 Seiko Epson Corporation Image display device, computer program, and image display system
US20170352190A1 (en) * 2016-06-02 2017-12-07 Thales Visionix, Inc. Miniature vision-inertial navigation system with extended dynamic range
US20180143682A1 (en) * 2016-11-22 2018-05-24 Honeywell International Inc. Nte display systems and methods with optical trackers
US20190197995A1 (en) * 2017-12-21 2019-06-27 Thales Method and system for readjusting, via an svs synthetic vision system, on a worn head-up display, a symbology which pertains to the piloting of an aircraft and which is conformal to the real outside world
US20190196192A1 (en) * 2017-12-21 2019-06-27 Thales Dual harmonization method and system for a head-worn display system for making the display of piloting information of an aircraft conform with the outside real world
US20190197196A1 (en) * 2017-12-26 2019-06-27 Seiko Epson Corporation Object detection and tracking
US20190196198A1 (en) * 2017-12-21 2019-06-27 Thales Dual harmonization method and system for a worn head-up display system with a removable attitude inertial device in the cockpit


Similar Documents

Publication Publication Date Title
IL284834A (en) Continuous time warp and binocular time warp for virtual and augmented reality display systems and methods
US10953330B2 (en) Reality vs virtual reality racing
IL258059B (en) Virtual/augmented reality system having reverse angle diffraction grating
CN110018736B (en) Object augmentation via near-eye display interface in artificial reality
IL274443B2 (en) System and methods for extrinsic calibration of cameras and diffractive optical elements
IL284819A (en) Visual tracking of peripheral devices
IL303275A (en) Mixed reality system with multi-source virtual content compositing and method of generating virtual content using same
IL297610A (en) Eye pose identification using eye features
IL256838B1 (en) Collimating fiber scanner design with inward pointing angles in virtual/augmented reality system
IL301719A (en) Method and system for tracking eye movement in conjunction with a light scanning projector
IL282078A (en) Polarizing maintaining optical fiber in virtual/augmented reality system
US20230027801A1 (en) Overlaying augmented reality (ar) content within an ar headset coupled to a magnifying loupe
IL294134B2 (en) Fixed-distance virtual and augmented reality systems and methods
CN105676452A (en) Augmented reality hud display method and device for vehicle
US10764558B2 (en) Reduced bandwidth stereo distortion correction for fisheye lenses of head-mounted displays
US20210272328A1 (en) Calibration techniques for aligning real-world objects to virtual objects in an augmented reality environment
CN109994015B (en) Wearable head-up display system and dual coordination method thereof
KR20200056721A (en) Method and apparatus for measuring optical properties of augmented reality device
IL284600B2 (en) A connformal display system and a method thereof
IL257946B (en) Optical display, image capturing device and methods with variable depth of field
CN111866493B (en) Image correction method, device and equipment based on head-mounted display equipment
CN111866492A (en) Image processing method, device and equipment based on head-mounted display equipment
US11899204B2 (en) Soft follow and pitch angle effects for VR/AR interface
US20240160278A1 (en) Conformal display system and a method thereof
Kermen et al. A multi-sensor integrated head-mounted display setup for augmented reality applications