US20220019078A1 - Space Suit Helmet Having Waveguide Display - Google Patents

Info

Publication number
US20220019078A1
US20220019078A1 (application US16/932,241)
Authority
US
United States
Prior art keywords
surface structure
waveguide
processor
user
space suit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US16/932,241
Other versions
US11243400B1 (en)
Inventor
Christopher A. Keith
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rockwell Collins Inc
Original Assignee
Rockwell Collins Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rockwell Collins Inc filed Critical Rockwell Collins Inc
Priority to US16/932,241 (granted as US11243400B1)
Assigned to ROCKWELL COLLINS, INC. Assignors: KEITH, CHRISTOPHER A.
Priority to EP21186009.3A (published as EP3951477A3)
Publication of US20220019078A1
Application granted
Publication of US11243400B1
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • AHUMAN NECESSITIES
    • A42HEADWEAR
    • A42BHATS; HEAD COVERINGS
    • A42B3/00Helmets; Helmet covers ; Other protective head coverings
    • A42B3/04Parts, details or accessories of helmets
    • A42B3/0406Accessories for helmets
    • A42B3/042Optical devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64GCOSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G6/00Space suits
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0123Head-up displays characterised by optical features comprising devices increasing the field of view
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Definitions

  • space suits currently have only a single-line display located on a chest of the suit, and astronauts make use of paper booklets on their forearms. There is a need for more accessible and readily available information.
  • an oxygenated interior cavity of space suit helmets has regulated limitations for an amount of electrical current for electronics within the interior cavity to reduce the possibility of combustion within the space suit helmet.
  • space suits are not currently custom fitted to each wearer. Space suits are typically designed to accommodate a range of astronauts.
  • inventions of the inventive concepts disclosed herein are directed to a system.
  • the system may include a space suit helmet.
  • the space suit helmet may include a surface structure, an inner surface structure, and a waveguide display.
  • the inner surface structure may be configured to maintain an oxygenated environment within an interior cavity of the space suit helmet, wherein a user is able to see through the inner surface structure and the surface structure.
  • the waveguide display may be implemented at least one of in or on the space suit helmet.
  • the waveguide display may include a waveguide and an optical system configured to project images at least through the waveguide to be displayed to the user.
  • inventions of the inventive concepts disclosed herein are directed to a method.
  • the method may include: providing a space suit helmet, comprising a surface structure, an inner surface structure, and a waveguide display, wherein the inner surface structure is configured to maintain an oxygenated environment within an interior cavity of the space suit helmet, wherein a user is able to see through the inner surface structure and the surface structure, wherein the waveguide display is implemented at least one of in or on the space suit helmet, wherein the waveguide display comprises a waveguide and an optical system configured to project images at least through the waveguide to be displayed to the user.
  • FIG. 1 is a view of an exemplary embodiment of a system including a space suit helmet according to the inventive concepts disclosed herein.
  • FIG. 2 is a view of the eye tracking system of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 3 is a view of the suit tracking system of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 4 is a view of the voice recognition system of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 5 is a view of an exemplary embodiment of the space suit helmet of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 6 is a view of an exemplary embodiment of the space suit helmet of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 7 is a view of an exemplary embodiment of the space suit helmet of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 8 is a view of an exemplary embodiment of the space suit helmet of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 9 is a diagram of an exemplary embodiment of a method according to the inventive concepts disclosed herein.
  • inventive concepts are not limited in their application to the details of construction and the arrangement of the components or steps or methodologies set forth in the following description or illustrated in the drawings.
  • inventive concepts disclosed herein may be practiced without these specific details.
  • well-known features may not be described in detail to avoid unnecessarily complicating the instant disclosure.
  • inventive concepts disclosed herein are capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
  • a letter following a reference numeral is intended to reference an embodiment of the feature or element that may be similar, but not necessarily identical, to a previously described element or feature bearing the same reference numeral (e.g., 1, 1a, 1b).
  • Such shorthand notations are used for purposes of convenience only, and should not be construed to limit the inventive concepts disclosed herein in any way unless expressly stated to the contrary.
  • any reference to “one embodiment,” or “some embodiments” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the inventive concepts disclosed herein.
  • the appearances of the phrase “in some embodiments” in various places in the specification are not necessarily all referring to the same embodiment, and embodiments of the inventive concepts disclosed may include one or more of the features expressly described or inherently present herein, or any combination or sub-combination of two or more such features, along with any other features which may not necessarily be expressly described or inherently present in the instant disclosure.
  • embodiments of the inventive concepts disclosed herein are directed to a method and a system including a space suit helmet having a waveguide display.
  • Some embodiments may include a waveguide display integrated into a space suit helmet to provide real-time conformal or non-conformal information to a user (e.g., an astronaut wearing the helmet).
  • the waveguide display for the space suit helmet may have various configurations, such as a side-mounted display attached to a side of the space suit helmet, a display mounted inside an oxygen enriched environment of the space suit helmet, a waveguide display installed in between an at least translucent (e.g., translucent and/or transparent) inner surface structure (e.g., a pressure bubble) and an at least translucent (e.g., translucent and/or transparent) outer surface structure (e.g., an impact bubble), and/or a waveguide display mounted external to the pressure bubble and the impact bubble.
  • Some embodiments enable a small, compact display assembly to be integrated into the suit, which has not been possible with previous display technologies.
  • Previously conceived optical display solutions required large and bulky optics with the display sources remote from the apparatus that the user looks into in order to see the display.
  • Some embodiments may allow for the viewing apparatus to be placed between the impact and pressure bubbles, which may protect the display itself as well as maximize the volume inside the bubble for the user to move around and not bump into items placed inside the pressure bubble.
  • the system 100 may be implemented as any suitable system, such as at least one vehicle (e.g., a spacecraft).
  • the system 100 may include at least one suit (e.g., a space suit 101 ).
  • the space suit 101 may include a space suit helmet 102 .
  • the space suit helmet 102 may include at least one eye tracking system 104 , at least one suit tracking system 106 , at least one voice recognition system 108 , at least one processor 110 , at least one waveguide display 111 , at least one power supply (not shown), and/or at least one speaker 120 , some or all of which may be communicatively coupled at any given time.
  • the waveguide display 111 may include the at least one optical system 112 , at least one waveguide 114 , and/or at least one tint layer (e.g., at least one electrochromic layer 118 ), some or all of which may be optically and/or communicatively coupled at any given time.
  • the eye tracking system 104 may include at least one infrared light source 202 (e.g., at least one infrared light emitting diode (LED)), at least one infrared image sensor 204 , at least one processor 206 , and at least one memory 208 , as well as other components, equipment, and/or devices commonly included in an eye tracking system, some or all of which may be communicatively coupled at any time, as shown in FIG. 2 .
  • the eye tracking system 104 may be configured to track eye gestures, track movement of a user's eye, track a user's gaze, and/or otherwise receive inputs from a user's eyes.
  • the eye tracking system 104 may be configured for performing fully automatic eye tracking operations of users in real time.
  • the infrared light source 202 may be configured to emit infrared light onto at least one eye of a user.
  • the infrared sensitive image sensor 204 may be configured to capture images of the at least one eye illuminated by the infrared light source 202 .
  • the processor 206 may be configured to process data received from the infrared sensitive image sensor 204 and output processed data (e.g., eye tracking data) to one or more devices or systems of the space suit helmet 102 and/or the system 100 .
  • the processor 206 may be configured to generate eye tracking data and output the generated eye tracking data to one of the devices (e.g., the processor 110 ) of the space suit helmet 102 and/or the system 100 .
  • the processor 206 may be configured to run various software applications or computer code stored (e.g., maintained) in a non-transitory computer-readable medium (e.g., memory 208 ) and configured to execute various instructions or operations.
  • the processor 206 may be implemented as a special purpose processor configured to execute instructions for performing (e.g., collectively performing if more than one processor) any or all of the operations disclosed throughout.
  • the processor 206 may be configured to: receive image data from the infrared sensitive image sensor 204 ; track movement of at least one eye of a user based on the image data; and/or output eye tracking system data indicative of the tracked movement of the at least one eye of the user.
  • the processor 206 may be configured to: perform visor distortion correction operations; perform eye mapping and alignment operations; output, via at least one data connection, eye tracking system data (e.g., indicative of eye azimuth and/or elevation) to a spacecraft interface, simulator interface, and/or other computing device of the system 100 ; and/or perform a suit tracking translation operation.
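The eye tracking output described above (eye azimuth and/or elevation) can be sketched as a mapping from a detected pupil position in the infrared image to gaze angles. This is a minimal illustrative sketch, not the patent's implementation; the function name, the linear mapping, and the field-of-view values are assumptions:

```python
def gaze_angles(pupil_px, image_size, fov_deg=(40.0, 30.0)):
    """Map a detected pupil center (pixels) to gaze azimuth/elevation
    in degrees, assuming a simple linear mapping across an assumed
    sensor field of view. pupil_px and image_size are (x, y) tuples."""
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    # Normalized offset from the image center, in [-1, 1]
    nx = (pupil_px[0] - cx) / cx
    ny = (pupil_px[1] - cy) / cy
    azimuth = nx * fov_deg[0] / 2.0
    elevation = -ny * fov_deg[1] / 2.0  # image y grows downward
    return azimuth, elevation
```

A real system would additionally apply the visor distortion correction and eye mapping/alignment operations noted above before reporting angles.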
  • the suit tracking system 106 may have optical, magnetic, and/or inertial tracking capability.
  • the suit tracking system 106 may include suit tracking capabilities and/or be coordinated with suit tracking capabilities, for example, such that the suit tracking operations are relative to a position and/or orientation of the suit 101 and/or relative to a position and/or orientation to a vehicle.
  • the suit tracking system 106 may be configured to track the direction in which a field of view (FOV) through the waveguide display 111 is pointing.
  • where the waveguide display 111 is mounted to the suit 101 (e.g., to the space suit helmet 102 ), this direction may be the direction that the tracked torso or bubble is pointing.
  • the suit tracking system 106 may include at least one sensor 302 , at least one processor 304 , and at least one memory 306 , as well as other components, equipment, and/or devices commonly included in a suit tracking system, some or all of which may be communicatively coupled at any time, as shown in FIG. 3 .
  • the at least one sensor 302 may be at least one optical sensor (e.g., an optical infrared sensor configured to detect infrared light), at least one magnetic sensor, and/or at least one inertial sensor.
  • the suit tracking system 106 may be configured to determine and track a position and an orientation of a user's head relative to an environment.
  • the suit tracking system 106 may be configured for performing fully automatic suit tracking operations in real time.
  • the processor 304 of the suit tracking system 106 may be configured to process data received from the sensors 302 and output processed data (e.g., suit tracking data) to one of the computing devices of the system 100 and/or the processor 110 for use in generating images aligned with the user's field of view, such as augmented reality or virtual reality images aligned with the user's field of view to be displayed by the waveguide display 111 .
  • the processor 304 may be configured to determine and track a position and orientation of a user's head relative to an environment. Additionally, for example, the processor 304 may be configured to generate position and orientation data associated with such determined information and output the generated position and orientation data.
  • the processor 304 may be configured to run various software applications or computer code stored in a non-transitory computer-readable medium (e.g., memory 306 ) and configured to execute various instructions or operations.
  • the at least one processor 304 may be implemented as a special purpose processor configured to execute instructions for performing (e.g., collectively performing if more than one processor) any or all of the operations disclosed throughout.
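Where the suit tracking system 106 relies on inertial sensing, a common way to estimate orientation is to fuse a gyroscope rate with an accelerometer-derived angle. The single-axis complementary filter below is an illustrative sketch of that idea, not the patent's method; the function name, the single-axis simplification, and the blend factor are assumptions:

```python
def complementary_filter(angle_deg, gyro_rate_dps, accel_angle_deg, dt, alpha=0.98):
    """One update step of a complementary filter estimating head pitch.

    angle_deg: previous angle estimate (degrees)
    gyro_rate_dps: gyroscope angular rate (degrees/second)
    accel_angle_deg: angle inferred from the accelerometer (degrees)
    dt: time step (seconds); alpha: blend factor favoring the gyro
    """
    # Integrate the gyro for short-term accuracy, then blend in the
    # accelerometer angle to correct long-term drift.
    return alpha * (angle_deg + gyro_rate_dps * dt) + (1.0 - alpha) * accel_angle_deg
```

Optical or magnetic tracking, also contemplated above, would replace or supplement the accelerometer term with an absolute reference.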
  • the voice recognition system 108 may include at least one microphone 402 , at least one processor 404 , memory 406 , and storage 408 , as shown in FIG. 4 , as well as other components, equipment, and/or devices commonly included in a voice recognition system.
  • the microphone 402 , the processor 404 , the memory 406 , and the storage 408 , as well as the other components, equipment, and/or devices commonly included in a voice recognition system may be communicatively coupled.
  • the voice recognition system 108 may be configured to recognize voice commands or audible inputs of a user.
  • the voice recognition system 108 may allow the user to use verbal commands as an interaction and control method.
  • the voice recognition system 108 may be configured to detect user commands and output user command data (e.g., voice command data), which, for example, may be used to provide commands to control operation of the waveguide display 111 . Additionally, verbal commands may be used to modify, manipulate, and declutter content displayed by the waveguide display 111 .
  • the voice recognition system 108 may be integrated with the eye tracking system 104 so context of user inputs can be inferred.
  • the processor 404 may be configured to process data received from the microphone 402 and output processed data (e.g., text data and/or voice command data) to a device of the system 100 and/or the processor 110 .
  • the processor 404 may be configured to run various software applications or computer code stored in a non-transitory computer-readable medium and configured to execute various instructions or operations.
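The voice command recognition described above can be sketched as matching a recognized transcript against a fixed command vocabulary. This is a minimal illustrative sketch; the function name and the command set are assumptions, and a real system would use a grammar or a trained recognizer rather than substring matching:

```python
def parse_command(transcript,
                  commands=("declutter", "brightness up", "brightness down",
                            "display off")):
    """Return the first known command found in a recognized transcript,
    or None if no command matches. The vocabulary is a placeholder."""
    text = transcript.strip().lower()
    for cmd in commands:
        if cmd in text:
            return cmd
    return None
```

The matched command could then be forwarded as voice command data to the processor 110 to control the waveguide display 111.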
  • the at least one processor 110 may be implemented as any suitable processor(s), such as at least one general purpose processor, at least one image processor, at least one graphics processing unit (GPU), and/or at least one special purpose processor configured to execute instructions for performing (e.g., collectively performing if more than one processor) any or all of the operations disclosed throughout.
  • the processor 110 may be communicatively coupled to the waveguide display 111 .
  • the processor 110 may be configured to: receive the eye tracking system data; receive the suit tracking system data; receive the voice command data; generate and/or output image data to the waveguide display 111 and/or to the optical system 112 , for example, based on the eye tracking system data, the voice command data, and/or the suit tracking system data; generate and/or output image data to the optical system 112 , for example, based on the eye tracking system data, the voice command data, and/or the suit tracking system data; generate and/or output augmented reality and/or virtual reality image data to the optical system 112 , for example, based on the eye tracking system data, the voice command data, and/or the suit tracking system data; and/or generate and/or output other image data, which may include vehicle operation (e.g., space flight) information, navigation information, tactical information, and/or sensor information to the optical system 112 , for example, based on the eye tracking system data, the voice command data, and/or the suit tracking system data.
  • the processor 110 may be configured to: output graphical data to the optical system 112 ; control operation of the optical system based at least on the eye tracking data, the voice command data, and/or the suit tracking data; control whether the optical system is in an active state or deactivated state based at least on the eye tracking data, the voice command data, and/or the suit tracking data; control content displayed by the waveguide display 111 based at least on the eye tracking data, the voice command data, and/or the suit tracking data; steer a field of view of the waveguide display 111 based at least on the eye tracking data, the voice command data, and/or the suit tracking data; control an operation (e.g., an amount of tint) of the electrochromic layer 118 , for example, based at least on the eye tracking data, the voice command data, the suit tracking data, and/or a sensed brightness; and/or output audio data to the at least one speaker 120 for presentation to the user, for example, based at least on the eye tracking data, the voice command data, and/or the suit tracking data.
  • the waveguide display 111 may be implemented as any suitable waveguide display.
  • the waveguide display 111 may include the at least one optical system 112 , at least one waveguide 114 , and/or at least one tint layer (e.g., at least one electrochromic layer 118 ).
  • the optical system 112 may include at least one processor, at least one collimator, and/or at least one projector 116 .
  • the optical system 112 may be configured to project images at least through the waveguide 114 to be displayed to the user.
  • the waveguide 114 may be a diffractive, mirror, or beam splitter based waveguide.
  • the waveguide display 111 may include at least one lens, at least one mirror, diffraction gratings, at least one polarization sensitive component, at least one beam splitter, the at least one waveguide 114 , at least one light pipe, at least one window, and/or the projector 116 .
  • the optical system 112 may be configured to receive image data from the processor 110 and project images through the waveguide 114 for display to the user.
  • the tint layer (e.g., the electrochromic layer 118 ) may be positioned on a side of a viewable portion of the waveguide 114 (e.g., positioned on a back side such that a viewable portion of the waveguide 114 is between the tint layer and the user 502 ).
  • the tint layer may improve a perceived brightness of content displayed by the waveguide display 111 in a high brightness environment.
  • the electrochromic layer 118 may receive an electric stimulus from the processor 110 and/or the optical system 112 to darken the electrochromic layer 118 so as to improve a perceived brightness.
  • the processor 110 and/or the optical system 112 may automatically control a tint level of the electrochromic layer 118 based at least on a sensed environmental brightness.
  • the electrochromic layer 118 may provide a variable tint.
  • the electrochromic layer 118 may dim real world ambient light from passing through a viewable portion of the waveguide 114 and improve display visibility.
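The automatic tint control described above (darkening the electrochromic layer 118 based on a sensed environmental brightness) can be sketched as a mapping from ambient brightness to a normalized tint command. This is an illustrative sketch; the function name, the lux thresholds, and the log-linear mapping are assumptions, not values from the patent:

```python
import math

def tint_level(ambient_lux, low_lux=1000.0, high_lux=100000.0):
    """Map sensed ambient brightness (lux) to a normalized tint
    command in [0, 1], linear in log-brightness between two assumed
    thresholds. 0.0 = fully clear, 1.0 = maximum tint."""
    if ambient_lux <= low_lux:
        return 0.0
    if ambient_lux >= high_lux:
        return 1.0
    span = math.log10(high_lux) - math.log10(low_lux)
    return (math.log10(ambient_lux) - math.log10(low_lux)) / span
```

The processor 110 and/or the optical system 112 could convert such a normalized level into the electric stimulus applied to the electrochromic layer 118.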
  • referring now to FIGS. 5-8, exemplary embodiments of the space suit helmet 102 of FIG. 1 worn by a user 502 (e.g., an astronaut) according to the inventive concepts disclosed herein are depicted.
  • in addition to one or more of the elements shown in FIGS. 1-4, the space suit helmet 102 may include at least one ring 504 , a first surface structure (e.g., an outer surface structure; e.g., an impact bubble 602 ), a second surface structure (e.g., an inner surface structure; e.g., a pressure bubble 606 ), a gap 604 between the first surface structure and the second surface structure, an interior cavity 608 , and/or wires 610 (e.g., connecting the optical system 112 to the processor 110 ).
  • Each of the inner surface structure and the outer surface structure may be at least translucent (e.g., translucent or transparent), such that the user 502 is able to see through the inner surface structure and the outer surface structure.
  • the inner surface structure and the outer surface structure may be any suitable shape, such as having at least one flat surface, at least one curved surface, or a combination thereof.
  • the outer surface structure may be the impact bubble 602
  • the inner surface structure may be the pressure bubble 606 .
  • the waveguide display 111 may be implemented in and/or on the space suit helmet 102 .
  • the waveguide display 111 may be positioned at any suitable location, such as in a direct forward view or some other location (e.g., off to a side of the user 502 and/or down at chin level of the user 502 ).
  • the waveguide display 111 may be adjustably positionable (e.g., tiltable and/or movable in a lateral and/or vertical direction), such as by use of a motor, magnets, a pivot joint, and/or a track); in some of such embodiments, the processor 110 may be configured to control an orientation and/or a position of a viewable portion of the waveguide display 111 ; in other of such embodiments, the orientation and/or the position of a viewable portion of the waveguide display 111 may be manually adjusted.
  • the waveguide display 111 may be mounted within space suit helmet 102 in the interior cavity 608 .
  • the waveguide display 111 may be mounted to the space suit helmet 102 near the ring 504 at eye level such that (a) when the user 502 is looking straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502 .
  • the waveguide display 111 may be mounted within space suit helmet 102 in between the first surface structure (e.g., the impact bubble 602 ) and the second surface structure (e.g., the pressure bubble 606 ).
  • the waveguide display 111 may be mounted to the ring 504 of the space suit helmet 102 .
  • the waveguide display 111 may be positionable at any suitable height and lateral position.
  • the waveguide display 111 may be positioned at eye level such that (a) when the user 502 is looking straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502 .
  • the waveguide display 111 may be positioned at chin level such that (a) when the user 502 is looking down and straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks down and to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502 .
  • as shown in FIG. 6 , positioning the optical system 112 outside of the oxygenated interior cavity 608 may reduce a likelihood of an electrical spark causing combustion.
  • positioning the waveguide display 111 between the first surface structure (e.g., the impact bubble 602 ) and the second surface structure (e.g., the pressure bubble 606 ) may protect the waveguide display 111 and maximize a volume inside of the pressure bubble 606 for the user 502 to move around in the pressure bubble 606 and not bump into the waveguide display 111 .
  • the waveguide display 111 may be mounted within space suit helmet 102 .
  • the optical system 112 may be mounted in between the first surface structure (e.g., the impact bubble 602 ) and the second surface structure (e.g., the pressure bubble 606 ).
  • the waveguide 114 may be mounted at least in part in the interior cavity 608 .
  • the optical system 112 may be configured to project images through the inner surface structure and the waveguide 114 to be displayed to the user.
  • the waveguide 114 may extend through the inner surface structure to within the interior cavity 608 .
  • the optical system 112 may be mounted to the ring 504 of the space suit helmet 102 .
  • the waveguide 114 may be positionable at any suitable height and lateral position.
  • the waveguide 114 may be positioned at eye level such that (a) when the user 502 is looking straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502 .
  • the waveguide 114 may be positioned at chin level such that (a) when the user 502 is looking down and straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks down and to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502 .
  • positioning the optical system 112 outside of the oxygenated interior cavity 608 may reduce a likelihood of an electrical spark causing combustion.
  • the waveguide display 111 may be mounted on an exterior of the space suit helmet 102 such that the outer surface structure is positioned between the waveguide display 111 and the inner surface structure.
  • the waveguide display 111 may be mounted to an exterior of the ring 504 of the space suit helmet 102 .
  • the waveguide display 111 may be positionable at any suitable height and lateral position.
  • the waveguide display 111 may be positioned at eye level such that (a) when the user 502 is looking straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502 .
  • the waveguide display 111 may be positioned at chin level such that (a) when the user 502 is looking down and straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks down and to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502 .
  • positioning the optical system 112 outside of the oxygenated interior cavity 608 may reduce a likelihood of an electrical spark causing combustion.
  • an exemplary embodiment of a method 900 may include one or more of the following steps. Additionally, for example, some embodiments may include performing one or more instances of the method 900 iteratively, concurrently, and/or sequentially. Additionally, for example, at least some of the steps of the method 900 may be performed in parallel and/or concurrently. Additionally, in some embodiments, at least some of the steps of the method 900 may be performed non-sequentially. Additionally, in some embodiments, at least some of the steps of the method 900 may be performed as sub-steps of providing various components.
  • a step 902 may include providing a space suit helmet, comprising a surface structure, an inner surface structure, and a waveguide display, wherein the inner surface structure is configured to maintain an oxygenated environment within an interior cavity of the space suit helmet, wherein a user is able to see through the inner surface structure and the surface structure, wherein the waveguide display is implemented at least one of in or on the space suit helmet, wherein the waveguide display comprises a waveguide and an optical system configured to project images at least through the waveguide to be displayed to the user.
  • the method 900 may include any of the operations disclosed throughout.
  • embodiments of the inventive concepts disclosed herein may be directed to a method and a system including a space suit helmet having a waveguide display.
  • "at least one non-transitory computer-readable medium" may refer to at least one computer-readable medium implemented as hardware, for example: at least one non-transitory processor-readable medium; at least one memory (e.g., at least one nonvolatile memory, at least one volatile memory, or a combination thereof; e.g., at least one random-access memory, at least one flash memory, at least one read-only memory (ROM) (e.g., at least one electrically erasable programmable read-only memory (EEPROM)), at least one on-processor memory (e.g., at least one on-processor cache, at least one on-processor buffer, at least one on-processor flash memory, at least one on-processor EEPROM, or a combination thereof), or a combination thereof); at least one storage device (e.g., at least one hard disk drive, at least one digital versatile disk drive, or at least one magnetic tape drive); or a combination thereof.
  • “at least one” means one or a plurality of; for example, “at least one” may comprise one, two, three, . . . , one hundred, or more.
  • “one or more” means one or a plurality of; for example, “one or more” may comprise one, two, three, . . . , one hundred, or more.
  • "zero or more" means zero, one, or a plurality of; for example, "zero or more" may comprise zero, one, two, three, . . . , one hundred, or more.
  • the methods, operations, and/or functionality disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods, operations, and/or functionality disclosed are examples of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the methods, operations, and/or functionality can be rearranged while remaining within the scope of the inventive concepts disclosed herein.
  • the accompanying claims may present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
  • embodiments of the methods according to the inventive concepts disclosed herein may include one or more of the steps described herein. Further, such steps may be carried out in any desired order and two or more of the steps may be carried out simultaneously with one another. Two or more of the steps disclosed herein may be combined in a single step, and in some embodiments, one or more of the steps may be carried out as two or more sub-steps. Further, other steps or sub-steps may be carried out in addition to, or as substitutes for, one or more of the steps disclosed herein.
  • inventive concepts disclosed herein are well adapted to carry out the objects and to attain the advantages mentioned herein as well as those inherent in the inventive concepts disclosed herein. While presently preferred embodiments of the inventive concepts disclosed herein have been described for purposes of this disclosure, it will be understood that numerous changes may be made which will readily suggest themselves to those skilled in the art and which are accomplished within the broad scope and coverage of the inventive concepts disclosed and claimed herein.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Helmets And Other Head Coverings (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)

Abstract

A system may include a space suit helmet. The space suit helmet may include a surface structure, an inner surface structure, and a waveguide display. The inner surface structure may be configured to maintain an oxygenated environment within an interior cavity of the space suit helmet, wherein a user is able to see through the inner surface structure and the surface structure. The waveguide display may be implemented at least one of in or on the space suit helmet. The waveguide display may include a waveguide and an optical system configured to project images at least through the waveguide to be displayed to the user.

Description

    BACKGROUND
  • Currently, space suits only have a single-line display located on the chest of the suit, and astronauts make use of paper booklets on their forearms. There is a need for more accessible and readily available information. Currently, there are requirements to minimize the number of items an astronaut physically wears on their head, such as headsets, microphones, caps, and near-eye displays. Additionally, the oxygenated interior cavity of a space suit helmet has regulated limits on the amount of electrical current for electronics within the interior cavity to reduce the possibility of combustion within the space suit helmet. Additionally, space suits are not currently custom-fitted to each wearer. Space suits are typically designed to accommodate a range of astronauts.
  • SUMMARY
  • In one aspect, embodiments of the inventive concepts disclosed herein are directed to a system. The system may include a space suit helmet. The space suit helmet may include a surface structure, an inner surface structure, and a waveguide display. The inner surface structure may be configured to maintain an oxygenated environment within an interior cavity of the space suit helmet, wherein a user is able to see through the inner surface structure and the surface structure. The waveguide display may be implemented at least one of in or on the space suit helmet. The waveguide display may include a waveguide and an optical system configured to project images at least through the waveguide to be displayed to the user.
  • In a further aspect, embodiments of the inventive concepts disclosed herein are directed to a method. The method may include: providing a space suit helmet, comprising a surface structure, an inner surface structure, and a waveguide display, wherein the inner surface structure is configured to maintain an oxygenated environment within an interior cavity of the space suit helmet, wherein a user is able to see through the inner surface structure and the surface structure, wherein the waveguide display is implemented at least one of in or on the space suit helmet, wherein the waveguide display comprises a waveguide and an optical system configured to project images at least through the waveguide to be displayed to the user.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Implementations of the inventive concepts disclosed herein may be better understood when consideration is given to the following detailed description thereof. Such description makes reference to the included drawings, which are not necessarily to scale, and in which some features may be exaggerated and some features may be omitted or may be represented schematically in the interest of clarity. Like reference numerals in the drawings may represent and refer to the same or similar element, feature, or function. In the drawings:
  • FIG. 1 is a view of an exemplary embodiment of a system including a space suit helmet according to the inventive concepts disclosed herein.
  • FIG. 2 is a view of the eye tracking system of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 3 is a view of the suit tracking system of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 4 is a view of the voice recognition system of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 5 is a view of an exemplary embodiment of the space suit helmet of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 6 is a view of an exemplary embodiment of the space suit helmet of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 7 is a view of an exemplary embodiment of the space suit helmet of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 8 is a view of an exemplary embodiment of the space suit helmet of FIG. 1 according to the inventive concepts disclosed herein.
  • FIG. 9 is a diagram of an exemplary embodiment of a method according to the inventive concepts disclosed herein.
  • DETAILED DESCRIPTION
  • Before explaining at least one embodiment of the inventive concepts disclosed herein in detail, it is to be understood that the inventive concepts are not limited in their application to the details of construction and the arrangement of the components or steps or methodologies set forth in the following description or illustrated in the drawings. In the following detailed description of embodiments of the instant inventive concepts, numerous specific details are set forth in order to provide a more thorough understanding of the inventive concepts. However, it will be apparent to one of ordinary skill in the art having the benefit of the instant disclosure that the inventive concepts disclosed herein may be practiced without these specific details. In other instances, well-known features may not be described in detail to avoid unnecessarily complicating the instant disclosure. The inventive concepts disclosed herein are capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
  • As used herein, a letter following a reference numeral is intended to reference an embodiment of the feature or element that may be similar, but not necessarily identical, to a previously described element or feature bearing the same reference numeral (e.g., 1, 1a, 1b). Such shorthand notations are used for purposes of convenience only, and should not be construed to limit the inventive concepts disclosed herein in any way unless expressly stated to the contrary.
  • Further, unless expressly stated to the contrary, "or" refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
  • In addition, "a" or "an" is employed to describe elements and components of embodiments of the instant inventive concepts. This is done merely for convenience and to give a general sense of the inventive concepts, and "a" and "an" are intended to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.
  • Finally, as used herein, any reference to "one embodiment" or "some embodiments" means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the inventive concepts disclosed herein. The appearances of the phrase "in some embodiments" in various places in the specification are not necessarily all referring to the same embodiment, and embodiments of the inventive concepts disclosed may include one or more of the features expressly described or inherently present herein, or any combination or sub-combination of two or more such features, along with any other features which may not necessarily be expressly described or inherently present in the instant disclosure.
  • Broadly, embodiments of the inventive concepts disclosed herein are directed to a method and a system including a space suit helmet having a waveguide display.
  • Some embodiments may include a waveguide display integrated into a space suit helmet to provide real-time conformal or non-conformal information to a user (e.g., an astronaut wearing the helmet). In some embodiments, the waveguide display for the space suit helmet may have various configurations, such as a side-mounted display attached to a side of the space suit helmet, a display mounted inside an oxygen-enriched environment of the space suit helmet, a waveguide display installed in between an at least translucent (e.g., translucent and/or transparent) inner surface structure (e.g., a pressure bubble) and an at least translucent (e.g., translucent and/or transparent) outer surface structure (e.g., an impact bubble), and/or a waveguide display mounted external to the pressure bubble and the impact bubble. Some embodiments enable a small, compact display assembly to be integrated into the suit, which has not been possible with previous display technologies.
  • Previously conceived optical display solutions required large and bulky optics with the display sources remote from the apparatus that the user looks into in order to see the display. Some embodiments may allow for the viewing apparatus to be placed between the impact and pressure bubbles, which may protect the display itself as well as maximize volume inside the bubble for the user to move around and not bump into items placed inside the pressure bubble.
  • Referring now to FIGS. 1-4, an exemplary embodiment of a system 100 according to the inventive concepts disclosed herein is depicted. The system 100 may be implemented as any suitable system, such as at least one vehicle (e.g., a spacecraft). For example, as shown in FIG. 1, the system 100 may include at least one suit (e.g., a space suit 101). For example, the space suit 101 may include a space suit helmet 102. In some embodiments, the space suit helmet 102 may include at least one eye tracking system 104, at least one suit tracking system 106, at least one voice recognition system 108, at least one processor 110, at least one waveguide display 111, at least one power supply (not shown), and/or at least one speaker 120, some or all of which may be communicatively coupled at any given time. For example, the waveguide display 111 may include the at least one optical system 112, at least one waveguide 114, and/or at least one tint layer (e.g., at least one electrochromic layer 118), some or all of which may be optically and/or communicatively coupled at any given time.
  • The eye tracking system 104 may include at least one infrared light source 202 (e.g., at least one infrared light emitting diode (LED)), at least one infrared image sensor 204, at least one processor 206, and at least one memory 208, as well as other components, equipment, and/or devices commonly included in an eye tracking system, some or all of which may be communicatively coupled at any time, as shown in FIG. 2. The eye tracking system 104 may be configured to track eye gestures, track movement of a user's eye, track a user's gaze, and/or otherwise receive inputs from a user's eyes. The eye tracking system 104 may be configured for performing fully automatic eye tracking operations of users in real time.
  • The infrared light source 202 may be configured to emit infrared light onto at least one eye of a user.
  • The infrared sensitive image sensor 204 may be configured to capture images of the at least one eye illuminated by the infrared light source 202.
  • The processor 206 may be configured to process data received from the infrared sensitive image sensor 204 and output processed data (e.g., eye tracking data) to one or more devices or systems of the space suit helmet 102 and/or the system 100. For example, the processor 206 may be configured to generate eye tracking data and output the generated eye tracking data to one of the devices (e.g., the processor 110) of the space suit helmet 102 and/or the system 100. The processor 206 may be configured to run various software applications or computer code stored (e.g., maintained) in a non-transitory computer-readable medium (e.g., memory 208) and configured to execute various instructions or operations. The processor 206 may be implemented as a special purpose processor configured to execute instructions for performing (e.g., collectively performing if more than one processor) any or all of the operations disclosed throughout. For example, the processor 206 may be configured to: receive image data from the infrared sensitive image sensor 204; track movement of at least one eye of a user based on the image data; and/or output eye tracking system data indicative of the tracked movement of the at least one eye of the user. For example, the processor 206 may be configured to: perform visor distortion correction operations; perform eye mapping and alignment operations; output, via at least one data connection, eye tracking system data (e.g., indicative of eye azimuth and/or elevation) to a spacecraft interface, simulator interface, and/or other computing device of the system 100; and/or perform a suit tracking translation operation.
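  • The processing chain described above — capturing infrared images of the illuminated eye and emitting eye tracking data such as azimuth and elevation — can be sketched as follows. This is an illustrative sketch only: the function name, the small-angle gaze model, and all parameter values are assumptions for exposition, not part of the disclosure, which does not specify how the processor 206 computes gaze.

```python
from dataclasses import dataclass

@dataclass
class EyeTrackingSample:
    azimuth_deg: float    # gaze angle left (-) / right (+) of boresight
    elevation_deg: float  # gaze angle below (-) / above (+) boresight

def gaze_from_pupil(pupil_px, frame_size, fov_deg):
    """Convert a detected pupil center (in image pixels) to gaze angles.

    Simple linear model: the pupil's offset from the image center is
    scaled by the infrared camera's field of view. A fielded tracker
    would also use corneal glints, visor distortion correction, and a
    per-user calibration, as the description notes.
    """
    (px, py), (w, h), (fov_h, fov_v) = pupil_px, frame_size, fov_deg
    azimuth = (px - w / 2.0) / w * fov_h
    elevation = -(py - h / 2.0) / h * fov_v  # image y axis points down
    return EyeTrackingSample(azimuth, elevation)
```

  Under this model, a pupil detected a quarter of the frame width right of center in a 40-degree camera maps to a 10-degree azimuth.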
  • The suit tracking system 106 may have optical, magnetic, and/or inertial tracking capability. In some embodiments, the suit tracking system 106 may include suit tracking capabilities and/or be coordinated with suit tracking capabilities, for example, such that the suit tracking operations are relative to a position and/or orientation of the suit 101 and/or relative to a position and/or orientation to a vehicle. For example, suit tracking system 106 may be configured to track a direction of where a field of view (FOV) through the waveguide display 111 is pointing. For example, if the waveguide display 111 is mounted to the suit 101 (e.g., to the space suit helmet 102), this direction may be a direction that a torso or bubble is pointing that is being tracked. The suit tracking system 106 may include at least one sensor 302, at least one processor 304, and at least one memory 306, as well as other components, equipment, and/or devices commonly included in a suit tracking system, some or all of which may be communicatively coupled at any time, as shown in FIG. 3. The at least one sensor 302 may be at least one optical sensor (e.g., an optical infrared sensor configured to detect infrared light), at least one magnetic sensor, and/or at least one inertial sensor. The suit tracking system 106 may be configured to determine and track a position and an orientation of a user's head relative to an environment. The suit tracking system 106 may be configured for performing fully automatic suit tracking operations in real time. The processor 304 of the suit tracking system 106 may be configured to process data received from the sensors 302 and output processed data (e.g., suit tracking data) to one of the computing devices of the system 100 and/or the processor 110 for use in generating images aligned with the user's field of view, such as augmented reality or virtual reality images aligned with the user's field of view to be displayed by the waveguide display 111. 
For example, the processor 304 may be configured to determine and track a position and orientation of a user's head relative to an environment. Additionally, for example, the processor 304 may be configured to generate position and orientation data associated with such determined information and output the generated position and orientation data. The processor 304 may be configured to run various software applications or computer code stored in a non-transitory computer-readable medium (e.g., memory 306) and configured to execute various instructions or operations. The at least one processor 304 may be implemented as a special purpose processor configured to execute instructions for performing (e.g., collectively performing if more than one processor) any or all of the operations disclosed throughout.
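  • The use of suit tracking data to keep displayed symbology aligned with the user's field of view can be illustrated with a minimal heading calculation. The function below is a hypothetical sketch, not the disclosed implementation: it reduces the tracking problem to a single yaw axis, whereas the suit tracking system 106 tracks full position and orientation.

```python
def world_to_display_bearing(symbol_bearing_deg, display_heading_deg):
    """Bearing of a world-referenced symbol relative to display boresight.

    The suit tracking data gives the direction the suit-mounted display
    is pointing; subtracting that heading from a symbol's world bearing
    (and wrapping into [-180, 180)) keeps conformal symbology registered
    as the wearer turns.
    """
    return (symbol_bearing_deg - display_heading_deg + 180.0) % 360.0 - 180.0
```

  For example, a landmark at world bearing 10 degrees appears 20 degrees right of boresight when the display points at 350 degrees.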
  • The voice recognition system 108 may include at least one microphone 402, at least one processor 404, memory 406, and storage 408, as shown in FIG. 4, as well as other components, equipment, and/or devices commonly included in a voice recognition system. The microphone 402, the processor 404, the memory 406, and the storage 408, as well as the other components, equipment, and/or devices commonly included in a voice recognition system may be communicatively coupled. The voice recognition system 108 may be configured to recognize voice commands or audible inputs of a user. The voice recognition system 108 may allow the user to use verbal commands as an interaction and control method. The voice recognition system 108 may be configured to detect user commands and output user command data (e.g., voice command data), which, for example, may be used to provide commands to control operation of the waveguide display 111. Additionally, verbal commands may be used to modify, manipulate, and declutter content displayed by the waveguide display 111. The voice recognition system 108 may be integrated with the eye tracking system 104 so that context of user inputs can be inferred. The processor 404 may be configured to process data received from the microphone 402 and output processed data (e.g., text data and/or voice command data) to a device of the system 100 and/or the processor 110. The processor 404 may be configured to run various software applications or computer code stored in a non-transitory computer-readable medium and configured to execute various instructions or operations.
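  • The flow from recognized speech to display control might look like the following sketch. The command vocabulary and the state representation are invented for illustration; the description states only that verbal commands may be used to control, modify, manipulate, and declutter displayed content.

```python
def dispatch_voice_command(phrase, display_state):
    """Apply a recognized voice command (as text) to a display-state dict.

    Hypothetical vocabulary: "display on"/"display off" toggle the
    optical system's active state, and "declutter" drops secondary
    symbology layers. Unrecognized phrases are recorded for review.
    """
    phrase = phrase.strip().lower()
    if phrase == "display on":
        display_state["active"] = True
    elif phrase == "display off":
        display_state["active"] = False
    elif phrase == "declutter":
        display_state["layers"] = ["primary"]
    else:
        display_state.setdefault("unrecognized", []).append(phrase)
    return display_state
```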
  • The at least one processor 110 may be implemented as any suitable processor(s), such as at least one general purpose processor, at least one image processor, at least one graphics processing unit (GPU), and/or at least one special purpose processor configured to execute instructions for performing (e.g., collectively performing if more than one processor) any or all of the operations disclosed throughout. In some embodiments, the processor 110 may be communicatively coupled to the waveguide display 111. For example, the processor 110 may be configured to: receive the eye tracking system data; receive the suit tracking system data; receive the voice command data; generate and/or output image data to the waveguide display 111 and/or to the optical system 112, for example, based on the eye tracking system data, the voice command data, and/or the suit tracking system data; generate and/or output image data to the optical system 112, for example, based on the eye tracking system data, the voice command data, and/or the suit tracking system data; generate and/or output augmented reality and/or virtual reality image data to the optical system 112, for example, based on the eye tracking system data, the voice command data, and/or the suit tracking system data; and/or generate and/or output other image data, which may include vehicle operation (e.g., space flight) information, navigation information, tactical information, and/or sensor information to the optical system 112, for example, based on the eye tracking system data, the voice command data, and/or the suit tracking system data.
  • For example, the processor 110 may be configured to: output graphical data to the optical system 112; control operation of the optical system based at least on the eye tracking data, the voice command data, and/or the suit tracking data; control whether the optical system is in an active state or deactivated state based at least on the eye tracking data, the voice command data, and/or the suit tracking data; control content displayed by the waveguide display 111 based at least on the eye tracking data, the voice command data, and/or the suit tracking data; steer a field of view of the waveguide display 111 based at least on the eye tracking data, the voice command data, and/or the suit tracking data; control an operation (e.g., an amount of tint) of the electrochromic layer 118, for example, based at least on the eye tracking data, the voice command data, the suit tracking data, and/or a sensed brightness; and/or output audio data to the at least one speaker 120 for presentation to the user, for example, based at least on the eye tracking data, the voice command data, and/or the suit tracking data.
  • The waveguide display 111 may be implemented as any suitable waveguide display. The waveguide display 111 may include the at least one optical system 112, at least one waveguide 114, and/or at least one tint layer (e.g., at least one electrochromic layer 118). For example, the optical system 112 may include at least one processor, at least one collimator, and/or at least one projector 116. The optical system 112 may be configured to project images at least through the waveguide 114 to be displayed to the user. In some embodiments, the waveguide 114 may be a diffractive, mirror-based, or beam-splitter-based waveguide. In some embodiments, the waveguide display 111 may include at least one lens, at least one mirror, diffraction gratings, at least one polarization sensitive component, at least one beam splitter, the at least one waveguide 114, at least one light pipe, at least one window, and/or the projector 116.
  • The optical system 112 may be configured to receive image data from the processor 110 and project images through the waveguide 114 for display to the user.
  • The tint layer (e.g., the electrochromic layer 118) may be positioned on a side of a viewable portion of the waveguide 114 (e.g., positioned on a back side such that a viewable portion of the waveguide 114 is between the tint layer and the user 502). For example, the tint layer may improve a perceived brightness of content displayed by the waveguide display 111 in a high brightness environment. For example, the electrochromic layer 118 may receive an electric stimulus from the processor 110 and/or the optical system 112 to darken the electrochromic layer 118 so as to improve a perceived brightness. In some embodiments, the processor 110 and/or the optical system 112 may automatically control a tint level of the electrochromic layer 118 based at least on a sensed environmental brightness. For example, the electrochromic layer 118 may provide a variable tint. For example, the electrochromic layer 118 may dim real world ambient light from passing through a viewable portion of the waveguide 114 and improve display visibility.
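  • Automatic tint control from a sensed environmental brightness could be as simple as the ramp sketched below. The lux thresholds and the linear mapping are assumptions for illustration; the description specifies only that a tint level may be controlled automatically based on a sensed brightness, not how the level is computed.

```python
def tint_level(ambient_lux, lo_lux=1_000.0, hi_lux=100_000.0):
    """Map sensed ambient brightness to an electrochromic tint command.

    Returns 0.0 (clear) at or below lo_lux, 1.0 (fully darkened) at or
    above hi_lux, and ramps linearly in between, dimming ambient light
    behind the waveguide to preserve perceived display brightness.
    """
    if ambient_lux <= lo_lux:
        return 0.0
    if ambient_lux >= hi_lux:
        return 1.0
    return (ambient_lux - lo_lux) / (hi_lux - lo_lux)
```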
  • Referring now to FIGS. 5-8, exemplary embodiments of a space suit helmet 102 of FIG. 1 worn by a user 502 (e.g., an astronaut) according to the inventive concepts disclosed herein are depicted. In addition to one or more of the elements shown in FIGS. 1-4, the space suit helmet 102 may include at least one ring 504, a first surface structure (e.g., an outer surface structure; e.g., an impact bubble 602), a second surface structure (e.g., an inner surface structure; e.g., a pressure bubble 606), a gap 604 between the first surface structure and the second surface structure, an interior cavity 608, and/or wires 610 (e.g., connecting the optical system 112 to the processor 110).
  • For example, the first surface structure (e.g., an outer surface structure; e.g., an impact bubble 602) and the second surface structure (e.g., an inner surface structure; e.g., a pressure bubble 606) may be coupled to the ring 504. The inner surface structure (e.g., the pressure bubble 606) may be configured to maintain an oxygenated environment within the interior cavity 608 of the space suit helmet 102. The outer surface structure (e.g., the impact bubble 602) may be configured to absorb impacts. Each of the inner surface structure and the outer surface structure may be at least translucent (e.g., translucent or transparent), such that the user 502 is able to see through the inner surface structure and the outer surface structure. The inner surface structure and the outer surface structure may be any suitable shape, such as having at least one flat surface, at least one curved surface, or a combination thereof. For example, the outer surface structure may be the impact bubble 602, and the inner surface structure may be the pressure bubble 606.
  • The waveguide display 111 may be implemented in and/or on the space suit helmet 102. The waveguide display 111 may be positioned at any suitable location, such as in a direct forward view or some other location (e.g., off to a side of the user 502 and/or down at chin level of the user 502). In some embodiments, the waveguide display 111 may be adjustably positionable (e.g., tiltable and/or movable in a lateral and/or vertical direction, such as by use of a motor, magnets, a pivot joint, and/or a track); in some of such embodiments, the processor 110 may be configured to control an orientation and/or a position of a viewable portion of the waveguide display 111; in other of such embodiments, the orientation and/or the position of a viewable portion of the waveguide display 111 may be manually adjusted.
  • As shown in FIG. 5, the waveguide display 111 may be mounted within space suit helmet 102 in the interior cavity 608. For example, the waveguide display 111 may be mounted to the space suit helmet 102 near the ring 504 at eye level such that (a) when the user 502 is looking straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502.
  • As shown in FIG. 6, the waveguide display 111 may be mounted within the space suit helmet 102 in between the first surface structure (e.g., the impact bubble 602) and the second surface structure (e.g., the pressure bubble 606). For example, the waveguide display 111 may be mounted to the ring 504 of the space suit helmet 102. For example, the waveguide display 111 may be positionable at any suitable height and lateral position. For example, the waveguide display 111 may be positioned at eye level such that (a) when the user 502 is looking straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502. For example, the waveguide display 111 may be positioned at chin level such that (a) when the user 502 is looking down and straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks down and to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502. For example, as shown in FIG. 6, positioning the optical system 112 outside of the oxygenated interior cavity 608 may reduce a likelihood of an electrical spark causing combustion. For example, as shown in FIG. 6, positioning the waveguide display 111 between the first surface structure (e.g., the impact bubble 602) and the second surface structure (e.g., the pressure bubble 606) may protect the waveguide display 111 and maximize a volume inside of the pressure bubble 606 for the user 502 to move around in the pressure bubble 606 and not bump into the waveguide display 111.
  • As shown in FIG. 7, the waveguide display 111 may be mounted within the space suit helmet 102. For example, the optical system 112 may be mounted in between the first surface structure (e.g., the impact bubble 602) and the second surface structure (e.g., the pressure bubble 606). For example, the waveguide 114 may be mounted at least in part in the interior cavity 608. In some embodiments, the optical system 112 may be configured to project images through the inner surface structure and the waveguide 114 to be displayed to the user. In some embodiments, the waveguide 114 may extend through the inner surface structure to within the interior cavity 608. For example, the optical system 112 may be mounted to the ring 504 of the space suit helmet 102. For example, the waveguide 114 may be positionable at any suitable height and lateral position. For example, the waveguide 114 may be positioned at eye level such that (a) when the user 502 is looking straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502. For example, the waveguide 114 may be positioned at chin level such that (a) when the user 502 is looking down and straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks down and to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502. For example, as shown in FIG. 7, positioning the optical system 112 outside of the oxygenated interior cavity 608 may reduce a likelihood of an electrical spark causing combustion.
  • As shown in FIG. 8, the waveguide display 111 may be mounted on an exterior of the space suit helmet 102 such that the outer surface structure is positioned between the waveguide display 111 and the inner surface structure. For example, the waveguide display 111 may be mounted to an exterior of the ring 504 of the space suit helmet 102. For example, the waveguide display 111 may be positionable at any suitable height and lateral position. For example, the waveguide display 111 may be positioned at eye level such that (a) when the user 502 is looking straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502. For example, the waveguide display 111 may be positioned at chin level such that (a) when the user 502 is looking down and straight ahead, the waveguide display 111 is in a field of view of at least one eye of the user 502 or (b) when the user 502 looks down and to the side (e.g., the left or right side), the waveguide display 111 is in a field of view of at least one eye of the user 502. For example, as shown in FIG. 8, positioning the optical system 112 outside of the oxygenated interior cavity 608 may reduce a likelihood of an electrical spark causing combustion.
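The eye-level and chin-level placements described above reduce to a simple visibility test: the waveguide display is in a field of view of an eye when the angular offset between the user's gaze direction and the display's mount direction is within the eye's field of view. A minimal sketch of that test follows; the angle convention and the half-field width are illustrative assumptions, not values from this disclosure:

```python
def display_in_fov(gaze_azimuth_deg: float,
                   display_azimuth_deg: float,
                   half_fov_deg: float = 55.0) -> bool:
    """Return True if a display mounted at display_azimuth_deg (measured
    from the helmet's forward axis) falls within the eye's horizontal
    field of view for the given gaze direction.

    half_fov_deg is an assumed monocular half-field, not a value taken
    from the patent.
    """
    # Smallest angular separation between the gaze and display directions,
    # wrapped into [0, 180].
    delta = abs((display_azimuth_deg - gaze_azimuth_deg + 180.0) % 360.0 - 180.0)
    return delta <= half_fov_deg
```

Under this sketch, a display mounted straight ahead is visible while looking forward, and a display mounted far to one side becomes visible only when the user glances toward that side, matching cases (a) and (b) above.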
  • Referring now to FIG. 9, an exemplary embodiment of a method 900 according to the inventive concepts disclosed herein may include one or more of the following steps. Additionally, for example, some embodiments may include performing one or more instances of the method 900 iteratively, concurrently, and/or sequentially. Additionally, for example, at least some of the steps of the method 900 may be performed in parallel and/or concurrently. Additionally, in some embodiments, at least some of the steps of the method 900 may be performed non-sequentially. Additionally, in some embodiments, at least some of the steps of the method 900 may be performed in sub-steps of providing various components.
  • A step 902 may include providing a space suit helmet, comprising a surface structure, an inner surface structure, and a waveguide display, wherein the inner surface structure is configured to maintain an oxygenated environment within an interior cavity of the space suit helmet, wherein a user is able to see through the inner surface structure and the surface structure, wherein the waveguide display is implemented at least one of in or on the space suit helmet, wherein the waveguide display comprises a waveguide and an optical system configured to project images at least through the waveguide to be displayed to the user.
  • Further, the method 900 may include any of the operations disclosed throughout.
  • As will be appreciated from the above, embodiments of the inventive concepts disclosed herein may be directed to a method and a system including a space suit helmet having a waveguide display.
  • As used throughout and as would be appreciated by those skilled in the art, "at least one non-transitory computer-readable medium" may refer to at least one non-transitory computer-readable medium (e.g., at least one computer-readable medium implemented as hardware; e.g., at least one non-transitory processor-readable medium, at least one memory (e.g., at least one nonvolatile memory, at least one volatile memory, or a combination thereof; e.g., at least one random-access memory, at least one flash memory, at least one read-only memory (ROM) (e.g., at least one electrically erasable programmable read-only memory (EEPROM)), at least one on-processor memory (e.g., at least one on-processor cache, at least one on-processor buffer, at least one on-processor flash memory, at least one on-processor EEPROM, or a combination thereof), or a combination thereof), at least one storage device (e.g., at least one hard-disk drive, at least one tape drive, at least one solid-state drive, at least one flash drive, at least one readable and/or writable disk of at least one optical drive configured to read from and/or write to the at least one readable and/or writable disk, or a combination thereof), or a combination thereof).
  • As used throughout, “at least one” means one or a plurality of; for example, “at least one” may comprise one, two, three, . . . , one hundred, or more. Similarly, as used throughout, “one or more” means one or a plurality of; for example, “one or more” may comprise one, two, three, . . . , one hundred, or more. Further, as used throughout, “zero or more” means zero, one, or a plurality of; for example, “zero or more” may comprise zero, one, two, three, . . . , one hundred, or more.
  • In the present disclosure, the methods, operations, and/or functionality disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods, operations, and/or functionality disclosed are examples of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the methods, operations, and/or functionality can be rearranged while remaining within the scope of the inventive concepts disclosed herein. The accompanying claims may present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
  • It is to be understood that embodiments of the methods according to the inventive concepts disclosed herein may include one or more of the steps described herein. Further, such steps may be carried out in any desired order and two or more of the steps may be carried out simultaneously with one another. Two or more of the steps disclosed herein may be combined in a single step, and in some embodiments, one or more of the steps may be carried out as two or more sub-steps. Further, other steps or sub-steps may be carried out in addition to, or as substitutes for, one or more of the steps disclosed herein.
  • From the above description, it is clear that the inventive concepts disclosed herein are well adapted to carry out the objects and to attain the advantages mentioned herein as well as those inherent in the inventive concepts disclosed herein. While presently preferred embodiments of the inventive concepts disclosed herein have been described for purposes of this disclosure, it will be understood that numerous changes may be made which will readily suggest themselves to those skilled in the art and which are accomplished within the broad scope and coverage of the inventive concepts disclosed and claimed herein.
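As an illustrative aside (not part of the claimed subject matter), the eye-tracking-based control of the optical system recited in this disclosure can be sketched as a small controller that activates the display only while the user's gaze dwells on it. The dwell threshold and the activate/deactivate policy below are assumptions made for the sketch, not features recited herein:

```python
from dataclasses import dataclass


@dataclass
class EyeTrackingSample:
    """One sample of eye tracking data (fields are illustrative)."""
    looking_at_display: bool  # gaze currently intersects the waveguide display
    dwell_ms: float           # how long the gaze has dwelled on the display


class OpticalSystemController:
    """Toggles the optical system between active and deactivated states
    based on eye tracking data, in the spirit of the processor-controlled
    operation described in this disclosure."""

    def __init__(self, dwell_threshold_ms: float = 300.0):
        self.dwell_threshold_ms = dwell_threshold_ms
        self.active = False

    def update(self, sample: EyeTrackingSample) -> bool:
        # Activate only after a short dwell, so passing glances do not
        # flash the display on; deactivate as soon as the gaze leaves.
        if sample.looking_at_display and sample.dwell_ms >= self.dwell_threshold_ms:
            self.active = True
        elif not sample.looking_at_display:
            self.active = False
        return self.active
```

In use, the controller would be fed samples from the helmet's eye tracking system and its output would gate the graphical data stream to the optical system; voice command data or suit tracking data could gate the same state by analogous checks.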

Claims (21)

1. A system, comprising:
a space suit helmet, comprising:
a surface structure;
an inner surface structure, wherein the inner surface structure is configured to maintain an oxygenated environment within an interior cavity of the space suit helmet, wherein a user is able to see through the inner surface structure and the surface structure; and
a waveguide display implemented at least one of in or on the space suit helmet, the waveguide display comprising:
a waveguide positioned at least in part in the interior cavity; and
an optical system configured to project images at least through the waveguide to be displayed to the user.
2. (canceled)
3. (canceled)
4. The system of claim 1, wherein the optical system is positioned between the inner surface structure and the surface structure, wherein the optical system is configured to project images through the inner surface structure and the waveguide to be displayed to the user.
5. The system of claim 1, wherein the optical system is positioned between the inner surface structure and the surface structure, wherein the waveguide extends through the inner surface structure to within the interior cavity.
6. (canceled)
7. The system of claim 1, wherein the waveguide includes a tint layer.
8. The system of claim 7, wherein the tint layer is an electrochromic layer configured to shift from transparent to less transparent based on an electric stimulus.
9. The system of claim 1, wherein the space suit helmet further comprises at least one processor configured to output graphical data to the optical system, wherein the space suit helmet further comprises an eye tracking system configured to track eye movement of the user, wherein the eye tracking system is configured to output eye tracking data to the at least one processor, wherein the at least one processor is further configured to control operation of the optical system based at least on the eye tracking data.
10. The system of claim 9, wherein the at least one processor is further configured to control whether the optical system is in an active state or deactivated state based at least on the eye tracking data.
11. The system of claim 9, wherein the at least one processor is further configured to receive a user input based at least on the eye tracking data, wherein the at least one processor is further configured to control content displayed by the waveguide display based at least on the user input.
12. The system of claim 9, wherein the space suit helmet further comprises a voice recognition system configured to receive voice commands from the user, wherein the voice recognition system is configured to output voice command data to the at least one processor, wherein the at least one processor is further configured to control operation of the optical system based at least on the voice command data.
13. The system of claim 9, wherein the space suit helmet further comprises a suit tracking system configured to track a direction of where a field of view (FOV) through the waveguide display is pointing and output suit tracking data to the at least one processor, wherein the at least one processor is further configured to control operation of the optical system based at least on the suit tracking data.
14. The system of claim 1, wherein the space suit helmet further comprises at least one processor configured to output graphical data to the optical system, wherein the at least one processor is further configured to steer a field of view of the waveguide display.
15. A method, comprising:
providing a space suit helmet, comprising a surface structure, an inner surface structure, and a waveguide display, wherein the inner surface structure is configured to maintain an oxygenated environment within an interior cavity of the space suit helmet, wherein a user is able to see through the inner surface structure and the surface structure, wherein the waveguide display is implemented at least one of in or on the space suit helmet, wherein the waveguide display comprises a waveguide positioned at least in part in the interior cavity and an optical system configured to project images at least through the waveguide to be displayed to the user.
16. A system, comprising:
a space suit helmet, comprising:
a surface structure;
an inner surface structure, wherein the inner surface structure is configured to maintain an oxygenated environment within an interior cavity of the space suit helmet, wherein a user is able to see through the inner surface structure and the surface structure; and
a waveguide display implemented at least one of in or on the space suit helmet, wherein the waveguide display is positioned on an exterior of the space suit helmet such that the surface structure is positioned between the waveguide display and the inner surface structure, the waveguide display comprising:
a waveguide; and
an optical system configured to project images at least through the waveguide to be displayed to the user.
17. A system, comprising:
a space suit helmet, comprising:
a surface structure;
an inner surface structure, wherein the inner surface structure is configured to maintain an oxygenated environment within an interior cavity of the space suit helmet, wherein a user is able to see through the inner surface structure and the surface structure;
a gap between the surface structure and the inner surface structure; and
a waveguide display positioned in the gap between the inner surface structure and the surface structure, the waveguide display comprising:
a waveguide; and
an optical system configured to project images at least through the waveguide to be displayed to the user.
18. The system of claim 17, wherein the waveguide includes a tint layer.
19. The system of claim 18, wherein the space suit helmet further comprises at least one processor configured to output graphical data to the optical system, wherein the space suit helmet further comprises an eye tracking system configured to track eye movement of the user, wherein the eye tracking system is configured to output eye tracking data to the at least one processor, wherein the at least one processor is further configured to control operation of the optical system based at least on the eye tracking data, wherein the space suit helmet further comprises a suit tracking system configured to track a direction of where a field of view (FOV) through the waveguide display is pointing and output suit tracking data to the at least one processor, wherein the at least one processor is further configured to control operation of the optical system based at least on the suit tracking data.
20. The system of claim 19, wherein the tint layer is an electrochromic layer, wherein the at least one processor is further configured to control an amount of tint of the electrochromic layer based at least on a sensed brightness and the eye tracking data.
21. The system of claim 20, wherein the tint layer is an electrochromic layer, wherein the at least one processor is further configured to control the amount of tint of the electrochromic layer based at least on the sensed brightness, the eye tracking data, voice command data, and the suit tracking data.
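
As a purely illustrative sketch (outside the claimed subject matter), the electrochromic tint control of claims 20-21 can be modeled as a mapping from sensed brightness to a tint command that is eased while the user is looking at the display. The 0-to-1 scales, the clamping, and the gaze-based reduction factor are assumptions for the sketch:

```python
def tint_level(sensed_brightness: float,
               looking_at_display: bool,
               max_tint: float = 0.9) -> float:
    """Map sensed brightness (0..1) to an electrochromic tint command
    (0 = fully transparent, 1 = fully opaque). All scale factors here
    are illustrative, not values from the patent."""
    # Clamp the sensor reading, then scale to the maximum allowed tint.
    level = min(max(sensed_brightness, 0.0), 1.0) * max_tint
    if looking_at_display:
        # Ease the tint while the user reads the display, so projected
        # imagery stays legible against the darkened layer.
        level *= 0.5
    return level
```

A processor implementing claim 21 would additionally fold voice command data and suit tracking data into the same command, for example by overriding the computed level when the user speaks a tint command or when the suit FOV swings toward the sun.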

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/932,241 US11243400B1 (en) 2020-07-17 2020-07-17 Space suit helmet having waveguide display
EP21186009.3A EP3951477A3 (en) 2020-07-17 2021-07-16 Space suit helmet having waveguide display


Publications (2)

Publication Number Publication Date
US20220019078A1 true US20220019078A1 (en) 2022-01-20
US11243400B1 US11243400B1 (en) 2022-02-08

Family

ID=77300720

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/932,241 Active US11243400B1 (en) 2020-07-17 2020-07-17 Space suit helmet having waveguide display

Country Status (2)

Country Link
US (1) US11243400B1 (en)
EP (1) EP3951477A3 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230166874A1 (en) * 2021-11-30 2023-06-01 Hamilton Sundstrand Space Systems International, Inc. Atmospheric suit helmet display and display-based control
US20230204958A1 (en) * 2021-12-28 2023-06-29 David Fliszar Eyewear electronic tinting lens with integrated waveguide

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160109943A1 (en) * 2014-10-21 2016-04-21 Honeywell International Inc. System and method for controlling visibility of a proximity display
US20160131906A1 (en) * 2014-11-07 2016-05-12 Honeywell International Inc. Compact proximity display utilizing image transfer
US20170038593A1 (en) * 2014-07-18 2017-02-09 Vuzix Corporation Near-Eye Display With Self-Emitting Microdisplay Engine
US10067560B1 (en) * 2017-01-25 2018-09-04 Rockwell Collins, Inc. Head tracker for dismounted users
US20190250408A1 (en) * 2018-02-12 2019-08-15 Thales Peripheral vision in a human-machine interface

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4897715A (en) 1988-10-31 1990-01-30 General Electric Company Helmet display
US5091719A (en) 1989-12-26 1992-02-25 General Electric Company Helmet display
US5420828A (en) 1992-06-25 1995-05-30 Geiger; Michael B. Viewing screen assembly
US5535025A (en) 1994-02-01 1996-07-09 Hughes Training, Inc. Helmet mounted off axis liquid crystal display with a fiber optic wedge and a curved reflector
US5715094A (en) 1996-12-03 1998-02-03 Hughes Electronics Lensless helmet/head mounted display
US8643948B2 (en) 2007-04-22 2014-02-04 Lumus Ltd. Collimating optical device and system
AU2010243329B2 (en) 2009-04-29 2014-01-30 Snap Inc. Head mounted display
US9445639B1 (en) 2012-11-08 2016-09-20 Peter Aloumanis Embedding intelligent electronics within a motorcyle helmet
US9247779B1 (en) 2012-11-08 2016-02-02 Peter Aloumanis Enhanced global positioning system (GPS) based functionality for helmets
US9244280B1 (en) 2014-03-25 2016-01-26 Rockwell Collins, Inc. Near eye display system and method for display enhancement or redundancy
US9500868B2 (en) 2014-07-10 2016-11-22 Honeywell International Inc. Space suit helmet display system
US9626936B2 (en) * 2014-08-21 2017-04-18 Microsoft Technology Licensing, Llc Dimming module for augmented and virtual reality
US9733475B1 (en) 2014-09-08 2017-08-15 Rockwell Collins, Inc. Curved waveguide combiner for head-mounted and helmet-mounted displays (HMDS), a collimated virtual window, or a head up display (HUD)
WO2016138438A1 (en) 2015-02-27 2016-09-01 LAFORGE Optical, Inc. Augmented reality eyewear
US10324290B2 (en) 2015-12-17 2019-06-18 New Skully, Inc. Situational awareness systems and methods
US9977427B2 (en) * 2016-01-05 2018-05-22 The Charles Stark Draper Laboratory, Inc. System and method for assisted extravehicular activity self-return
CN110383141A (en) 2016-12-29 2019-10-25 曼戈泰克有限责任公司 Improvement type head-up display system for being used together with the helmet
US10642038B1 (en) 2017-01-30 2020-05-05 Rockwell Collins, Inc. Waveguide based fused vision system for a helmet mounted or head worn application
GB201714572D0 (en) 2017-09-11 2017-10-25 Bae Systems Plc Head-mounted display and control apparatus and method
US10325560B1 (en) 2017-11-17 2019-06-18 Rockwell Collins, Inc. Head wearable display device
CN107885124B (en) * 2017-11-21 2020-03-24 中国运载火箭技术研究院 Brain and eye cooperative control method and system in augmented reality environment
CN207571392U (en) 2017-12-13 2018-07-03 深圳市炬视科技有限公司 Intelligent safety helmet
CN109512074B (en) 2018-12-25 2024-02-20 安徽中航显示技术有限公司 Novel fire-fighting helmet based on optical waveguide display technology



Also Published As

Publication number Publication date
EP3951477A3 (en) 2022-05-04
EP3951477A2 (en) 2022-02-09
US11243400B1 (en) 2022-02-08

Similar Documents

Publication Publication Date Title
US20220057831A1 (en) Headset Computer That Uses Motion And Voice Commands To Control Information Display And Remote Devices
EP3182051B1 (en) Methods of vestibulo-ocular reflex correction in display systems
EP3029550B1 (en) Virtual reality system
US9959591B2 (en) Display apparatus, method for controlling display apparatus, and program
US10740971B2 (en) Augmented reality field of view object follower
US10303435B2 (en) Head-mounted display device, method of controlling head-mounted display device, and computer program
US9910513B2 (en) Stabilizing motion of an interaction ray
KR102373940B1 (en) Head-mounted display with electrochromic dimming module for augmented and virtual reality perception
WO2016063504A1 (en) Head-mounted type display device and method for controlling same, and computer program
KR20230117236A (en) Eyewear including sign language-to-speech translation
US9442631B1 (en) Methods and systems for hands-free browsing in a wearable computing device
JP6459421B2 (en) Head-mounted display device, method for controlling head-mounted display device, computer program
EP3951477A2 (en) Space suit helmet having waveguide display
US20160133051A1 (en) Display device, method of controlling the same, and program
US9500868B2 (en) Space suit helmet display system
JP6492673B2 (en) Head-mounted display device, method for controlling head-mounted display device, computer program
US10901225B1 (en) Systems and methods for positioning a head-mounted display
JP2017091433A (en) Head-mounted type display device, method of controlling head-mounted type display device, and computer program
CN106154548A (en) Clairvoyant type head-mounted display apparatus
US11435593B1 (en) Systems and methods for selectively augmenting artificial-reality experiences with views of real-world environments
US9720233B2 (en) Compact proximity display utilizing image transfer
JP2016091348A (en) Head-mounted display device and control method for the same as well as computer program
JP6394108B2 (en) Head-mounted display device, control method therefor, and computer program
WO2020137088A1 (en) Head-mounted display, display method, and display system
US11619814B1 (en) Apparatus, system, and method for improving digital head-mounted displays

Legal Events

Date Code Title Description
AS Assignment

Owner name: ROCKWELL COLLINS, INC., IOWA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KEITH, CHRISTOPHER A.;REEL/FRAME:053243/0711

Effective date: 20200716

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE