US20230224364A1 - Dynamic sensor network in atmospheric suit - Google Patents

Dynamic sensor network in atmospheric suit

Info

Publication number
US20230224364A1
Authority
US
United States
Prior art keywords
suit
atmospheric
network
controller
ports
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US17/571,674
Other versions
US11716386B1 (en)
Inventor
Monica Torralba
Ashley Rose Himmelmann
Jake Rohrig
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hamilton Sundstrand Corp
Original Assignee
Hamilton Sundstrand Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hamilton Sundstrand Corp filed Critical Hamilton Sundstrand Corp
Priority to US17/571,674 priority Critical patent/US11716386B1/en
Assigned to HAMILTON SUNDSTRAND CORPORATION reassignment HAMILTON SUNDSTRAND CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HIMMELMANN, ASHLEY ROSE, TORRALBA, Monica, ROHRIG, JAKE
Priority to EP23150949.8A priority patent/EP4210342A1/en
Publication of US20230224364A1 publication Critical patent/US20230224364A1/en
Application granted granted Critical
Publication of US11716386B1 publication Critical patent/US11716386B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04QSELECTING
    • H04Q9/00Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64GCOSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G6/00Space suits
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • AHUMAN NECESSITIES
    • A62LIFE-SAVING; FIRE-FIGHTING
    • A62BDEVICES, APPARATUS OR METHODS FOR LIFE-SAVING
    • A62B17/00Protective clothing affording protection against heat or harmful chemical agents or for use at high altitudes
    • A62B17/008High-altitude pressure suits
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D1/00Dropping, ejecting, releasing, or receiving articles, liquids, or the like, in flight
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64GCOSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00Cosmonautic vehicles
    • B64G1/22Parts of, or equipment specially adapted for fitting in or to, cosmonautic vehicles
    • B64G1/46Arrangements or adaptations of devices for control of environment or living conditions

Definitions

  • Exemplary embodiments pertain to the art of atmospheric suits and, in particular, to a dynamic sensor network in an atmospheric suit.
  • an atmospheric suit is used not only for protection against impacts but also to maintain a habitable environment.
  • an extravehicular mobility unit which includes a helmet and full body suit supplied by an oxygen tank, maintains an environment that sustains the astronaut.
  • a system in an atmospheric suit includes a controller within the atmospheric suit.
  • the system also includes a hub and spoke network arranged within the atmospheric suit.
  • the controller is the hub of the network and each spoke of the network represents wiring that leads to one of a plurality of ports accessible from outside the atmospheric suit.
  • the controller obtains data from one or more sensors coupled to one or more of the plurality of ports.
  • the one or more sensors includes a camera, a proximity sensor, a range finder, or a Geiger counter and the controller processes the data from the one or more sensors to obtain information.
  • a wearer of the atmospheric suit specifies processing of the data.
  • the controller provides information based on the data to a wearer of the atmospheric suit as output to one or more output devices.
  • the one or more output devices include audio, video, or haptic output devices.
  • one of the one or more sensors is a camera
  • one of the one or more output devices is a display device
  • the controller obtains images from the camera and provides the information based on the images for display to the wearer on the display device.
  • system also includes microcontrollers corresponding with one or more of the plurality of ports.
  • the network includes redundant communication between two or more of the microcontrollers or between two or more of the ports.
  • system also includes a cover on each of the plurality of ports.
  • a method of assembling a system in an atmospheric suit includes configuring a controller within the atmospheric suit.
  • the method also includes arranging a network as a hub and spoke network within the atmospheric suit.
  • the controller is the hub of the network and each spoke of the network represents wiring that leads to one of a plurality of ports accessible from outside the atmospheric suit.
  • the configuring the controller includes the controller obtaining data from one or more sensors coupled to one or more of the plurality of ports.
  • the one or more sensors includes a camera, a proximity sensor, a range finder, or a Geiger counter and the configuring the controller includes the controller processing the data from the one or more sensors to obtain information.
  • the configuring the controller includes the controller obtaining an indication of the processing of the data from a wearer of the atmospheric suit.
  • the configuring the controller includes the controller providing information based on the data to a wearer of the atmospheric suit as output to one or more output devices.
  • the one or more output devices include audio, video, or haptic output devices.
  • one of the one or more sensors is a camera
  • one of the one or more output devices is a display device
  • the configuring the controller includes the controller obtaining images from the camera and providing the information based on the images for display to the wearer on the display device.
  • the arranging the network includes disposing microcontrollers corresponding with one or more of the plurality of ports.
  • the arranging the network includes configuring redundant communication between two or more of the microcontrollers or between two or more of the ports.
  • the method also includes disposing a cover on each of the plurality of ports.
  • FIG. 1 shows an atmospheric suit that includes a dynamic sensor network according to one or more embodiments
  • FIG. 2 is a block diagram of an exemplary dynamic sensor network according to one or more embodiments.
  • FIG. 3 is a block diagram of another exemplary dynamic sensor network according to one or more embodiments.
  • an atmospheric suit maintains a habitable environment for the wearer in different applications.
  • the atmospheric suit may be an EMU. While the atmospheric suit is essential in an otherwise uninhabitable environment, it can be bulky and restrict spatial awareness.
  • the helmet of the atmospheric suit is fixed such that a wearer moves their head without moving the helmet (i.e., the transparent portion of the helmet).
  • looking to the side or behind requires moving the body (and, correspondingly, the atmospheric suit) to expose the side or back to the transparent portion of the helmet.
  • sensors may be needed for safety or data-gathering. These sensors may be difficult to carry and operate in the atmospheric suit.
  • Embodiments of the systems and methods detailed herein relate to a dynamic sensor network in an atmospheric suit.
  • the network may be structured in a hub and spoke configuration with a controller of the atmospheric suit acting as the hub.
  • Each spoke may lead to a port accessible outside the atmospheric suit, and different sensors may be coupled to the port, as needed.
  • a battery of the atmospheric suit may act as the hub with the spokes facilitating charging of the sensors.
  • FIG. 1 shows an atmospheric suit 100 that includes a dynamic sensor network 200 ( FIG. 2 ) according to one or more embodiments.
  • the exemplary atmospheric suit 100 shown in FIG. 1 is an EMU 105 .
  • Systems that are affixed as part of the EMU 105 include a primary life support system (PLSS) 120 and a display and control module (DCM) 130 . These systems 120 , 130 , along with components of the EMU 105 , create a habitable environment for a wearer performing extravehicular activity in space.
  • applications for the controller system architecture may also include underwater (e.g., in an atmospheric diving suit), earth-based (e.g., in a hazmat suit or contamination suit), high-altitude (e.g., in a flight suit), and sub-surface environments.
  • generally, any suit that includes the helmet to maintain a habitable environment is referred to as an atmospheric suit.
  • the EMU 105 includes a helmet 110 , shown with an exemplary in-helmet display as one exemplary output device 115 a and a speaker as another exemplary output device 115 b (generally referred to as output device 115 ).
  • the helmet 110 has a transparent inner bubble that maintains the environment in the EMU 105 , as well as a transparent outer bubble that protects against impacts.
  • the display device may include a screen on a swingarm that allows the screen to be raised to eye level for viewing or may include an organic light emitting diode (OLED) array.
  • An OLED display device may be inside the helmet, with the inner bubble acting as a substrate, or may be in the gap between the inner and outer bubbles, with the outer bubble acting as the substrate.
  • a display device may also be on a swingarm or otherwise affixed on the outside of the helmet 110 .
  • the EMU 105 may include two or more display devices whose number and location are not intended to be limited by the discussion of exemplary embodiments.
  • the speaker may be inside the inner bubble or may include a diaphragm on the outside of the inner bubble that vibrates to produce an audio output.
  • the numbers, types, and locations of speakers are not intended to be limited by the examples.
  • haptic or combination output devices 115 may be provided in the EMU 105 .
  • the numbers, types, and locations of output devices 115 that provide information to the wearer of the EMU 105 are not intended to be limited by the discussion of specific examples.
  • One or more sensors 140 may dynamically be affixed to the EMU 105 . Dynamic refers to the fact that the numbers and positions of sensors 140 may be changed at any time, even during extravehicular activity.
  • Two exemplary sensors 140 are indicated in FIG. 1 . Also indicated is an unused port 155 . While not visible, each of the sensors 140 is coupled to the dynamic sensor network 200 via a port 155 . As the expanded view indicates, the port 155 may have a cover 150 when unused to prevent dust or other particles from entering the port 155 .
  • In FIG. 1, one sensor 140 (e.g., Geiger counter) is shown affixed to an arm of the EMU 105 and another sensor 140 (e.g., camera) is shown affixed at the hip.
  • the camera may be angled down so that the path ahead of the EMU 105 may be viewed in real time on a display used as the output device 115 while walking.
  • the dynamic sensor network 200 that facilitates obtaining data from these and other sensors 140 and providing information via one or more output devices 115 is detailed with reference to FIG. 2 .
  • FIG. 2 is a block diagram of an exemplary dynamic sensor network 200 according to one or more embodiments.
  • the dynamic sensor network 200 may be in a spoke and hub arrangement.
  • a controller 210 acts as the hub.
  • the controller 210 may be part of the DCM 130 , for example, with one or more processors and memory devices that facilitate obtaining data from one or more of the sensors 140 via communication lines 220 (i.e., spokes) and providing information to one or more output devices 115 .
  • the exemplary dynamic sensor network 200 is shown with sensors 140 a through 140 n (e.g., camera, infrared camera, proximity sensor, Geiger counter, rangefinder) while ports 155 a through 155 x are shown. That is, some ports 155 may be unused during a given mission.
  • FIG. 2 also shows output devices 115 a through 115 m (e.g., audio, visual, haptic).
  • one of the sensors 140 may be a camera coupled to a port 155 near the hand of the EMU 105 . This camera may be used to see around objects in a cave or the like.
  • the images provided as data over the communication line 220 corresponding to the data port 155 may be projected to an OLED display on the inner bubble of the helmet 110 as the output device 115 .
  • a sensor 140 may be a Geiger counter coupled to one of the ports 155 .
  • the radiation readings provided to the controller 210 over the corresponding communication line 220 may be checked by the controller 210 to determine if a threshold value has been crossed. If so, the controller 210 may provide an audible alert to an output device 115 that is a speaker or provide haptic feedback to an output device 115 that implements a vibration.
  • the type of data provided by a given sensor 140 may determine the analysis performed by the controller 210 , as well as the output provided to an output device 115 .
  • the data may be provided with an identifier or may be recognizable based on the content.
  • the controller 210 may essentially implement a mapping of the processing that is appropriate for each data type. Input from the wearer of the EMU 105 , provided via the DCM 130 , for example, may affect the processing that is performed by the controller 210 .
  • the table below provides exemplary processing that may be performed based on the data obtained by the controller 210 .
  • the examples provided for explanatory purposes are not intended to limit additional sensors 140 , processing by the controller 210 , or additional output devices 115 .
  • data from sensor 140 | processing performed by controller 210 | output to output device 115
    image data from camera | pass through or any specified image processing | display to one or more display devices
    radiation level from Geiger counter | compare with a predefined threshold value | audio, video, haptic alert
    proximity (distance) from proximity sensor | pass through or compare with a predefined threshold distance | display distance to closest object or provide alert based on a distance below the predefined threshold distance
    range to an object from rangefinder | pass through range | display to one or more display devices
  • a battery 230 of the EMU 105 may be a hub.
  • the battery may be part of the PLSS 120 , for example.
  • One or more sensors 140 may be powered or charged via power lines 240 from the battery 230 to corresponding ports 155 .
  • FIG. 3 is a block diagram of an exemplary dynamic sensor network 300 according to one or more embodiments.
  • FIG. 3 shows additional or alternate features as compared with the exemplary dynamic sensor network 200 shown in FIG. 2 .
  • the fundamental components of the controller 210 and communication lines 220 from various ports 155 to which sensors 140 may couple are shown.
  • each port 155 may include an optional microcontroller 310 a through 310 x (generally referred to as 310 ).
  • data from each sensor 140 that is coupled to a port 155 may be routed through a microcontroller 310 via a communication line 220 to the controller 210 .
  • redundant wired or wireless communication lines 320 may be included between ports or, more specifically, between microcontrollers 310 . That is, sensors 140 may communicate data from a corresponding port 155 to another port 155 for relay to the controller 210 via a communication line 220 from the other port 155 . This may be necessitated due to failure of the communication line 220 from the port 155 corresponding with the sensor 140 , for example.
  • microcontrollers 310 may communicate data obtained from a corresponding sensor 140 to another microcontroller 310 . The data may be relayed to the controller 210 or may be combined with data from the sensor 140 corresponding to the other microcontroller 310 prior to being provided to the controller 210 , for example.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Medical Informatics (AREA)
  • Signal Processing (AREA)
  • Remote Sensing (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Arrangements For Transmission Of Measured Signals (AREA)
  • Helmets And Other Head Coverings (AREA)

Abstract

A system in an atmospheric suit includes a controller within the atmospheric suit. The system also includes a network that is a hub and spoke network arranged within the atmospheric suit. The controller is the hub of the network and each spoke of the network represents wiring that leads to one of a plurality of ports accessible from outside the atmospheric suit.

Description

    BACKGROUND
  • Exemplary embodiments pertain to the art of atmospheric suits and, in particular, to a dynamic sensor network in an atmospheric suit.
  • In some environments and applications, an atmospheric suit is used not only for protection against impacts but also to maintain a habitable environment. In a space application, for example, an extravehicular mobility unit (EMU), which includes a helmet and full body suit supplied by an oxygen tank, maintains an environment that sustains the astronaut.
  • BRIEF DESCRIPTION
  • In one exemplary embodiment, a system in an atmospheric suit includes a controller within the atmospheric suit. The system also includes a hub and spoke network arranged within the atmospheric suit. The controller is the hub of the network and each spoke of the network represents wiring that leads to one of a plurality of ports accessible from outside the atmospheric suit.
  • In addition to one or more of the features described herein, the controller obtains data from one or more sensors coupled to one or more of the plurality of ports.
  • In addition to one or more of the features described herein, the one or more sensors includes a camera, a proximity sensor, a range finder, or a Geiger counter and the controller processes the data from the one or more sensors to obtain information.
  • In addition to one or more of the features described herein, a wearer of the atmospheric suit specifies processing of the data.
  • In addition to one or more of the features described herein, the controller provides information based on the data to a wearer of the atmospheric suit as output to one or more output devices.
  • In addition to one or more of the features described herein, the one or more output devices include audio, video, or haptic output devices.
  • In addition to one or more of the features described herein, one of the one or more sensors is a camera, one of the one or more output devices is a display device, and the controller obtains images from the camera and provides the information based on the images for display to the wearer on the display device.
  • In addition to one or more of the features described herein, the system also includes microcontrollers corresponding with one or more of the plurality of ports.
  • In addition to one or more of the features described herein, the network includes redundant communication between two or more of the microcontrollers or between two or more of the ports.
  • In addition to one or more of the features described herein, the system also includes a cover on each of the plurality of ports.
  • In another exemplary embodiment, a method of assembling a system in an atmospheric suit includes configuring a controller within the atmospheric suit. The method also includes arranging a network as a hub and spoke network within the atmospheric suit. The controller is the hub of the network and each spoke of the network represents wiring that leads to one of a plurality of ports accessible from outside the atmospheric suit.
  • In addition to one or more of the features described herein, the configuring the controller includes the controller obtaining data from one or more sensors coupled to one or more of the plurality of ports.
  • In addition to one or more of the features described herein, the one or more sensors includes a camera, a proximity sensor, a range finder, or a Geiger counter and the configuring the controller includes the controller processing the data from the one or more sensors to obtain information.
  • In addition to one or more of the features described herein, the configuring the controller includes the controller obtaining an indication of the processing of the data from a wearer of the atmospheric suit.
  • In addition to one or more of the features described herein, the configuring the controller includes the controller providing information based on the data to a wearer of the atmospheric suit as output to one or more output devices.
  • In addition to one or more of the features described herein, the one or more output devices include audio, video, or haptic output devices.
  • In addition to one or more of the features described herein, one of the one or more sensors is a camera, one of the one or more output devices is a display device, and the configuring the controller includes the controller obtaining images from the camera and providing the information based on the images for display to the wearer on the display device.
  • In addition to one or more of the features described herein, the arranging the network includes disposing microcontrollers corresponding with one or more of the plurality of ports.
  • In addition to one or more of the features described herein, the arranging the network includes configuring redundant communication between two or more of the microcontrollers or between two or more of the ports.
  • In addition to one or more of the features described herein, the method also includes disposing a cover on each of the plurality of ports.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The following descriptions should not be considered limiting in any way. With reference to the accompanying drawings, like elements are numbered alike:
  • FIG. 1 shows an atmospheric suit that includes a dynamic sensor network according to one or more embodiments;
  • FIG. 2 is a block diagram of an exemplary dynamic sensor network according to one or more embodiments; and
  • FIG. 3 is a block diagram of another exemplary dynamic sensor network according to one or more embodiments.
  • DETAILED DESCRIPTION
  • A detailed description of one or more embodiments of the disclosed apparatus and method are presented herein by way of exemplification and not limitation with reference to the Figures.
  • As previously noted, an atmospheric suit maintains a habitable environment for the wearer in different applications. In the exemplary space application, the atmospheric suit may be an EMU. While the atmospheric suit is essential in an otherwise uninhabitable environment, it can be bulky and restrict spatial awareness. For example, unlike a motorcycle helmet or the like, the helmet of the atmospheric suit is fixed such that a wearer moves their head without moving the helmet (i.e., the transparent portion of the helmet). Thus, looking to the side or behind requires moving the body (and, correspondingly, the atmospheric suit) to expose the side or back to the transparent portion of the helmet. In addition, depending on the nature and duration of an extravehicular mission, sensors may be needed for safety or data-gathering. These sensors may be difficult to carry and operate in the atmospheric suit.
  • Embodiments of the systems and methods detailed herein relate to a dynamic sensor network in an atmospheric suit. The network may be structured in a hub and spoke configuration with a controller of the atmospheric suit acting as the hub. Each spoke may lead to a port accessible outside the atmospheric suit, and different sensors may be coupled to the port, as needed. According to alternate or additional embodiments, a battery of the atmospheric suit may act as the hub with the spokes facilitating charging of the sensors.
  • FIG. 1 shows an atmospheric suit 100 that includes a dynamic sensor network 200 (FIG. 2 ) according to one or more embodiments. The exemplary atmospheric suit 100 shown in FIG. 1 is an EMU 105. Systems that are affixed as part of the EMU 105 include a primary life support system (PLSS) 120 and a display and control module (DCM) 130. These systems 120, 130, along with components of the EMU 105, create a habitable environment for a wearer performing extravehicular activity in space. While an EMU and a space application are specifically discussed for explanatory purposes, applications for the controller system architecture according to one or more embodiments may also include underwater (e.g., in an atmospheric diving suit), earth-based (e.g., in a hazmat suit or contamination suit), high-altitude (e.g., in a flight suit), and sub-surface environments. Generally, any suit that includes the helmet to maintain a habitable environment is referred to as an atmospheric suit.
  • The EMU 105 includes a helmet 110, shown with an exemplary in-helmet display as one exemplary output device 115 a and a speaker as another exemplary output device 115 b (generally referred to as output device 115). The helmet 110 has a transparent inner bubble that maintains the environment in the EMU 105, as well as a transparent outer bubble that protects against impacts. The display device may include a screen on a swingarm that allows the screen to be raised to eye level for viewing or may include an organic light emitting diode (OLED) array. An OLED display device may be inside the helmet, with the inner bubble acting as a substrate, or may be in the gap between the inner and outer bubbles, with the outer bubble acting as the substrate. A display device may also be on a swingarm or otherwise affixed on the outside of the helmet 110. According to exemplary embodiments, the EMU 105 may include two or more display devices whose number and location are not intended to be limited by the discussion of exemplary embodiments.
  • The speaker may be inside the inner bubble or may include a diaphragm on the outside of the inner bubble that vibrates to produce an audio output. The numbers, types, and locations of speakers are not intended to be limited by the examples. Further, in addition to audio and visual output devices 115, haptic or combination output devices 115 may be provided in the EMU 105. The numbers, types, and locations of output devices 115 that provide information to the wearer of the EMU 105 are not intended to be limited by the discussion of specific examples.
  • One or more sensors 140 (e.g., video/still camera, infrared camera, proximity sensor, Geiger counter, rangefinder) may dynamically be affixed to the EMU 105. Dynamic refers to the fact that the numbers and positions of sensors 140 may be changed at any time, even during extravehicular activity. Two exemplary sensors 140 are indicated in FIG. 1 . Also indicated is an unused port 155. While not visible, each of the sensors 140 is coupled to the dynamic sensor network 200 via a port 155. As the expanded view indicates, the port 155 may have a cover 150 when unused to prevent dust or other particles from entering the port 155. In FIG. 1 , one sensor 140 (e.g., Geiger counter) is shown affixed to an arm of the EMU 105 and another sensor 140 (e.g., camera) is shown affixed at the hip. As shown, the camera may be angled down so that the path ahead of the EMU 105 may be viewed in real time on a display used as the output device 115 while walking. The dynamic sensor network 200 that facilitates obtaining data from these and other sensors 140 and providing information via one or more output devices 115 is detailed with reference to FIG. 2 .
  • FIG. 2 is a block diagram of an exemplary dynamic sensor network 200 according to one or more embodiments. As previously noted, the dynamic sensor network 200 may be in a spoke and hub arrangement. According to an exemplary embodiment, a controller 210 acts as the hub. The controller 210 may be part of the DCM 130, for example, with one or more processors and memory devices that facilitate obtaining data from one or more of the sensors 140 via communication lines 220 (i.e., spokes) and providing information to one or more output devices 115. The exemplary dynamic sensor network 200 is shown with sensors 140 a through 140 n (e.g., camera, infrared camera, proximity sensor, Geiger counter, rangefinder) while ports 155 a through 155 x are shown. That is, some ports 155 may be unused during a given mission. FIG. 2 also shows output devices 115 a through 115 m (e.g., audio, visual, haptic).
  • For example, one of the sensors 140 may be a camera coupled to a port 155 near the hand of the EMU 105. This camera may be used to see around objects in a cave or the like. The images provided as data over the communication line 220 corresponding to the data port 155 may be projected to an OLED display on the inner bubble of the helmet 110 as the output device 115. As another example, a sensor 140 may be a Geiger counter coupled to one of the ports 155. The radiation readings provided to the controller 210 over the corresponding communication line 220 may be checked by the controller 210 to determine if a threshold value has been crossed. If so, the controller 210 may provide an audible alert to an output device 115 that is a speaker or provide haptic feedback to an output device 115 that implements a vibration.
  • The type of data provided by a given sensor 140 may determine the analysis performed by the controller 210, as well as the output provided to an output device 115. The data may be provided with an identifier or may be recognizable based on the content. The controller 210 may essentially implement a mapping of the processing that is appropriate for each data type. Input from the wearer of the EMU 105, provided via the DCM 130, for example, may affect the processing that is performed by the controller 210. The table below provides exemplary processing that may be performed based on the data obtained by the controller 210. The examples provided for explanatory purposes are not intended to limit additional sensors 140, processing by the controller 210, or additional output devices 115.
  • TABLE 1
    Exemplary inputs and outputs of a dynamic sensor network 200.
    data from sensor 140 | processing performed by controller 210 | output to output device 115
    image data from camera | pass through or any specified image processing | display to one or more display devices
    radiation level from Geiger counter | compare with a predefined threshold value | audio, video, haptic alert
    proximity (distance) from proximity sensor | pass through or compare with a predefined threshold distance | display distance to closest object or provide alert based on a distance below the predefined threshold distance
    range to an object from rangefinder | pass through range | display to one or more display devices
  • According to an alternate or additional embodiment, a battery 230 of the EMU 105 may be a hub. The battery may be part of the PLSS 120, for example. One or more sensors 140 may be powered or charged via power lines 240 from the battery 230 to corresponding ports 155.
  • FIG. 3 is a block diagram of an exemplary dynamic sensor network 300 according to one or more embodiments. FIG. 3 shows additional or alternate features as compared with the exemplary dynamic sensor network 200 shown in FIG. 2 . The fundamental components of the controller 210 and communication lines 220 from various ports 155 to which sensors 140 may couple are shown. As shown in FIG. 3 , each port 155 may include an optional microcontroller 310 a through 310 x (generally referred to as 310). Thus, data from each sensor 140 that is coupled to a port 155 may be routed through a microcontroller 310 via a communication line 220 to the controller 210.
  • Additionally or alternately, redundant wired or wireless communication lines 320 may be included between ports or, more specifically, between microcontrollers 310. That is, sensors 140 may communicate data from a corresponding port 155 to another port 155 for relay to the controller 210 via a communication line 220 from the other port 155. This may be necessitated due to failure of the communication line 220 from the port 155 corresponding with the sensor 140, for example. Alternately, microcontrollers 310 may communicate data obtained from a corresponding sensor 140 to another microcontroller 310. The data may be relayed to the controller 210 or may be combined with data from the sensor 140 corresponding to the other microcontroller 310 prior to being provided to the controller 210, for example.
  • The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, element components, and/or groups thereof.
  • While the present disclosure has been described with reference to an exemplary embodiment or embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the present disclosure. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without departing from the essential scope thereof. Therefore, it is intended that the present disclosure not be limited to the particular embodiment disclosed as the best mode contemplated for carrying out this present disclosure, but that the present disclosure will include all embodiments falling within the scope of the claims.

Claims (20)

1. A system in an atmospheric suit, the system comprising:
a controller within the atmospheric suit;
a first network configured as a first hub and spoke network arranged within the atmospheric suit, wherein the controller is the hub of the first network, each spoke of the first network represents first wiring that leads to one of a plurality of ports accessible from outside the atmospheric suit;
a battery; and
a second network configured as a second hub and spoke network arranged within the atmospheric suit, wherein the battery is the hub of the second network, each spoke of the second network represents second wiring that leads to one of the plurality of ports accessible from outside the atmospheric suit.
2. The system according to claim 1, wherein the controller is configured to obtain data from one or more sensors coupled to one or more of the plurality of ports via the first network, and wherein the battery is configured to facilitate charging of the one or more sensors via the second network.
3. The system according to claim 2, wherein the one or more sensors includes a camera, a proximity sensor, a range finder, or a Geiger counter and the controller is configured to process the data from the one or more sensors to obtain information.
4. The system according to claim 3, wherein a wearer of the atmospheric suit specifies processing of the data.
5. The system according to claim 3, wherein the controller is configured to provide information based on the data to a wearer of the atmospheric suit as output to one or more output devices.
6. The system according to claim 5, wherein the one or more output devices include audio, video, or haptic output devices.
7. The system according to claim 5, wherein one of the one or more sensors is a camera, one of the one or more output devices is a display device, and the controller is configured to obtain images from the camera and provide the information based on the images for display to the wearer on the display device.
8. The system according to claim 1, further comprising microcontrollers corresponding with one or more of the plurality of ports.
9. The system according to claim 8, wherein the first network includes redundant communication between two or more of the microcontrollers or between two or more of the ports.
10. The system according to claim 1, further comprising a cover on each of the plurality of ports.
11. A method of assembling a system in an atmospheric suit, the method comprising:
configuring a controller within the atmospheric suit;
arranging a first network configured as a first hub and spoke network within the atmospheric suit, wherein the controller is the hub of the first network, each spoke of the first network represents first wiring that leads to one of a plurality of ports accessible from outside the atmospheric suit; and
arranging a second network configured as a second hub and spoke network arranged within the atmospheric suit, wherein a battery is the hub of the second network, each spoke of the second network represents second wiring that leads to one of the plurality of ports accessible from outside the atmospheric suit.
12. The method according to claim 11, wherein the configuring the controller includes the controller obtaining data from one or more sensors coupled to one or more of the plurality of ports via the first network, and wherein the battery is configured to facilitate charging of the one or more sensors via the second network.
13. The method according to claim 12, wherein the one or more sensors includes a camera, a proximity sensor, a range finder, or a Geiger counter and the configuring the controller includes the controller processing the data from the one or more sensors to obtain information.
14. The method according to claim 13, wherein the configuring the controller includes the controller obtaining an indication of the processing of the data from a wearer of the atmospheric suit.
15. The method according to claim 13, wherein the configuring the controller includes the controller providing information based on the data to a wearer of the atmospheric suit as output to one or more output devices.
16. The method according to claim 15, wherein the one or more output devices include audio, video, or haptic output devices.
17. The method according to claim 15, wherein one of the one or more sensors is a camera, one of the one or more output devices is a display device, and the configuring the controller includes the controller obtaining images from the camera and providing the information based on the images for display to the wearer on the display device.
18. The method according to claim 11, wherein the arranging the first network includes disposing microcontrollers corresponding with one or more of the plurality of ports.
19. The method according to claim 18, wherein the arranging the first network includes configuring redundant communication between two or more of the microcontrollers or between two or more of the ports.
20. The method according to claim 11, further comprising disposing a cover on each of the plurality of ports.
US17/571,674 2022-01-10 2022-01-10 Dynamic sensor network in atmospheric suit Active US11716386B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/571,674 US11716386B1 (en) 2022-01-10 2022-01-10 Dynamic sensor network in atmospheric suit
EP23150949.8A EP4210342A1 (en) 2022-01-10 2023-01-10 Dynamic sensor network in atmospheric suit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US17/571,674 US11716386B1 (en) 2022-01-10 2022-01-10 Dynamic sensor network in atmospheric suit

Publications (2)

Publication Number Publication Date
US20230224364A1 true US20230224364A1 (en) 2023-07-13
US11716386B1 US11716386B1 (en) 2023-08-01

Family

ID=84901587

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/571,674 Active US11716386B1 (en) 2022-01-10 2022-01-10 Dynamic sensor network in atmospheric suit

Country Status (2)

Country Link
US (1) US11716386B1 (en)
EP (1) EP4210342A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170246961A1 (en) * 2016-02-25 2017-08-31 California Institute Of Technology Adaptive Charging Network using Adaptive Charging Stations for Electric Vehicles
US20200285464A1 (en) * 2017-06-05 2020-09-10 Umajin Inc. Location tracking system and methods
US20210005850A1 (en) * 2014-01-15 2021-01-07 LAT Enterprises, Inc., d/b/a MediPak Energy Systems Wearable and replaceable pouch or skin for holding a portable battery pack
US20210291198A1 (en) * 2020-03-17 2021-09-23 The Boeing Company Releasable dust mitigation covering
US20210337913A1 (en) * 2020-05-01 2021-11-04 Rockwell Collins, Inc. Led information cueing apparatus for spacesuit helmet
US20210386145A1 (en) * 2020-06-11 2021-12-16 Angelina Dong Tactile Feedback Glove

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7190636B1 (en) * 2005-02-25 2007-03-13 Depaola Victor R Diving suit and environmental detecting system
US8692886B2 (en) 2007-10-31 2014-04-08 Timothy James Ennis Multidirectional video capture assembly
US10491819B2 (en) 2017-05-10 2019-11-26 Fotonation Limited Portable system providing augmented vision of surroundings
US20210223557A1 (en) 2019-08-21 2021-07-22 Hypergiant Industries, Inc. Active display helmet

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210005850A1 (en) * 2014-01-15 2021-01-07 LAT Enterprises, Inc., d/b/a MediPak Energy Systems Wearable and replaceable pouch or skin for holding a portable battery pack
US20170246961A1 (en) * 2016-02-25 2017-08-31 California Institute Of Technology Adaptive Charging Network using Adaptive Charging Stations for Electric Vehicles
US20200285464A1 (en) * 2017-06-05 2020-09-10 Umajin Inc. Location tracking system and methods
US20210291198A1 (en) * 2020-03-17 2021-09-23 The Boeing Company Releasable dust mitigation covering
US20210337913A1 (en) * 2020-05-01 2021-11-04 Rockwell Collins, Inc. Led information cueing apparatus for spacesuit helmet
US20210386145A1 (en) * 2020-06-11 2021-12-16 Angelina Dong Tactile Feedback Glove

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mohammed Taj-Eldin (WIRELESS BODY AREA NETWORKS FOR INTRA-SPACESUIT COMMUNICATIONS: MODELING, MEASUREMENTS AND WEARABLE ANTENNAS, MOHAMMED TAJ-ELDIN, KANSAS STATE UNIVERSITY, AN ABSTRACT OF A DISSERTATION, 2015; hereinafter Taj) (Year: 2015) *

Also Published As

Publication number Publication date
EP4210342A1 (en) 2023-07-12
US11716386B1 (en) 2023-08-01

Similar Documents

Publication Publication Date Title
CN106890067B (en) Indoor blind man navigation robot
EP3439322B1 (en) Audio output assembly for a head-mounted display
CN103620527A (en) Headset computer that uses motion and voice commands to control information display and remote devices
DE60313319D1 (en) AIR EDUCATION SYSTEM
CN110360996B (en) Sensor unit, moving object positioning device, and moving object
US20230195212A1 (en) Visual tracking of peripheral devices
US10812896B2 (en) High compliance microspeakers for vibration mitigation in a personal audio device
CN110244461B (en) Augmented reality display system and display method thereof
US20220001552A1 (en) Robot for detecting and saving life in small space
US20180164878A1 (en) Virtual Rigid Framework for Sensor Subsystem
Lopez et al. Robotman: A security robot for human-robot interaction
US11716386B1 (en) Dynamic sensor network in atmospheric suit
GB2535723A (en) Emergency guidance system and method
CA3037657A1 (en) Route guidance and obstacle avoidance system
US7333073B2 (en) System for the remote assistance of an operator during a work stage
WO2016135448A1 (en) Emergency guidance system and method
US20220021999A1 (en) Spatialized audio rendering for head worn audio device in a vehicle
US11533961B1 (en) Multi-functional vehicle helmet
JP6831949B2 (en) Display control system, display control device and display control method
US20230166874A1 (en) Atmospheric suit helmet display and display-based control
KR102436036B1 (en) Method And Apparatus for Searching Survivor
WO2020110293A1 (en) Display control system, display control device, and display control method
US20230362526A1 (en) Water-resistant microphone assembly
WO2023067884A1 (en) Information processing system, information processing method, information processing device, and computer program
Di Capua Augmented reality for space applications

Legal Events

Date Code Title Description
AS Assignment

Owner name: HAMILTON SUNDSTRAND CORPORATION, NORTH CAROLINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TORRALBA, MONICA;HIMMELMANN, ASHLEY ROSE;ROHRIG, JAKE;SIGNING DATES FROM 20220105 TO 20220107;REEL/FRAME:058590/0954

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE