WO2022119659A2 - First responder radio communications system using multi-frequency radios voice operated by a distributed artificial intelligence system virtual personal assistant - Google Patents


Info

Publication number
WO2022119659A2
WO2022119659A2 (PCT/US2021/055872)
Authority
WO
WIPO (PCT)
Prior art keywords
user
radio unit
radio
artificial intelligence
instance
Prior art date
Application number
PCT/US2021/055872
Other languages
French (fr)
Other versions
WO2022119659A9 (en)
WO2022119659A3 (en)
Inventor
Robert Marino
Clive Hohberger
Mark HUNTZINGER
Original Assignee
Jkmaraf, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jkmaraf, Llc filed Critical Jkmaraf, Llc
Publication of WO2022119659A2 publication Critical patent/WO2022119659A2/en
Publication of WO2022119659A9 publication Critical patent/WO2022119659A9/en
Publication of WO2022119659A3 publication Critical patent/WO2022119659A3/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/90Services for handling of emergency or hazardous situations, e.g. earthquake and tsunami warning systems [ETWS]

Abstract

A wireless communications system comprises a computer-controlled base station radio unit and a plurality of computer-controlled user-wearable radio units, each radio unit comprising a plurality of radio subsystems operating at a plurality of different frequencies to provide two-way computer-controlled transmission and receipt of voice and/or data communication between the base station radio unit and the user-wearable radio units; and wherein the base station unit comprises a primary instance of an artificial intelligence software; and each user-wearable radio unit comprises a secondary instance of the artificial intelligence software; and wherein collectively all instances of the artificial intelligence software communicate over the plurality of radio subsystems and comprise a distributed artificial intelligence communication system which assists each user in the performance of their tasks, monitors their location and health, and supports the user when needed through coordination of emergency assistance.

Description

FIRST RESPONDER RADIO COMMUNICATIONS SYSTEM USING MULTI-FREQUENCY RADIOS VOICE OPERATED BY A DISTRIBUTED ARTIFICIAL INTELLIGENCE SYSTEM VIRTUAL PERSONAL ASSISTANT
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to US Provisional Patent Application No. 63/093,954 filed on October 20, 2020, the contents of which are incorporated herein by reference in their entirety.
TECHNICAL FIELD
[0002] The present disclosure is directed to a computer-based artificial intelligence software-based radio communication system with health status monitoring and location tracking intended for use by police, fire, emergency medical and other emergency users.
BACKGROUND
[0003] Emergency radio communications are essential tools of first responders, disaster response, and incident management teams. There is an established benefit to monitoring and predicting user situational factors such as air supply and ambient temperature, and to ensuring regular user status updates on user global positioning system (GPS) location and user basic health vital signs including heart rate, respiration rate and % oxygen saturation (PO2).
[0004] The need for reliable communication of voice, location and health status information is especially acute when the user is trapped or injured. An artificial intelligence system can anticipate when the user is at or trending toward life risk, or needs medical attention or rescue even if the user is injured or unconscious, and issue appropriate communications to incident command and nearby users for rescue assistance.
SUMMARY
[0005] A distributed intelligent integrated radio system, denoted by IRIS, an acronym for Intelligent Radio Integrated System, is described for first responder users including police, fire, emergency medical services, dive teams, SWAT and Hostage Rescue. It is equally applicable to military as well as civilian first response teams.
[0006] The IRIS system is a wireless communications system for use between first responders comprising a base station radio unit and a plurality of user-wearable radio units, with multi-protocol and multi-frequency radio communications for both voice and location/health status communications, and with enhanced emergency communications for each user-wearable radio unit user. The IRIS system may also include an incident command radio unit for communication with the base station radio unit and/or the plurality of user-wearable radio units.
[0007] Once turned on, IRIS radio operation and communications is hands-free and done entirely by voice using an artificial intelligence virtual assistant, which in an exemplary embodiment is referred to herein as “Phoebe”, which is part of a distributed artificial intelligence throughout the entire IRIS radio system. Phoebe brings the familiar usage of a voice virtual assistant such as Apple’s Siri and Amazon’s Alexa in a form optimized for use with an emergency first responder radio.
[0008] IRIS introduces artificial intelligence-based voice control of all aspects of radio communication and information distribution, eliminating the need for mechanical operation of the radio. An artificial intelligence-based voice virtual personal assistant, also vocally addressed as Phoebe, in each user-wearable radio unit assists in communications management and coordination with the primary artificial intelligence instance in the base station radio unit. Through the primary artificial intelligence instance, each secondary artificial intelligence instance in the user-wearable radio units of other users, and the artificial intelligence instances of the supporting infrastructure, such as an incident command artificial intelligence instance, may be connected for voice and data communication.
[0009] Each user-wearable radio unit user regularly provides both its location and health status to the base station radio unit. GNSS-2 based GPS location tracking, enhanced with personal inertial navigation systems (INS) for dead-reckoning GPS position updates, provides information to the base station including continual 3-dimensional location estimation at meter to sub-meter accuracy. Biometric sensors continuously record key vital sign indicators such as heart and respiration rates, heart rhythm, core body temperature and % blood oxygen saturation (PO2), and IRIS regularly communicates to the base station radio unit each user-wearable radio unit user’s health status and artificial intelligence-identified dangerous health parameter trends.
[0010] Phoebe supports voice and/or data communications, tracks each user’s current or last known location, and ensures regular status updates on each user’s location, situational parameters and basic health signs. Phoebe anticipates or identifies when the responder is at or trending toward life or medical risk even if the user is injured or unconscious, and issues appropriate warnings to incident command and communicates with nearby users for rescue assistance.
[0011] In one embodiment, a wireless communications system is disclosed comprising a computer-controlled base station radio unit and a plurality of computer-controlled user-wearable radio units, each radio unit comprising a plurality of radio subsystems operating at a plurality of different frequencies to provide two-way computer-controlled transmission and receipt of voice and/or data communication between the base station radio unit and the user-wearable radio units, and wherein the base station unit comprises a primary instance of an artificial intelligence software; and each user-wearable radio unit comprises a secondary instance of the artificial intelligence software, and wherein collectively all instances of the artificial intelligence software communicate over the plurality of radio subsystems and comprise a distributed artificial intelligence communication system, referred to herein as “Phoebe.”
[0012] In one embodiment, a wireless communications system is disclosed comprising a computer-controlled base station radio unit and a plurality of computer-controlled user-wearable radio units, each radio unit comprising 3 or more radio subsystems capable of simultaneous operation at a plurality of different frequencies, the radio units configured for two-way computer-controlled transmission and receipt of voice and/or data communication between them.
BRIEF DESCRIPTION OF THE FIGURES
[0013] The following detailed description of specific embodiments of the present disclosure can be best understood when read in conjunction with the following drawings, where like structure is indicated with like reference numerals and in which:
[0014] Fig. 1 shows a block diagram of the complete IRIS radio system.
[0015] Fig. 2 shows a block diagram of a user-wearable IRIS radio unit.
[0016] Figs. 3a-3b show schematic diagrams of mesh network radio communications.
[0017] Fig. 4 shows a block diagram of the digital voice compaction process.
[0018] Fig. 5 shows a modified US ASCII 7-bit encoding table used in voice compaction.
DETAILED DESCRIPTION
Overview
[0019] Referring to FIGS. 1-5, the present application is directed to an intelligent radio system primarily intended for use by police, fire, emergency medical and other civilian or military first response teams.
[0020] Wikipedia defines artificial intelligence (AI) as “intelligence demonstrated by machines, as opposed to the natural intelligence displayed by animals including humans.” Leading AI textbooks define the field as the study of “intelligent agents”: “any system that perceives its environment and takes actions that maximize its chance of achieving its goals.” This is achieved through application-specific artificial intelligence software executed by a computer or microprocessor. Distributed artificial intelligence is when the artificial intelligence software function is spread across many linked computers or microprocessors, each computer artificial intelligence application providing some part of a collective artificial intelligence result.
[0021] A distributed intelligent integrated radio system, denoted by IRIS, an acronym for Intelligent Radio Integrated System, is described for use by a suitable group of users 5, particularly a group of first responders.
[0022] The base station radio unit 10 is also connected wirelessly to a plurality of identical IRIS user-wearable radio units 100 (denoted as 1...N) each worn by individual users 5 who are members of that team and may also be denoted as users 5 - 1-N. For schematic purposes, only the IRIS user-wearable radio units 100 of user 5 - 1 and user 5 - N are shown. The IRIS user-wearable radio unit 100 of each user 5 is configured to provide two-way wireless communication to every other IRIS user-wearable radio unit 100 of every other user 5 either directly, or indirectly as relayed through one or more other user-wearable radio units 100 or the base station radio unit 10.
[0023] FIG. 1 shows a block diagram of the wireless communications system 1 or IRIS radio system 1 for a group or team of users 5 . It is comprised of a base station radio unit 10 configured for two-way computer-controlled transmission and receipt of voice and/or data communication to an incident command radio unit 50 and support infrastructure teams of users 5 such as pumper truck team 80 and medical support team 90. Base station radio unit 10 also is configured to communicate through a wired or wireless connection to a group of offsite computers and associated data processing and storage devices configured to receive, process and store third party communication, such as those communications described herein, that is referred to herein as the cloud 40 for secondary database processing and archiving of any and/or all communications of the IRIS radio system 1.
[0024] IRIS is a wireless communications system 1 comprising a computer-controlled base station radio unit 10 and a plurality of computer-controlled user- wearable radio units 100, each radio unit 10, 50, 100 comprising 3 or more radio subsystems capable of simultaneous operation at a plurality of different frequencies; the radio units configured for two-way computer- controlled transmission and receipt of voice and/or data communication between them.
[0025] The base station radio unit 10 is also connected wirelessly to a plurality of identical IRIS user-wearable radio units 100 (denoted as users 5 1...N), each worn by individual users 5 who are members of that user group 5. For schematic purposes, only the IRIS user-wearable radio units of user 5 - 1 and user 5 - N are shown. The IRIS user-wearable radio unit 100 of each user 5 has wireless connectivity 60 to the IRIS user-wearable radio unit 100 of every other user 5 either directly, or indirectly as relayed through one or more other user-wearable radio units 100 or the base station radio unit 10. The base station radio unit 10 may also be connected wirelessly to an incident command radio unit. The base station radio unit 10 is generally used to direct or coordinate a plurality of users 5, each having a user-wearable radio unit 100, as a team associated with a particular incident or call, such as the response of a group of firefighters or paramedics from one location (e.g., firehouse) to a particular address or location to address a fire, health emergency, traffic accident, or other similar incident. The incident command radio unit 50 generally is used to direct or coordinate a plurality of incidents, or one large incident, that requires the coordination of users 5 from a plurality of teams, which might include a plurality of teams of users 5 from different precincts or locations within one municipality, or a plurality of teams from different municipalities or other governmental units.
[0026] In one embodiment, a wireless communications system 1 comprises a computer- controlled base station radio unit 10 and a plurality of computer-controlled user-wearable radio units 100, each radio unit 10, 100 comprising a plurality of radio subsystems 20, 120 each operating at a plurality of different radio frequencies to provide two-way computer-controlled transmission and receipt of voice and/or data communication between the base station radio unit 10 and the user-wearable radio units 100. The base station unit 10 comprises a primary instance 30 of an artificial intelligence software 15; and each user-wearable radio unit 100 comprises a secondary instance 110 of the artificial intelligence software 15, and wherein collectively all instances 30, 110 of the artificial intelligence software 15 communicate over the plurality of radio subsystems 25, 120 and comprise a distributed artificial intelligence communication system 15.
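By way of illustration only, the distributed arrangement described above (a primary Phoebe instance in the base station radio unit 10 and a secondary instance in each user-wearable radio unit 100, each unit carrying several radio subsystems) can be pictured as a simple data model. The Python class and field names below are hypothetical and are not drawn from the specification; this is a minimal sketch of the relationships, not an implementation of the IRIS system.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RadioSubsystem:
    name: str      # e.g. "FirstNet cellular", "802.15.4 mesh", "LF emergency"
    band: str      # operating frequency range

@dataclass
class PhoebeInstance:
    role: str      # "primary" (base station) or "secondary" (user-wearable unit)
    unit_id: str

@dataclass
class RadioUnit:
    unit_id: str
    instance: PhoebeInstance
    subsystems: List[RadioSubsystem] = field(default_factory=list)

# Base station radio unit 10 carrying the primary instance 30
base_station = RadioUnit(
    "base-10",
    PhoebeInstance("primary", "base-10"),
    [RadioSubsystem("FirstNet cellular", "758-805 MHz"),
     RadioSubsystem("802.15.4 mesh", "2400-2484 MHz"),
     RadioSubsystem("LF emergency", "10-300 kHz")],
)

# One of N user-wearable radio units 100, each carrying a secondary instance 110
wearable_1 = RadioUnit(
    "wearable-1",
    PhoebeInstance("secondary", "wearable-1"),
    list(base_station.subsystems),
)
```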
The Phoebe Virtual Personal Assistant
[0027] Each user- wearable IRIS radio 100 is a hands-free, voice-response intelligent communications system in which no buttons have to be pushed, nor switches set, no knobs turned nor displays consulted to operate any aspect or feature of the radio system, aside from a master ON-OFF switch. Knobs, switches and handheld displays are difficult to operate using heavy gloves. Often first responders may be unable to operate mechanical radio controls because they have both hands full; may be on the move; or may be physically immobilized due to an injury or accident.
[0028] Once turned on, IRIS radio operation and communications is entirely done by voice using an artificial intelligence virtual assistant, Phoebe, which is part of a distributed artificial intelligence throughout the entire IRIS radio system. Phoebe is the name for a virtual personal assistant described herein that comprises a portion of a distributed artificial intelligence software system 15 which is distributed throughout the entire IRIS radio system 1 by incorporation into every radio unit 2 of the system. Every implementation or version of Phoebe installed in each radio unit 2 is called an instance. The base station radio unit 10 comprises a plurality of major subsystems, including base station radio communications management subsystem 20 that comprises a plurality of base station radio subsystems 25 operating at a plurality of different radio frequencies to provide two-way computer-controlled transmission and receipt of voice and/or data communication between the base station radio unit 10 and other radio units 50, 100, and base station radio unit 10 primary artificial intelligence software (Phoebe) instance 30. The base station radio communications management subsystem 20 is computer-controlled by one or more base station computer 35 that is also used to execute the base station radio unit 10 primary artificial intelligence software (Phoebe) instance 30. The base station computer 35 may comprise any suitable computing device configured to control base station radio communications management subsystem 20 and radio communications of the base station radio subsystems 25 and execute the primary artificial intelligence software (Phoebe) instance 30, including computing devices similar to those utilized in smartphones, tablet computers, and laptop computers. The primary artificial intelligence software (Phoebe) instance 30 may comprise any suitable primary personal virtual assistant 32, and may include Siri by Apple Inc., Alexa by Amazon.com, Inc., or Cortana by Microsoft Corporation, or adaptations thereof.
[0029] Each user-wearable radio unit 100 comprises a plurality of major subsystems, including user-wearable radio communications management subsystem 105 that comprises a plurality of user-wearable radio subsystems 115 operating at a plurality of different radio frequencies to provide two-way computer-controlled transmission and receipt of voice and/or data communication between the base station radio unit 10 and other radio units 50, 100, and user-wearable radio unit 100 secondary artificial intelligence software (Phoebe) instance 110. The user-wearable radio communications management subsystem 120 is computer-controlled by one or more computer 135 that is also used to execute the user-wearable radio unit 100 secondary artificial intelligence software (Phoebe) instance 110. The computer 135 may comprise any suitable computing device configured to control user-wearable radio communications management subsystem 120 and radio communications of the user-wearable radio subsystems 125 and execute the secondary artificial intelligence software (Phoebe) instance 110, including computing devices similar to those utilized in smartphones, tablet computers, and laptop computers. The secondary artificial intelligence software (Phoebe) instance 110 may comprise any suitable secondary personal virtual assistant 112, and may include Siri by Apple Inc., Alexa by Amazon.com, Inc., or Cortana by Microsoft Corporation, or adaptations thereof.
[0030] Each incident command radio unit 50 comprises a plurality of major subsystems, including incident command radio communications management subsystem 55 that comprises a plurality of incident command radio subsystems 60 operating at a plurality of different radio frequencies to provide two-way computer-controlled transmission and receipt of voice and/or data communication between the incident command radio unit 50 and other radio units 10, 100, and incident command radio unit 50 incident command artificial intelligence software (Phoebe) instance 70. The incident command radio communications management subsystem 55 is computer-controlled by one or more computer 75 that is also used to execute the incident command radio unit 50 incident command artificial intelligence software (Phoebe) instance 85. The computer 75 may comprise any suitable computing device configured to control incident command radio communications management subsystem 55 and radio communications of the incident command radio subsystems 60 and execute the incident command artificial intelligence software (Phoebe) instance 85, including computing devices similar to those utilized in smartphones, tablet computers, and laptop computers. The incident command artificial intelligence software (Phoebe) instance 85 may comprise any suitable incident command personal virtual assistant 87, and may include Siri by Apple Inc., Alexa by Amazon.com, Inc., or Cortana by Microsoft Corporation, or adaptations thereof. The elements of the base radio unit 10, the user-wearable radio unit 100, and the incident command radio unit 50 may comprise the same elements, or different elements, in whole or in part.
[0031] The name Phoebe was selected because it is a rare but recognized name from Greek mythology composed of two strong and easily recognized voice phonemes, “Fee Bee”, notwithstanding, any suitable name may be given to the virtual personal assistant comprising a portion of the distributed artificial intelligence software system 15 described herein. Similar to Siri and Alexa, Phoebe is a portion of the distributed artificial intelligence software system that behaves like a single entity to each user- wearable radio unit 100, which is a node in the IRIS radio system 1. Every version of Phoebe installed in each IRIS radio unit 10, 50, 100 is referred to herein as an “instance”. Each IRIS radio unit 10, 50, 100 has a corresponding Phoebe instance 30, 85, 110, comprising a personal virtual assistant addressed by voice as “Phoebe”. These instances may be the same as or different from one another. For example, they may present themselves the same to a user 5 using the same voiceprint and same standard greetings or responses, but may provide different information to a user 5 appropriate to the user 5 context and role, such as, for example, presenting different information and/or options to the user 5 of a user- wearable radio unit 100 that is directly responding to an incident versus a base station user 5 that is controlling the response to the incident to which the user 5 of the user-wearable radio unit 100 is responding versus an incident command user 5 that is controlling responses to the incident to which the users 5 of the user-wearable radio unit 100 and base station radio unit 10 are responding as well as simultaneous responses to other incidents. The Phoebe instance 110 portion of artificial intelligence software 15 is in a form optimized for use with IRIS user system 1 radio communications with emergency responders as users 5. Phoebe instance 110 also provides each user 5 with the ability to use Phoebe to reconfigure on-demand a local chat group to help focus on and solve their current problem or problems associated with an incident, and use the respective Phoebe instances 30, 85 to seek information from and convey information to the base radio unit 10 and incident command radio unit 50 and their respective users 5. Phoebe instances 30, 85, 110 portions of artificial intelligence software 15 will then coordinate all emergency communications management with all relevant users 5.
[0032] Referring to FIG. 2, all voice messaging to and from each user 5 through helmet headset system 120 goes directly to the Phoebe artificial intelligence instance 110 coupled to IRIS device controller 260 in IRIS radio 100. Voice communication that is intended for any of the radio links is passed through IRIS device controller 260 to the communications radio controller 200, which is configured to determine which voice channels are active and the assigned active users 5 of a particular user’s 5 current “chat group”.
[0033] The concept of a chat group is created to enable but also restrict normal voice conversation between only those users 5 working on a common task, such as a fire hose team; EMT team, diver team or police partners. A chat group is task-oriented, often short-lived and dynamically reconfigurable by each user 5 that is a member of the chat group through their respective Phoebe instance 110 and user 5 secondary virtual personal assistant 112 as they move between tasks, and may also include a user 5 that is at the base station radio unit 10 or incident command radio unit 50, respectively. User 5 voice communications are assumed to be shared only with the active chat group. Phoebe instances 30, 85, 110 also provide each user 5 with the ability to reconfigure on demand the local chat group to optimize around the incident they are trying to address, and to seek information from and convey information to the base station radio unit 10 and incident command radio unit 50. However, each user 5 at any time may direct their respective Phoebe instance 30, 85, 110 to send a specific voice message elsewhere, including to another selected IRIS system 1 user 5; the base station radio unit 10 user 5, or the incident command radio unit 50 user 5.
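For illustration, the chat group bookkeeping described above can be pictured as a small, dynamically editable membership set. The Python names below are hypothetical, and the sketch omits everything about the underlying radio transport.

```python
class ChatGroup:
    """Task-oriented, dynamically reconfigurable voice group (illustrative sketch)."""

    def __init__(self, task: str, members: set[str]):
        self.task = task
        self.members = set(members)      # user IDs whose radios share voice traffic

    def add(self, user_id: str) -> None:
        # e.g. the base station or incident command breaking in during an emergency
        self.members.add(user_id)

    def remove(self, user_id: str) -> None:
        self.members.discard(user_id)

    def recipients(self, sender: str, direct_to: str | None = None) -> set[str]:
        # Normal traffic stays inside the chat group; a user may instead direct a
        # specific message elsewhere (another user, base station, incident command).
        if direct_to is not None:
            return {direct_to}
        return self.members - {sender}

# A fire hose team chat group; the base station is later added for an emergency.
hose_team = ChatGroup("hose line 1", {"user-1", "user-2", "user-3"})
hose_team.add("base-10")
print(hose_team.recipients("user-2"))
```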
[0034] Occasionally, as in emergencies or rescue operations, the base station radio unit 10 and incident command center 50 need to become part of a user’s chat group. The chat group voice- controlled feature managed by Phoebe instances 110 and users 5 of the chat group reduce extraneous voice communication created by and received by users 5, while allowing easy break- in for important and emergency messages from any user 5.
[0035] The base station radio unit 10 comprises a plurality of major subsystems, including radio communications management 20 and primary Phoebe instance 30. The base station radio unit 10 has the optional ability to utilize cloud 40 computing and storage services for processing, logging, storage and sharing of all data acquired during the incident. Other computing services of the base station radio unit 10 and cloud 40 may provide sophisticated artificial intelligence analysis and display of acquired and logged user operational and health status data, as needed by users 5 such as team leaders, incident commanders and emergency service personnel.
[0036] In one embodiment of the IRIS wireless communications system 1 each secondary instance 110 of the Phoebe portion of artificial intelligence software 15 in each user-wearable radio unit 100 comprises a secondary virtual personal assistant 112, and the primary instance 30 of the Phoebe portion of the artificial intelligence software 15 in the base station radio unit 10 comprises a primary virtual personal assistant 32, and the primary virtual personal assistant and each secondary virtual personal assistant are configured to provide the respective users 5 with a voice command mode and a voice response mode for operation of the respective radio units 10,100. The voice command mode enables the user 5 to use his/her voice to interface with IRIS wireless communications system 1 to initiate instructions, commands, information requests, or otherwise provide a command input to the system through a conventional headset or other microphone. The voice response mode enables the user 5 to use his/her voice to interface with IRIS wireless communications system 1 to acknowledge receipt of instructions, commands, answer information requests, or otherwise provide a voice response to the system through a conventional headset or other microphone.
[0037] In one embodiment of the IRIS wireless communications system 1 the secondary instance 110 and secondary virtual personal assistant 112 in each user-wearable radio unit 100 is configured to provide direct two-way computer-controlled voice and/or data communication with the primary instance 30 and primary virtual personal assistant 32 of the base station radio unit 10.
[0038] In one embodiment of the IRIS wireless communications system 1 the secondary Phoebe instance 110 and secondary virtual personal assistant 112 that is configured to provide two-way voice and/or data communication with the primary virtual assistant 32 of the base station radio unit is configured as a secondary radio relay for two-way voice and/or data communication between the primary virtual assistant 32 and another secondary virtual assistant 112. In this embodiment, the user-wearable radio unit 100 and the secondary Phoebe instance 110 and secondary virtual personal assistant 112 that is used as a secondary radio relay acts as a pass-through device to relay two-way communications from the base station radio unit 10 to a user-wearable radio unit 100 that may be out of range of the base station radio unit or otherwise blocked from receiving radio signals being broadcast directly from the base station radio unit. In one embodiment, the secondary radio relay may relay communications to the other secondary virtual assistant 112 only, and in another embodiment may also enable the user 5 of the user-wearable radio unit 100 acting as the secondary radio relay to also receive the voice and/or data communication being relayed.
[0039] In one embodiment of the IRIS wireless communications system 1, the system further comprises an incident command radio unit 50 comprising an incident command Phoebe instance 85 of the artificial intelligence software 15 and comprising an incident command virtual personal assistant 87 that is configured to provide two-way voice and/or data communication with the primary virtual personal assistant 32. In this embodiment the secondary virtual personal assistant 112 in each user-wearable radio unit 100 is configured to provide two-way voice and/or data communication with the incident command radio unit 50. In this embodiment the primary virtual personal assistant 32 in the base station radio unit 10 is configured as a primary radio relay for two-way voice and/or data communication between the secondary virtual personal assistant 112 and the incident command virtual personal assistant 87.
[0040] In one embodiment of the IRIS wireless communications system 1 the secondary virtual personal assistant 112 of one user-wearable radio unit 100 is configured to provide two-way voice and/or data communication with the incident command virtual personal assistant 87 of the incident command radio unit 50, using at least one other secondary virtual personal assistant 112 and the respective user-wearable radio unit 100 as a secondary radio relay and the primary virtual personal assistant 32 of the base station radio unit 10 as the primary radio relay.
[0041] In one embodiment the base station radio unit 10 Phoebe artificial intelligence instance 30 is a different, more advanced artificial intelligence system than the secondary Phoebe artificial intelligence instance 110 in each user-wearable IRIS radio system 100, and has different and/or additional artificial intelligence functions. The base station radio Phoebe instance 30 is also configured for and capable of communicating with the incident command Phoebe instance 87 of the incident command radio unit 50 that is configured to communicate with other users 5 with user-wearable radio units 100 having secondary Phoebe instances 112 that comprise additional support infrastructure (e.g., fire equipment teams, SWAT teams, medical services teams and search and rescue boat teams or helicopter teams) to support the operations of and rescue individual users 5 located at one incident. For example, the base station radio unit 10 contains the primary instance of the artificial intelligence software, and provides several functions as described below. One function of the primary Phoebe instance 32 of the artificial intelligence software 15 within the base station radio unit 10 is two-way voice and/or data communication with any of the secondary Phoebe instances 112 of the artificial intelligence software 15 of the user-wearable radio units 100. Another function of the primary instance 32 of the artificial intelligence software 15 within the base station radio unit 10 is two-way voice and/or data communication with any selected group of the plurality of secondary Phoebe instances 112 of the artificial intelligence software 15 in the user-wearable radio units 100. Another function is the receipt and archiving of all communications of the base station radio unit 10 with each user-wearable radio unit 100 user 5, including those relayed through the base station radio unit 10 with the user-wearable radio units 100 of all first responders/users 5 providing support functions and the incident command radio unit 50 and the users 5 in command of the incident. Another function is the coordination and archiving of action support requests from each user-wearable radio unit 100 user 5 to other first responder/user 5 support functions (e.g., firefighting teams, SWAT teams, diver support teams, EMT and medical evacuation teams) and return of action service updates to the requesting user 5. Another function is the formation of multiple chat groups comprising one or more other selected user-wearable radio unit 100 users 5, then relay and archiving of all group messages through the base station radio unit 10 with the intended group of users 5.
User-Wearable IRIS Radio Subsystems
[0042] FIG. 2 shows a block diagram of a user-wearable IRIS radio unit 100, which may also be referred to as a user-wearable IRIS radio system 100. In one embodiment, user-wearable IRIS radio unit 100 comprises several user-wearable radio subsystems 25 including: a plurality of radio subsystems 220, 240 and 250 operating at different radio frequencies and protocols and an optional legacy radio system 230, all controlled by IRIS radio controller 260; the smart antenna subsystem 270 for all the VHF/UHF radio systems; and the GNSS-2 (dual channel L2-L5) enhanced GPS radiolocation system 210. Each user-wearable IRIS radio unit 100 is voice operated through the secondary virtual personal assistant 112 Phoebe using the user helmet headset system 120. In one embodiment the user-wearable IRIS radio unit 100 comprises 3 or more user-wearable radio subsystems 25 capable of simultaneous operation at a plurality of different radio frequencies, with a plurality of user-wearable radio units 100 configured for two-way computer-controlled transmission and receipt of voice and/or data communication between them.
[0043] The user-wearable IRIS radio unit 100 uses a plurality of user-wearable radio subsystems 25 operating at different frequencies and different communications protocols, but all user-wearable radio subsystems 25 provide at least limited voice, user location and health status data communications. Multiple radio frequencies and digital communications protocols are utilized within the user-wearable IRIS radio unit 100 to enable dynamic reconfiguration of the IRIS radio unit 100 to maintain communication with other radio units 2 as the unit undergoes changes in radio signal propagation due to motion of the user 5 through the incident site. Dynamic reconfiguration of the user-wearable radio subsystems 25 of each user-wearable IRIS radio unit 100 is accomplished by its secondary Phoebe instance 110 and ensures that voice and/or data communication is maintained with base station radio unit 10. The objective is that users’ physical location and health status are continuously available to the base station radio unit 10, and available for both the local team commander and the incident command leadership. Use of multiple frequencies enables reliable communication to be maintained despite radio signal propagation issues which occur particularly in urban canyon environments and in multi-story buildings, particularly in basements and subbasements.
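One way to picture the dynamic reconfiguration described above is as an ordered fallback across whatever radio subsystems currently have a usable link. The priority order, names, and interface below are illustrative assumptions, not the reconfiguration logic of the specification.

```python
from typing import Protocol

class Radio(Protocol):
    def link_available(self) -> bool: ...
    def transmit(self, payload: bytes) -> None: ...

def send_with_fallback(payload: bytes, radios: dict[str, Radio]) -> str:
    """Try each radio subsystem in an assumed priority order until one delivers."""
    for name in ("firstnet_cellular", "mesh_802_15_4", "lf_emergency"):
        radio = radios.get(name)
        if radio is not None and radio.link_available():
            radio.transmit(payload)
            return name          # report which subsystem carried the message
    raise RuntimeError("no radio subsystem currently has a usable link")
```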
[0044] The first IRIS primary radio system is a FirstNet cellular radio 220 operating both in the cellular bands and in the special reserved FirstNet Authority 758-805 MHz range (Band 14). FirstNet cellular radio 220 is the primary channel for voice communication and also provides data communications with each user. As of August 2021, over 12,000 public safety agencies and organizations including FEMA and FBI utilize FirstNet, and AT&T provides 4G LTE coverage over 77% of the continental United States including 99% of the population. AT&T also provides emergency cellular access points for deployment on site at incidents, using both ground-based and air-based temporary cells.
UHF Mesh Network Radio System
[0045] One primary IRIS radio subsystem is configured to operate at UHF frequencies utilizing the IEEE 802.15.4 Mesh Protocol (different from the 802.11 WiFi Protocol) using a digital packet communications protocol. As used with IRIS, data is transmitted as 102-byte messages including GF(103) Reed-Solomon error-correction codewords. The mesh radio subsystem 240 is utilized as a primary data network to provide data to the primary Phoebe instance 30 comprising the user 5 status, including: GNSS-2 enhanced dual band L2-L5 GPS 210 location data, as augmented by the continuous inertial navigation system 280 to update the last known position; and biometric sensor 130 health status information including heart rate, respiratory rate, heart rhythm, core body temperature, and %PO2 blood oxygen saturation levels from health status and environmental assessment system 170. The IRIS mesh radio subsystem 240 is also utilized as a primary data network to provide data to the primary Phoebe instance 30 comprising the user 5 situational status, including physical sensor information 140 such as remaining air in the breathing system cylinder, ambient temperature of the environment and optionally ambient oxygen, as well as any toxic or explosive gas levels as processed by health status & environmental assessment system 170.
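The location, inertial, biometric and environmental values listed above fit comfortably in a single small mesh payload. The fixed field layout below is a hypothetical sketch, constrained only by the roughly 102-byte packet size mentioned in this section; it is not the packet format of the specification.

```python
import struct

# Hypothetical status payload layout (well under the ~102-byte mesh packet size):
# latitude, longitude (deg); elevation (m); INS offsets dx, dy, dz (m);
# heart rate (bpm); respiration rate (/min); core temperature (0.1 degC);
# SpO2 (%); remaining air (bar); ambient temperature (degC).
STATUS_FORMAT = "<ddffffBBHBHh"

def pack_status(lat, lon, elev, dx, dy, dz, hr, rr, core_c, spo2, air_bar, amb_c):
    return struct.pack(STATUS_FORMAT, lat, lon, elev, dx, dy, dz,
                       hr, rr, int(core_c * 10), spo2, air_bar, amb_c)

payload = pack_status(41.8781250, -87.6297990, 182.3, 0.4, -1.2, 3.0,
                      112, 22, 37.6, 96, 185, 64)
assert len(payload) <= 102    # leaves room for error-correction codewords
```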
[0046] FIG 3a contains a schematic diagram of a mesh network 305. A network manager 310 connects to a host application 300 in the base station radio unit 10. Each user 5 mote connects to multiple other motes via a dynamic radio link initially established when the network is activated. Communication in the IRIS mesh network 305 is performed using a Time Slotted Channel Hopping (TSCH) link layer managed by the network manager 310. Time in the network is organized into timeslots, synchronized to within tens of microseconds, to enable collision-free packet exchange among many IRIS radio units 2 and per-transmission UHF channel-hopping.
[0047] FIG 3b contains a different view of part of FIG. 3a. The network manager internal mote 315 initially directly links to a user mote 320 (Mote 1); mote 330 (Mote 2); mote 340 (Mote 3); and through them indirectly to the rest of the user motes in the IRIS mesh network 305. Every network device has one or more parent connections. This parent relationship is more easily visualized in FIG. 3b. The mote 340 (Mote 3) has motes 320 (Mote 1) and mote 330 (Mote 2) as parents, which provide redundant paths to mote 340 to overcome communications interruption due to interference, physical obstruction or multi-path fading.
[0048] The mesh network 305 is self-organizing to dynamically provide communication links not only with the base station unit 10 but as relayed messages (hops) through 1 or more IRIS radio units 2 of users 5, similar to Internet routing. For example, suppose the direct link for packet transmission between mote 340 and the network manager 310 fails, as well as the relay link via mote 320. In the next re-transmission to mote 340, the network manager may try a path through mote 330 and/or a different RF channel.
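The retry behaviour in this example can be sketched as each mote trying its redundant parent paths, hopping channel on each attempt, until a transmission succeeds. The data structures and the transmit callback below are assumptions for illustration, not the 802.15.4/TSCH implementation.

```python
import random

# Each mote keeps redundant parent links toward the network manager (cf. FIG. 3b).
PARENTS = {
    "mote-3": ["manager", "mote-1", "mote-2"],   # direct link plus two relay paths
    "mote-1": ["manager"],
    "mote-2": ["manager"],
}
CHANNELS = list(range(16))    # candidate channels for per-attempt channel hopping

def forward(packet: bytes, source: str, attempt_tx) -> str:
    """Try each parent path, hopping channel per attempt, until one succeeds."""
    for parent in PARENTS[source]:
        channel = random.choice(CHANNELS)
        if attempt_tx(packet, parent, channel):   # caller-supplied transmit primitive
            return parent
    raise RuntimeError(f"{source}: all parent paths failed on this attempt")

# Example: the direct link to the manager fails, but the relay via mote-1 succeeds.
usable = {"mote-1"}
print("delivered via", forward(b"status", "mote-3", lambda p, parent, ch: parent in usable))
```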
[0049] In one embodiment of the IRIS wireless communications system 1 one secondary instance 110 of the artificial intelligence software 15 in the user-wearable radio unit 100 may additionally provide two-way computer-controlled transmission and receipt of voice and/or data communication with at least one other secondary instance 110 of the artificial intelligence software 15 in at least one other user-wearable radio unit 100.
[0050] Dynamic reorganization continues until successful transmission is achieved. A message may be relayed from, to or through each IRIS user- wearable radio unit 100 transparently through multiple hops as the users 5 move about the incident site. Note that the incident site itself may be changing as often occurs during a fire, flood, earthquake, avalanche, tornado, hurricane, or other natural or man-made disaster.
[0051] A secondary Phoebe instance 110 of the artificial intelligence software 15 in the user-wearable radio unit 100 may additionally provide two-way computer-controlled transmission and receipt of voice and/or data communication with at least one other secondary Phoebe instance 110 of the artificial intelligence software 15 in at least one other user-wearable radio unit 100. Mesh network two-way voice and/or data radio communications between a first user-wearable radio unit 100 and a plurality of other user-wearable radio units 100 are provided when the secondary virtual personal assistant of the one respective user-wearable radio unit is configured to provide two-way voice and/or data communication with the secondary virtual assistants of the plurality of other respective user-wearable radio units, each acting as a mesh link in a mesh radio relay.
[0052] During an emergency each IRIS UHF mesh network radio subsystem 240 also serves as a unique UHF radio beacon to assist in location of the user 5 wearing it and is capable of carrying digitally compacted voice communications in an emergency.
Low Frequency (10 kHz to 300 kHz) Radio System
[0053] In one embodiment, the IRIS radio subsystem 25 comprises a Low-Frequency (LF) radio subsystem 250 comprising a software defined radio. This LF radio subsystem 250 is configured to operate at a frequency between 10 kHz and 300 kHz utilizing a modulation method selected from the group consisting of amplitude modulation, single sideband modulation, phase modulation, frequency modulation, pulse modulation, or any combination thereof.
[0054] The use of LF for emergency radio communications has been proven effective in caves, mines, and undersea communication. As an example, the 87 kHz HeyPhone single-sideband (SSB) voice communications developed by the British Cave Rescue Council was the primary communications technology used in the 2018 rescue of a boys’ soccer team inside a large cave complex in Thailand.
[0055] LF radio frequencies are particularly effective for communicating over large area warehouses or high-rise buildings, which electrically resemble above-ground caves. Both VHF (<10 m wavelength) and UHF (<1 m wavelength) radio signals propagate as a far-field (primarily electric-field) sky wave, primarily along a line of sight. In contrast, LF radio signals, because of their extremely long wavelength (3500 m at 87 kHz), propagate as a near-field (primarily magnetic-field) groundwave propagating around obstacles, and penetrating deep into both earth and fresh and salt water, including multiple building subbasements and underground transit tunnels. The approximately 3450 m wavelength of 87 kHz LF radio requires that RF propagation be through a magnetic loop coupling, typically using either an air coil or a ferrite rod coil with high RF currents. In IRIS radio units 2, including user-wearable radio units 100, the LF antenna is separate from smart multiband antenna system 270.
[0056] Therefore, an LF radio provides an emergency communications supplement to normal VHF/UHF first responder communications. Because of its high transmit power consumption due to magnetic field propagation, LF radio 250 is used primarily for emergency communications with endangered responders inside structures. LF radio subsystem 250 is also a backup to the UHF GHz mesh network radio subsystem 240 for sending location and health and environmental status information messages, particularly from user- wearable radio units 100 to the base station radio unit 10. LF radio subsystem 250 is capable of also bidirectionally carrying emergency compacted voice communications.
[0057] Unlike the 87 kHz HeyPhone cave rescue radio using analog Single-Sideband (SSB) voice transmission, the LF radio subsystem 250 is a software-defined digital radio (SDR) using a digital packet protocol for communication of both data and compacted speech. Reed-Solomon Error Correction is used in each packet to ensure digital data recovery and avoid packet retransmissions to the largest extent possible.
[0058] Radio transceivers today are both analog and digital in nature, some using the well-known Software Defined Radio (SDR) Technology. The IRIS user-wearable radio unit 100 comprises multiple VHF and UHF devices including FirstNet cellular radio subsystem 220 (758-805 MHz); GNSS-2 radiolocation system (1170-83 and 1222-33 MHz); and mesh network subsystem 240 (2400-2484 MHz). In addition, a software-configurable smart antenna system 270 may be used to digitally redefine the electric field antenna substructures in smart antenna system 270 for operation in the 785-2484 MHz range, to support the antenna structures for the IRIS radio 100.
[0059] An optional conventional radio subsystem 235 such as the Project 25 digital 2-way Radio (P25) may be integrated in addition to or in place of the 150-480 MHz Legacy band VHF/UHF radio subsystem 230 to provide both voice backup and enable coordination with other emergency responder services utilizing the traditional safety radio network bands.
Speech Compaction Technology
[0060] In FIG. 2, in one embodiment of the IRIS user- wearable radio unit 100, the FirstNet cellular radio subsystem 220 and optional legacy band radio subsystem 230 voice communications are transmitted directly as these technologies already are designed for good quality analog or digital speech at their VHF or UHF operating frequencies.
[0061] The narrow bandwidth of the LF radio subsystem 250 and the small digital packet size of the Mesh radio subsystem 240 make them not well suited to standard voice communications. An alternative method of voice message transmission as a compacted Reed-Solomon error correction digital packet protocol is now specified which conforms to the data transmission capabilities of both UHF IEEE 802.15.4 mesh radio subsystem 240 and the 10kHz to 300 kHz LF radio.
[0062] Standard analog telephony uses an audio bandwidth of 3.5 to 4.0 kHz. The LF radio subsystem 250 is modulation bandwidth limited by the low carrier frequency to much less than 4 kHz, typically limited to about 2.6 kHz as used in the HeyPhone 87 kHz single sideband (SSB) cave rescue radio, limiting analog speech quality.
[0063] Mesh network radio subsystem 240 requires all transmissions be digital packets of approximately 102 bytes (816 bits) payload or less. Conventional VOIP digitized voice transmission is out of the question on the shared mesh network radio subsystem 240. The novel strategy used to send voice messages herein meeting both the radio bandwidth and short packet length limitations of an IRIS user- wearable radio system 100 is as follows: 1) convert the audio speech to text using Phoebe speech recognition 150 software included in secondary Phoebe instance 112 using standard speech recognition software appropriate for the language used by the user 5, producing either an English or 7-bit ASCII phonetic text representation; 2) use multiple ASCII text compaction technologies to convert the text into a byte string representation to produce the text message bits; 3) encode those message bits into a sequence of GF(103) Reed- Solomon Error-Corrected short digital speech packets; 4) transmit the sequence of digitized speech packets over one or more of the IRIS radio subsystems 220, 230, 235, 240, 250.
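The four steps above can be sketched end to end as a short pipeline. The speech-to-text and text-compaction functions are placeholders for the Phoebe components described elsewhere in this section, and the widely available reedsolo codec (operating over GF(2^8)) stands in for the GF(103) Reed-Solomon codewords named in the specification purely to show the packet framing; all names and the parity size are assumptions.

```python
from reedsolo import RSCodec   # third-party package; any Reed-Solomon codec would do

MESH_PAYLOAD = 102             # bytes per mesh packet (header + data), per this section
PARITY_BYTES = 16              # assumed error-correction overhead per packet

def voice_to_packets(audio, speech_to_text, compact_text) -> list:
    text = speech_to_text(audio)                      # step 1: speech -> text
    message = compact_text(text)                      # step 2: text -> compacted bytes
    rs = RSCodec(PARITY_BYTES)
    chunk = MESH_PAYLOAD - PARITY_BYTES
    packets = []
    for i in range(0, len(message), chunk):           # step 3: error-corrected packets
        packets.append(bytes(rs.encode(message[i:i + chunk])))
    return packets                                    # step 4: hand off to a radio subsystem
```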
[0064] In FIG. 4, Phoebe speech recognition capability 150 is interfaced with the speech to text conversion subsystem 160, which comprises text compaction subsystem 162. Operation is based on the user 5 secondary Phoebe instance 110 receiving the user’s 5 speech through headset 120 and IRIS device controller 260. In speech to text conversion 160, utilizing text lookup engine 161, input speech is converted into text. Speech to text conversion subsystem 160 compacts the text to form compacted voice packets 168, encoded using GF(103) Reed-Solomon Error Correction for data correction in the event of corruption during radio transmission.
[0065] In FIG. 2, the compacted voice packets 168 may be routed through communications radio controller 200 for transmission on one or more of IRIS radio subsystems 220, 230, 235, 240, 250 in a message sequence of one or more digital packets.
[0066] On receipt of such a message sequence from any of the IRIS radio subsystems, any receiving IRIS user-wearable radio unit 100 then uses GF(103) Reed-Solomon Error-Correction technology to recover the voice message bits within each 102 byte digital speech packet. It then concatenates the sequence of message bits to recover the complete compacted digital voice message. It decompacts the sequence of voice message bits into text. Then IRIS device controller 260, controlled by secondary Phoebe instance 110, uses text to speech subsystem 155 to decompact the digital packet sequence and regenerate the sequence into speech and send through IRIS device controller 260 to the user’s 5 headset 120.
Example of Voice Compaction Usage
[0067] The voice compaction strategy and its benefit in user communication is more easily understood by an example. Consider the following emergency message, spoken by the affected user 5 into their headset 120, which secondary Phoebe instance 110 determines must be transmitted as an emergency message over any available radio channel including the mesh radio network 240 and LF radio 250: “Mayday I’m trapped in the Mercer building 21st floor collapsed and I fell 2 floors trapped under debris I lost my left boot and I think my ankle is broken. ”
Statistics of this message are: if transmitted as audio, 20 seconds of >3 kHz radio bandwidth; and if transmitted using a VOIP digital protocol, 640 kB of data. If the message is converted using speech to text technology, the message statistics are: 132 total ASCII characters (including spaces), and 27 total words, including the one-character words like “I”.
[0068] Two simple encoding technologies are available which may be simply implemented in IRIS user-wearable radio unit 100 to compact almost every word in the above text message, regardless of its length, to no more than 2 bytes. A useful simplifying assumption is that each word ends in a space character, so that the spaces between words do not need to be explicitly encoded.
[0069] The first encoding strategy is dictionary-based compaction and may be used for words of two letters or more, including their trailing space. Here the dictionary size is an 8,192 word optimized library 164, selected for the work focus of users 5 of a first responder team using the IRIS user-wearable radio unit 100. All common words, terms and acronyms used by users 5 are included in optimized first responder library 164.
[0070] The second encoding strategy may use a modified version of US ASCII 7-bit encoding which may be used for characters such as numbers and letters, including spelling of words not in the dictionary. The modified ASCII table 165 is shown in Fig 5. The byte encoding starts with a leading 0 followed by the upper 3 bits (U3B) column index and the lower 4 bits (L4B) row index. Note that the first 32 positions, having 3 bits (U3B) column index equal to 0 or 1, have been replaced by common single-character and 2-character words, and a few statistically-frequent common 3-letter words like “THE” and “AND” together with their trailing space, enabling common words to be encoded in 1 byte.
[0071] In the algorithm described below, 1 byte is used for each letter, 1 or 2 bytes are typically used to encode each word. An initially empty output byte string is provided for construction of the codeword string used in the digital packet communication.
[0072] The voice to text and speech compaction 160 algorithm proceeds as follows: Step 1, a text lookup engine 161 in speech to text conversion subsystem 160 scans the next text sequence and tests whether it is a word in the optimized library 164 or a special ASCII codeword from table 165 shown in FIG. 5. Step 2, if it is a word in optimized library 164, then its 15-bit library index plus a leading “1” is formed as two bytes, both bytes are copied to the output byte string, and processing returns to Step 1. Step 3, if it is a special ASCII codeword in modified ASCII table 165, then its 7-bit index to the modified 7-bit ASCII table, with a leading zero, is copied to the output byte string as one byte, and processing returns to Step 1. Step 4, use other ASCII sequences 166 for encoding anything else. Other ASCII sequences 166 utilize standard ASCII 7-bit encoding on a character-by-character basis. This includes sequences of numbers and letters, including spelling of unfamiliar words and acronyms. Upon completion of the encoding sequence, a trailing space is encoded, and processing returns to Step 1. Step 5, continue Steps 1-4 until the text sequence for the entire voice message is encoded. Then form compacted voice packets 168 with Reed-Solomon Error Correction to complete the digital packet formation sequence.
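A minimal sketch of the word-encoding loop in Steps 1-4 follows; it uses a tiny stand-in dictionary and word table rather than the 8,192-word optimized library 164 or the modified ASCII table 165, and the index values are invented for illustration. A function like this could also serve as the compact_text placeholder in the earlier pipeline sketch.

```python
# Hypothetical stand-ins: LIBRARY plays the role of optimized library 164 (word ->
# 15-bit index), and SHORT_WORDS plays the role of the first 32 positions of the
# modified ASCII table 165 (common words in codes below 0x20).
LIBRARY = {"TRAPPED": 0x0123, "BUILDING": 0x0456, "FLOOR": 0x0789}
SHORT_WORDS = {"I": 0x00, "IM": 0x01, "THE": 0x02, "AND": 0x03}

def compact_text(text: str) -> bytes:
    out = bytearray()
    for word in text.upper().split():
        if word in LIBRARY:                        # Step 2: leading '1' bit + 15-bit index
            out += (0x8000 | LIBRARY[word]).to_bytes(2, "big")
        elif word in SHORT_WORDS:                  # Step 3: 1 byte, leading '0' bit
            out.append(SHORT_WORDS[word])
        else:                                      # Step 4: spell out in 7-bit ASCII
            out += word.encode("ascii", "replace")
            out.append(ord(" "))                   # trailing space closes the spelled word
    return bytes(out)

print(len(compact_text("I TRAPPED THE BUILDING FLOOR")))   # 1 + 2 + 1 + 2 + 2 = 8 bytes
```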
[0073] The exemplary Mayday message above may be viewed in the light of this voice to text and speech compaction method, as follows. Un-bolded words are 2-byte indices to optimized library 164. Bold text is encoded using single-byte modified ASCII table 165 or characters are encoded using other ASCII sequences 166. The count of required bytes for each word encoded is displayed below the text.
[Table from the original publication: the example Mayday message annotated with the byte count required for each encoded word.]
[0074] Using the methods described above, the compacted speech encoding requires 56 bytes. This compares with 132 bytes for the original text version of the message. The compacted digital message is only 42% of the length of the uncompacted digital message. The 56-byte compacted speech message plus Reed-Solomon Error Correction bytes will fit within the 102-byte (header + data) payload of a single-packet mesh network RSEC error-corrected message. The original voice message was 20 seconds, equivalent to a 640 kB VOIP transmission. The 102-byte compacted, error-corrected text version packet for the voice message is approximately 0.016% of the byte length of the digitized VOIP. The digital packet is easily reconstructed back to both text for display and voice using a corresponding reverse dictionary in text to speech 155.
[0075] The 102-byte compacted message single packet can be transmitted over a 1200 bps LF radio channel in less than a second, as compared to 20 seconds of analog audio or 640kB of VOIP. This, together with time-division multiplexing, allows multiple users to get their clearly- identified emergency voice messages through almost simultaneously, as well as individually receive emergency support messages. The speech recognition and speech generation subsystem 150 decodes voice input from the user for recognition by the secondary Phoebe instance 110. Each voice message is converted to a highly compacted text form using speech recognition and speech generation system 150 together with voice to text and text compaction system 160 and sent as a compacted speech data packet through the communications radio controller 200 on an available channel to the base station radio unit 10. This allows archiving of all voice communications as compacted data packets by the base station radio unit 10 in the cloud 40.
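The size and timing figures in the last two paragraphs follow from simple arithmetic, reproduced here for clarity.

```python
packet_bits = 102 * 8                     # one compacted, error-corrected packet
lf_rate_bps = 1200
print(packet_bits / lf_rate_bps)          # 0.68 s over the LF channel, i.e. under a second

compacted_bytes, original_bytes = 56, 132
print(compacted_bytes / original_bytes)   # ~0.42, i.e. about 42% of the plain-text size
```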
[0076] If no direct voice communication is available with a given user 5 through the cellular FirstNet radio subsystem 220 or the optional legacy radio subsystem 230, but a single hop or multi-hop mesh network radio subsystem 240 link is available, then compacted speech packets may be relayed both ways through the mesh network radio subsystem 240 as well. In a similar manner, compacted speech data packets may be used by the LF Emergency radio subsystem 250 as backup to the mesh network radio subsystem 240. LF Emergency radio subsystem 250 also is used in a Mayday situation to ensure emergency voice as well as data communication with an injured user. Responses from the Phoebe instance 110 are converted into synthetic speech by the speech recognition and speech synthesizer subsystem 150 and played in the user’s headset 120.
GNSS-2 Enhanced GPS Augmented with Inertial Navigation
[0077] Each user-wearable radio unit 100 further comprises an enhanced GPS location system configured to provide a user location comprising a latitude position, a longitude position and an elevation, and additionally comprises a GNSS-2 receiver using GPS L2 and L5 frequencies. The use of L2-L5 GNSS-2 enhanced GPS system 210 using the dual frequency L2 and L5 enhanced GPS system gives more accurate position information, particularly in elevation. GNSS-2 global positioning (“dual band GPS”) enhanced GPS rollout will be complete in the US in 2021. There are European, Russian, Chinese, and Indian satellite geolocation system equivalents. New GNSS-2 integrated circuit systems and ceramic dual-band antennas are available off the shelf for immediate implementation in GNSS-2 System 210 and smart multiple antenna system 270.
[0078] The use of the GPS L2 with supplementary L5 channel provides a potential accuracy of better than ±1 m in (x, y) position and ±1.5 m in z (elevation). Note that actual GNSS-2 position accuracy is based on the number of satellites actually seen at the time of measurement. GNSS-2 receivers estimate the error magnitude and report it in their data output.
[0079] The biggest problem in measuring elevation with GNSS-2 measurement 210 is determining the local GPS "sea-level anomaly" in elevation measurement on land, due to local variations in altitude and gravity. The base station radio unit 10 may provide all users with a local ground-level zero-reference elevation correction for the incident site, even for an extended site such as might exist in a wildfire. Corrected GPS elevation measurements enable better determination of what floor (elevation) the user is on in a multi-story building, or better location of an incident that includes a plurality of different elevations as might occur in a wildfire, avalanche, or extended-area natural or man-made disaster or accident scene.
[0080] In one embodiment of the IRIS wireless communications system 1, in the IRIS radio units 2, the elevation of each radio unit is reported relative to a reference elevation comprising an incident location ground-level zero-elevation correction, and the incident location ground-level zero-elevation correction is broadcast by the base station radio unit 10 to the user-wearable radio units 100 and the incident command radio unit.
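As a hedged illustration of applying the broadcast zero-reference correction, the short sketch below estimates which floor a user is on from a corrected GNSS-2 elevation. The 4 m storey height and the example elevations are assumed values for illustration only, not part of this disclosure.

```python
# Illustrative floor estimate from a GNSS-2 elevation and the broadcast
# ground-level zero-reference elevation for the incident site.

def floor_estimate(gnss_elevation_m: float, site_ground_elevation_m: float,
                   storey_height_m: float = 4.0) -> int:
    """Return an estimated floor number (1 = ground floor), given an assumed storey height."""
    height_above_ground = gnss_elevation_m - site_ground_elevation_m
    return max(1, int(height_above_ground // storey_height_m) + 1)

# Example: user reports 197.8 m, incident site ground reference is 182.3 m.
print(floor_estimate(197.8, 182.3))   # 15.5 m above ground -> estimated 4th floor
```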
[0081] The GNSS-2 location data is reported as follows. The latitude and longitude are reported as signed 10-digit decimal degrees, with 7 digits of precision. The elevation (z) is reported as decimal meters, 5 digits with 1-decimal digit precision. The error radius (E) is reported as decimal meters, 5 digits with 1-decimal digit precision. The time is reported as HHMMSS, integer hours, minutes and seconds. A flag is reported for either the last known GNSS-2 position or a physically known position. In addition to the GNSS-2 position measurement, the last physically known position of the user 5 may be known relative to any identified reference point for which a latitude, longitude, and elevation have been previously accurately identified. Position correction is not unlike how automobiles correct their GPS to the last known physical position when turning a corner, since the corner has an accurately-measured map position.
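The following sketch illustrates one way such a location report could be serialized with the stated precisions. The field order, delimiters and flag characters are assumptions for illustration only and do not represent the actual packet format.

```python
# Hypothetical serialization of a GNSS-2 location report with the precisions above.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class LocationReport:
    lat_deg: float         # signed decimal degrees, 7-digit precision
    lon_deg: float
    elevation_m: float     # decimal meters, 1-decimal precision
    error_radius_m: float  # estimated error magnitude reported by the receiver
    last_known_only: bool  # flag: last known GNSS-2 fix vs. physically known position

    def encode(self) -> str:
        t = datetime.now(timezone.utc).strftime("%H%M%S")   # HHMMSS
        return (f"{self.lat_deg:+.7f},{self.lon_deg:+.7f},"
                f"{self.elevation_m:.1f},{self.error_radius_m:.1f},"
                f"{t},{'L' if self.last_known_only else 'P'}")

print(LocationReport(41.8781136, -87.6297982, 182.3, 1.2, False).encode())
```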
[0082] Within a large building, whether a warehouse or a high-rise office building, GNSS-2 satellite links may not always be available. The enhanced GPS location system further comprises a 3-axis inertial navigation system providing a 3-axis dead-reckoning position correction from the last known GNSS-2 location of the user 5. Three-axis micromechanical accelerometers, such as those used by companies such as Trimble and Xsens in small modules, are used by inertial navigation system 280 to detect accelerations associated with position change from the last known GNSS-2 location provided by enhanced GPS location system 210 and to integrate the motions to estimate the position offsets for an accurate estimated position. Local inertial navigation position offsets may be reported as (Δx, Δy, Δz), each as 5 digits with 1-decimal digit precision relative to the last known position. Using this supplementary inertial navigation correction to the GNSS-2 location, the reported user location comprises the last known measured GNSS-2 location or the last identified physical location of the user 5 as corrected with a position offset from the 3-axis inertial navigation system.
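A minimal sketch of the dead-reckoning idea follows, assuming ideal, bias-free accelerometer samples already expressed in a fixed frame. A real inertial navigation system 280 would also perform sensor fusion, bias compensation and frame rotation, which are omitted here for brevity.

```python
# Illustrative dead-reckoning: double-integrate 3-axis accelerations sampled at
# intervals of dt seconds to estimate (dx, dy, dz) offsets from the last known fix.

def integrate_offsets(accel_samples, dt):
    """accel_samples: iterable of (ax, ay, az) in m/s^2; returns (dx, dy, dz) in m."""
    vel = [0.0, 0.0, 0.0]
    pos = [0.0, 0.0, 0.0]
    for a in accel_samples:
        for i in range(3):
            vel[i] += a[i] * dt
            pos[i] += vel[i] * dt
    return tuple(round(p, 1) for p in pos)   # 1-decimal precision, as reported above

# Example: two seconds of gentle forward acceleration sampled at 100 Hz.
samples = [(0.5, 0.0, 0.0)] * 200
print(integrate_offsets(samples, dt=0.01))   # roughly (1.0, 0.0, 0.0) meters
```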
[0083] Base station radio unit 10 and incident command center 50 may utilize local or cloud 40 software to create a three-dimensional map of the locations of all or selected users 5. Companies like MapsIndoors already offer smartphone apps built on Google Maps which have been or may be applied to large indoor structures such as stadiums, convention centers, campuses and large corporate office or hospital complexes to allow personal navigation.
[0084] Where no existing indoor map of an incident site exists, the base station radio unit 10, knowing its own relative location, can utilize all the incoming location packets from the user-wearable radio unit 100 of each user 5 to reconstruct the incident site. Site mapping for personnel location is easier in outdoor incidents such as wildfires, where accurate terrain maps are available and pre-stored in cloud 40. Reconstruction of indoor sites is aided when even basic building plan information is known, such as the location of the building walls and entrances, and the elevation of each floor, ideally pre-stored in cloud 40. The more detail that is known about each building structure, the more accurate an assessment can be made of where the users 5 are in that structure, and their level of risk. In the case of rescue support users 5, having a personnel site map helps determine which users 5 are in the best position to support the rescue of a user 5 requiring assistance, and which users 5 are still critical to control the incident and must be kept in place. All this is driven by accurate knowledge of the position of each user's IRIS user-wearable radio unit 100 and its continual position updates sent to the base station radio unit 10.
[0085] The micromechanical 3-axis accelerometers used in inertial navigation system 280 also detect any sudden acceleration of the user 5 such as caused by a fall, or being struck by an object. Analysis by the secondary Phoebe instance 110 of the artificial intelligence software 15 of the user 5 then determines whether or not to automatically issue a Mayday message and activate LF radio 250 (if not already active).
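For illustration, a simple threshold-based fall detector over accelerometer magnitudes is sketched below. The thresholds and window are assumptions only; an actual secondary Phoebe instance 110 would apply more sophisticated analysis and corroborating data before deciding whether to issue a Mayday.

```python
# Minimal sketch of fall / impact detection from 3-axis accelerometer magnitude.
# FREEFALL_G and IMPACT_G are illustrative thresholds, not validated values.

import math

FREEFALL_G = 0.3    # near-zero g suggests free fall
IMPACT_G   = 3.0    # large spike suggests impact or being struck by an object

def detect_fall(accel_g_window):
    """accel_g_window: recent (ax, ay, az) samples in units of g."""
    mags = [math.sqrt(ax*ax + ay*ay + az*az) for ax, ay, az in accel_g_window]
    saw_freefall = any(m < FREEFALL_G for m in mags)
    saw_impact   = any(m > IMPACT_G for m in mags)
    return saw_freefall and saw_impact

# Standing still, then a brief free-fall period, then a hard impact.
window = [(0.0, 0.0, 1.0)] * 5 + [(0.05, 0.0, 0.1)] * 8 + [(2.5, 1.8, 2.2)]
print("possible fall:", detect_fall(window))   # True -> escalate for Mayday review
```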
The Health and Environmental Status Subsystem
[0086] In FIG. 2, the health and environmental assessment system 170 collects and forms data packets of the user status comprising primary life sign and life support data, comprising a plurality of user health vital signs comprising a heart rate, a heart rhythm, a blood pressure, a core body temperature, a respiration rate, and an %Oxygen saturation (PO2), and user 5 environment data that comprises a user location comprising a latitude position, a longitude position and an elevation from each user 5 for transmission to the base station radio unit 10. Health and environmental assessment system 170 is connected remotely to body-mounted biometric and physiological sensors 130 and physical environment sensors 140.
[0087] Data from biometric and physiological sensors 130 attached to the user 5, say in the form of individual sensors, a smart wristwatch, a smart ring, smart clothing, or a self-adhesive body sensor, collect parameters including heart rate, heart rhythm (e.g., EKG), core body temperature, respiration rate and percent oxygen saturation (PO2).
[0088] Data from physical and chemical sensors 140 may include life support data from either a fireman's Self-Contained Breathing Apparatus (SCBA) or a diver's SCUBA remaining air pressure sensors; and suit- or apparatus-mounted ambient temperature and toxic gas sensors, including carbon monoxide and hydrogen cyanide sensors.
These data are combined in health status and environmental measurement subsystem 170 to form error-corrected health status data packets. These data packets are nominally sent several times per minute to the base station radio unit 10, typically through the UHF mesh network radio 240 and/or LF radio 250.
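The sketch below shows, purely as an assumption-laden example, how such a periodic status packet might be assembled on the user-wearable radio unit. The JSON field names and the CRC stand in for the actual packet layout and Reed-Solomon error correction, which are not specified here.

```python
# Illustrative assembly of a periodic health / environment status packet.
# Field names, units, and the CRC trailer are assumptions for the example.

import json, time, zlib

def build_status_packet(user_id, vitals, environment):
    body = {
        "user": user_id,
        "t": int(time.time()),
        "vitals": vitals,        # heart rate, respiration rate, SpO2, core temperature, ...
        "env": environment,      # remaining air, ambient temperature, toxic gas readings
    }
    raw = json.dumps(body, separators=(",", ":")).encode()
    return raw + zlib.crc32(raw).to_bytes(4, "big")   # checksum in place of RSEC bytes

pkt = build_status_packet(
    "gus",
    {"hr": 92, "rr": 18, "spo2": 97, "core_c": 37.4},
    {"air_pct": 27, "ambient_c": 61, "co_ppm": 12},
)
print(len(pkt), "bytes queued for UHF mesh / LF transmission")
```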
Use of a Distributed Artificial Intelligence System
[0089] Artificial intelligence software 15 is utilized to determine the user status and situational status of each user 5 using a user-wearable radio unit 100.
[0090] There are both voice communication and nonvoice datalinks within wireless communications 60 between each secondary Phoebe instance 110 of user-wearable radio unit 100 and the primary Phoebe instance 30 in the base station radio unit 10. The primary Phoebe instance 30 also coordinates any necessary emergency communication directly between the secondary Phoebe instance 110 of a user 5 that requires assistance, such as a user 5 with a life-threatening condition, and the incident command Phoebe instance 85 and support infrastructure, to support the operations of and/or rescue of such a user 5.
[0091] The uses of the artificial intelligence software 15 capabilities of the Phoebe instances 30, 85, 110 are manifest. They have the ability to perform continual artificial intelligence analysis of the health and situational conditions and any danger to the users 5 of user-wearable radio units 100, and to provide advice both to these users 5 and to the users 5 directing the response to the incident from the base station radio unit 10 and the incident command radio unit 50 as to actions to mitigate the health or situational risk, based on the user status and situational status information received from the users 5 of the user-wearable radio units 100.
[0092] Since the user 5 of a user-wearable radio unit 100 often may not be in direct contact with another team member, any medical or situational problem occurring to them may go unobserved by other first responder users 5. This is where the distributed artificial intelligence software 15 capabilities of the respective Phoebe instances 30, 85, 110 are most useful. The user 5 of a user-wearable radio unit 100 is always being monitored by the secondary Phoebe instance 110, which monitors the user status and situational status both through sensors that continuously collect and communicate data and through direct voice communication with the user 5. The secondary Phoebe instance 110 of the user 5 has the analytic capability of interpreting all this user status and situational status data in real time, determining whether the user 5 is at medical risk (e.g., is unconscious, or has a cardiac arrhythmia) or at situational risk (e.g., has fallen, or is low on remaining air), and communicating to the users 5 operating the base station radio unit and/or the incident command radio unit both the analyzed problem and a recommended action to mitigate the risk.
[0093] In one embodiment of the IRIS wireless communications system 1, the secondary Phoebe instance 110 of the artificial intelligence software 15 in each user-wearable radio unit 100 is configured to: 1) determine a user status comprising a plurality of user health vital signs comprising a heart rate, a heart rhythm, a blood pressure, a core body temperature, a respiration rate, and an %Oxygen saturation (PO2), and a user location comprising a latitude position, a longitude position and an elevation; 2) determine a situational status comprising a remaining air supply amount, an ambient temperature, a toxic gas indication, a user body position, and a fall detection indication; and 3) communicate the user status and the situational status to the primary instance of the artificial intelligence software of the base station radio unit.
[0094] Of particular concern are life-threatening conditions, where the secondary Phoebe instance of the user 5 may also communicate an emergency condition to the primary Phoebe instance 30 and/or the incident command Phoebe instance 85.
[0095] In one embodiment of the IRIS wireless communications system 1 further comprising an incident command radio unit 50 comprising an incident command instance 85 of the artificial intelligence software 15 configured to provide two-way voice and/or data communication with the primary instance 30 of the artificial intelligence software, the secondary instance 110 of the artificial intelligence software 15 in each user-wearable radio unit 100 is configured to: 1) analyze the user status for a life-threatening condition of the user 5 comprising a heart arrhythmia condition, a respiratory failure condition, a heart failure condition, a loss of consciousness condition, or a combination thereof; and 2) if a life-threatening condition of the user is determined, communicate the life-threatening condition of the user as an emergency condition to the primary instance 30 of the artificial intelligence software 15 in the base station radio unit 10, wherein the primary instance of the artificial intelligence software is configured to communicate the life-threatening condition of the user and the emergency condition to the incident command instance 85 of the artificial intelligence software in the incident command radio unit 50.
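A rule-based sketch of this life-threat screening is given below. The numeric thresholds are illustrative assumptions only and do not represent validated clinical criteria or the actual analytics of the artificial intelligence software 15.

```python
# Illustrative rule-based screening of vitals for life-threatening conditions,
# followed by escalation to the primary instance. Thresholds are assumptions.

def assess_life_threat(vitals):
    flags = []
    hr = vitals.get("hr", 0)
    if hr < 40 or hr > 160:
        flags.append("possible arrhythmia")
    if vitals.get("rr", 0) < 6:
        flags.append("possible respiratory failure")
    if vitals.get("spo2", 100) < 85:
        flags.append("severe hypoxia")
    if not vitals.get("responsive", True):
        flags.append("loss of consciousness")
    return flags

def escalate(user_id, flags, send_to_base):
    """If any condition is flagged, report an emergency condition to the primary instance."""
    if flags:
        send_to_base({"user": user_id, "emergency": True, "conditions": flags})

escalate("gus",
         assess_life_threat({"hr": 38, "rr": 5, "spo2": 82, "responsive": False}),
         send_to_base=print)   # prints the emergency report in this example
```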
[0096] Since the primary Phoebe instance 30 in the base station unit 10 maintains the location of all user-wearable radio units 100, it can provide a list of a group of users 5 with user locations proximate the user 5 experiencing an emergency condition to the incident command center 50 that are positioned to assist and/or rescue the user 5 experiencing the emergency condition.
[0097] In one embodiment of the IRIS wireless communications system 1, upon the primary Phoebe instance 30 of the artificial intelligence software 15 in the base station radio unit 10 receiving communication of an emergency condition report from any secondary instance of the artificial intelligence software, the primary instance 30 of the artificial intelligence software 15 in the base station radio unit 10: 1) determines a group of other user-wearable radio units 100 and respective other users 5 that are in close proximity to the user 5 with the life-threatening condition; and 2) communicates the group of other user-wearable radio units 100 and respective other users 5 to the incident command instance 85 of the artificial intelligence software 15 of the incident command radio unit 50. The user 5 of the incident command radio unit 50 may then select a user or users 5 from the group to provide assistance to the user 5 that is experiencing the life-threatening condition, to triage the condition and effect rescue of the afflicted user 5.
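For illustration, the sketch below ranks other users by three-dimensional distance from the afflicted user using the primary instance's location table, so that users on other floors are ranked correctly. The flat-earth degree-to-meter conversion and the 30 m radius are simplifying assumptions.

```python
# Illustrative selection of nearby responders from a {user: (lat, lon, elev)} table.

import math

def nearby_users(locations, target_id, radius_m=30.0):
    """Return (user, distance_m) pairs within radius of the target, nearest first."""
    tlat, tlon, telev = locations[target_id]
    m_per_deg = 111_320.0                     # rough meters per degree of latitude
    out = []
    for uid, (lat, lon, elev) in locations.items():
        if uid == target_id:
            continue
        dx = (lon - tlon) * m_per_deg * math.cos(math.radians(tlat))
        dy = (lat - tlat) * m_per_deg
        dz = elev - telev
        d = math.sqrt(dx*dx + dy*dy + dz*dz)
        if d <= radius_m:
            out.append((uid, round(d, 1)))
    return sorted(out, key=lambda x: x[1])

locs = {"gus":  (41.87810, -87.62980, 185.0),
        "anna": (41.87822, -87.62990, 185.0),
        "lou":  (41.87825, -87.62960, 182.0),
        "tony": (41.87900, -87.63100, 182.0)}
print(nearby_users(locs, "gus"))   # e.g. [('anna', ...), ('lou', ...)]
```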
[0098] The primary instance 30 of the artificial intelligence software 15 is effectively a real-time database of the user status, including the health and location, and situational status of all user-wearable radio unit 100 users 5. The primary instance 30 of the artificial intelligence software 15 may also be used in a strategic way, by providing consolidated and analyzed data on the user status, including the health and location, and situational status of all user-wearable radio unit 100 users 5 to the incident command radio unit 50, and by using its artificial intelligence capability to make recommendations to the incident command radio unit 50 and the commanding user 5.
[0099] In one embodiment of the IRIS wireless communications system 1 further comprising an incident command radio unit 50 comprising an incident command instance 85 of the artificial intelligence software 15 configured to provide two-way voice and/or data communication with the primary instance 30 of the artificial intelligence software 15, the primary instance of the artificial intelligence software 15 analyzes the user status and the situational status of a plurality of users 5 and provides a recommendation to the incident command instance 85 of the artificial intelligence software 15 to deploy or redeploy the users 5 to predetermined user locations. This may, for example, comprise an initial deployment of users 5 at an incident where the users are deployed based on a predetermined skill set accumulated and stored by the artificial intelligence software 15, based on an assessment of the incident by the user 5 of the base station radio unit 10, such as a team leader, or by the user 5 of the incident command radio unit 50, such as an incident commander. In another example, this may comprise a redeployment of users 5 at an incident where the users are redeployed based on a predetermined skill set accumulated and stored by the artificial intelligence software 15, based on an assessment of changed conditions at the incident by the user 5 of the base station radio unit 10, such as a team leader, or by the user 5 of the incident command radio unit 50, such as an incident commander.
Exemplary Use of Phoebe and Artificial Intelligence
[00100] To more fully understand Phoebe's use by a user 5, the following exemplary usage scenario is given in which firemen users 5 utilize the IRIS radio system 1 and virtual assistants Phoebe 12, 87, 112 in the course of a typical fire incident. Here a 6-person group of users 5 comprising firemen headed by team leader Tony, and also including Ramiro, Gus, Tiffany, Lou and Anna, forms a single group of users 5 on a radio subnet. It is assumed that all subnet voice and/or data communications are being passively monitored by the respective users 5 of the base station radio unit 10 and incident command center 50 through their respective primary and incident command Phoebe instances 30, 85, and archived to cloud 40. Initially it is assumed that all users 5 are operating in a chat group with Tony, the team leader.
[00101] This scenario is the voice stream inside Gus’s helmet and his support from his personal secondary Phoebe instance 110. For clarification:
1. Voice commands to Gus's personal secondary instance 110 of Phoebe, each ending in the action command now, and voice responses from Phoebe are bolded.
2. Note that all others in the chat group's communications with their own personal secondary instance 110 of Phoebe will also be heard, but in each case, they are communicating only with their personal instance 110 of Phoebe, not Gus's Phoebe instance.
3. Interactions with Phoebe by others outside Gus’s chat group not heard by Gus will be italicized.
4. In the scenario below, the ellipsis . . . indicates an unspecified passage of time.
[00102] Note that each person's Phoebe instance 110 thus acts for its IRIS user-wearable radio unit 100 user 5 only. For example, the only time that Gus's Phoebe instance 110 would accept a command is when Gus commands Phoebe ... now; it does not act when Phoebe's name is heard over Gus's user-wearable radio unit, as in a chat group where a Phoebe command is spoken by another, such as Tony.
[00103] Exemplary scenario:
Message from Tony: Gus and Tiffany, take a 1¼ inch hose and go in through the west door
Gus: Phoebe tell Tony and Tiffany we will do now
Gus: Phoebe start chat group with Tiffany now
Phoebe: Starting chat group with Tiffany only
Gus: Phoebe tell Pumper Truck I need a 100 foot 1¼ inch attack hose at west door for Tiffany to pick up now
Phoebe (to chat group): Pumper Truck confirms setting up 1¼-inch attack hose with handline nozzle to deliver to Tiffany at west door
Tiffany: I have the hose setup and will meet you at west door
Gus: Thanks, Tiff
Gus: Phoebe We are in stairwell. Tell pumper truck we are ready for water now
Phoebe: Confirming water start
Gus: Phoebe tell Tony and Command we are entering west door now
Gus: Phoebe tell Tony and Command first floor is secure. Moving up the stairs. Need 50 feet more hose now
Phoebe: Telling Command and Pumper Truck that you need 50 feet more hose.
Phoebe: Pumper Truck says must add hose to string. Water must shut down. Please wait.
Phoebe: 50 feet extra hose added. Ready to turn water back on.
Gus: Phoebe tell Pumper Truck to turn water on now
Phoebe: Confirming water is turned on
Gus: Tiffany and I are moving up burning stairwell.
Phoebe: (to chat group, Tony and incident command) Emergency! Stairwell collapsed under Gus. He fell 11 feet. Checking on Gus and Tiffany
Phoebe: Gus where are you? Are you Ok?
Gus: Trapped under stairwell wreckage. Think my left leg is broken. I just can’t move and am in pain. I feel blood. May be bleeding from leg.
Phoebe: Tiffany, are you ok?
Tiffany: Yes, I was not on the part that collapsed
Phoebe: (to chat group, Tony and incident command) Mayday declared for Gus only.
Broken leg, bleeding, trapped under wreckage.
Phoebe: Joining Command, Medical and Tony in chat group with Gus and Tiffany
Medical: Vitals appear Ok for situation. Gus, you are down to 27% remaining air. The atmosphere is toxic. Try to breathe more slowly to conserve air.
Command: Are you in any fire danger?
Tiffany: No, just burning debris. Gus is under some
Tony: Command, now on location with Gus and Tiffany. Gus was on stairwell when it collapsed trapping him. Tiffany did not fall and is OK. Gus is bleeding and thinks his left leg is broken. Clearing to get to his location; need more help. Phoebe which fellow users are nearest to Gus now?
Phoebe: Anna and Lou are approximately 70 feet away
Tony: Phoebe Ask Lou and Anna to assist me at Gus’ location now
Phoebe: Confirming that Lou and Anna are on the way to Gus’s location. Adding Lou and Anna to your chat group
Tony: Command, Lou and Anna are on site clearing with me and Tiffany.
Tiffany: Gus? Talk to me.
Tiffany: Medical, Gus is unconscious!
Anna: I reached Gus. Breathing but bleeding bad from fractured leg. Applying tourniquet.
Medical: Weak, slow pulse and respiration.
Medical: Medical evacuation is on the way. Phoebe Join medical evacuation user group with Tony’s chat group now
Phoebe: Joining medical evacuation with Tony’s chat group
Medical evacuation: On the way less than 3 minutes
Tony: Gus is clear and medical evacuation is with him
Medical evacuation: Phoebe Tell all that Gus has a compound fracture of lower left leg. Bleeding stopped by tourniquet; left lower leg is in air splint. On oxygen. Gus is in litter for extraction. Evacuation started.
Phoebe: Confirming Gus is stable and being extracted in litter
Tony: Phoebe end Gus’s Mayday status now
Phoebe: Mayday status ended for Gus
Tony: Phoebe leaving chat group. Connect me to Command only now
Phoebe (to Tony and Command only) Now connected for chat to Command only
Tony: Command, Gus's extraction complete. Ready for reassignment.
Command: Stand by.
[00104] In the above scenario, the entire distributed artificial intelligence system 15 with Phoebe instances 30, 85, 110 acts in multiple roles. The base station radio unit 10 primary Phoebe instance 30 coordinates with multiple individual groups of users 5 and their secondary Phoebe instances 110 as required in tasks such as forming and managing chat groups. The artificial intelligence system 15 in the base station radio unit 10 primary Phoebe instance 30 does all the coordination and communication with the pumper truck team, incident command and medical support, as outlined below:
1. Initially, the base station radio unit 10 primary Phoebe instance 30 is used as a primary radio relay for voice messages from Tony to Tiffany and Gus.
2. Gus's secondary Phoebe instance 110 requests base station radio unit 10 primary Phoebe instance 30 to coordinate with Tiffany's secondary Phoebe instance 110 to set up a virtual direct communications chat group.
3. Gus's secondary Phoebe instance 110 communicates with the base station radio unit 10 primary Phoebe instance 30 to provide coordination with the pumper truck team 80:
3.1. Independently, the artificial intelligence software 15 in base station radio unit 10 primary Phoebe instance 30 coordinates with other users 5 of pumper truck team 80 with their secondary Phoebe instances 110 to request a 100 foot 1¼ inch attack hose and nozzle to deliver to Tiffany at the west door.
3.2. The base station radio unit 10 primary Phoebe instance 30 coordinates with the other users 5 of the Pumper Truck 80 to turn on the water when Gus and Tiffany are in the stairwell and are ready.
3.3. Later, the base station radio unit 10 primary Phoebe instance 30 independently coordinates insertion of 50 feet more fire hose by the pumper truck 80 so that Gus and Tiffany can work their way up the stairwell.
4. Gus uses his secondary Phoebe instance 110 through base station radio unit 10 primary Phoebe instance 30 to relay a voice message to Tony and incident command radio unit 50 of their fire-fighting status on the stairwell.
5. Gus's secondary Phoebe instance 110 uses data from Gus's inertial navigation system 280 to detect that Gus has fallen 11 feet. Knowing that Gus was in the stairwell, Gus's secondary Phoebe instance 110 artificial intelligence software interprets this as a stairwell collapse.
6. Gus's secondary Phoebe instance 110 tells base station radio unit 10 primary Phoebe instance 30 to declare an emergency condition and independently takes care of the notification of Tony (as user 5 group leader) and the user 5 of the incident command radio unit 50.
7. Base station radio unit 10 primary Phoebe instance 30 enquires about the health and safety status of Gus and Tiffany.
8. Base station radio unit 10 primary Phoebe instance 30 of artificial intelligence software 15 interprets Gus's health and safety status voice description and declares a Mayday (life-threatening condition) for Gus only.
8.1. Base station radio unit 10 primary Phoebe instance 30 then notifies Tony (as leader), the user 5 of incident command radio unit 50 and the other users 5 of Medical Support 90.
8.2. Base station radio unit 10 primary Phoebe instance 30 commands Gus's secondary Phoebe instance 110 to activate IRIS LF radio 250 for emergency voice and health status communications.
8.3. Base station radio unit 10 primary Phoebe instance 30 then joins the incident command instance 85 of incident command radio unit 50 and medical support 90 in a chat group with Tony and the other people already in it.
8.4. Base station radio unit 10 primary Phoebe instance 30, using its user location database, determines a group of other user-wearable radio units and respective other users that are in close proximity to the user with the life-threatening condition, and communicates the group of other user-wearable radio units and respective other users to the incident command instance of the artificial intelligence software of the incident command radio unit.
9. Tony uses the distributed Phoebe artificial intelligence software to obtain additional help for Gus:
9.1. Tony’s secondary Phoebe instance 110 uses base station radio unit 10 primary Phoebe instance 30 to consult the 3D building map resource to determine where the nearest members of his group of users 5 are, in this case Lou and Anna.
9.2. Tony’s secondary Phoebe instance 110 then uses base station radio unit 10 primary Phoebe instance 30 to order the secondary Phoebe instances 110 of Lou and Anna to join him at Gus’s user location and add the secondary Phoebe instances 110 of Lou and Anna to Tony’s chat group.
9.3. Base station radio unit 10 primary Phoebe instance 30 sends confirmation to Tony’s secondary Phoebe instance 110 after notifying the secondary Phoebe instances 110 of Lou and Anna of the meeting location.
10. Base station radio unit 10 primary Phoebe instance 30 knows Gus’s user location and directs the medical evacuation to Gus’s user location.
11. Finally, Tony uses the collective Phoebe instances 30, 85, 110 to relay status to everyone involved that Gus has been rescued and, through base station radio unit 10 primary Phoebe instance 30, terminates Gus's Mayday communication status.
12. Tony then orders his secondary Phoebe instance 110 to have base station radio unit 10 primary Phoebe instance 30 remove himself (Tony) from the current chat group and connect him in chat mode only to the incident command instance 85 of the incident command radio unit 50.
[00105] In summary, by utilizing only voice control and response, the distributed hands-free Phoebe virtual assistant coordinated many routine tasks and eliminated much of the user 5 one-on-one messaging needed to set up communications and manage routine tasks such as getting a firehose set up and then extended 50 feet.
[00106] Once Gus fell, his inertial navigation system 280 and secondary Phoebe instance 110 identified what had occurred. Using base station radio unit 10 primary Phoebe instance 30, it coordinated the other users 5 of the support groups involved to activate the needed functions for Gus. Importantly, the respective Phoebe instances 30, 85, 110 also kept the channels of communication narrow, by building a dynamically reconfigurable chat group of just the individuals and support centers necessary at any given time and actually involved in resolving Gus's emergency condition, reducing extraneous communication with users 5 that were not needed to resolve his condition.
[00107] The distributed Phoebe artificial intelligence software 15 is optimized to identify, assess and take immediate action in unpredictable, dangerous and life-threatening situations. The strategy is that Phoebe's artificial intelligence software 15 preliminary life-threat assessment, delivered continuously, is always valuable. The IRIS user-wearable radio unit 100 secondary Phoebe instance 110 artificial intelligence provided to each user detects a critical change in any user's 5 monitored parameters, such as a fall, a breathing apparatus being out of air, an environmental change such as an explosion or flashover, or a medical emergency such as a gunshot, unconsciousness or heart attack. Then, the user's 5 secondary Phoebe instance 110 assesses the situation and reports to base station radio unit 10 primary Phoebe instance 30 to allow that artificial intelligence-generated assessment to be shared with the incident command radio unit 50 incident command Phoebe instance 85 and all relevant support functions. The distributed Phoebe artificial intelligence software can take real-time actions to communicate through the IRIS user-wearable radio unit 100, including gathering more situation-specific data and requesting a response from support functions. These data and actions are always monitored by the incident command radio unit 50 incident command Phoebe instance 85 and its user 5, such as incident command personnel, and can be modified by them as needed.
[00108] The hands-free IRIS wireless communication system 1 and the distributed Phoebe artificial intelligence software 15, operating over the radio subsystems disclosed herein, provide high-reliability, multiband radio reporting of the real-time location and health status of every user 5, and use the Phoebe artificial intelligence software-based analytics to coordinate responses to a wide variety of rapidly-changing incidents and to mitigate life-threatening situations.
Many modifications and other embodiments of the present disclosure set forth herein will come to mind to one skilled in the art to which this subject matter pertains, once having the benefit of the teachings in the foregoing descriptions and associated drawings. Therefore, it is understood that the subject matter of the present disclosure is not limited to the specific embodiments disclosed, and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purpose of limitation.

Claims

We claim:
1. A wireless communications system comprising a computer-controlled base station radio unit and a plurality of computer-controlled user-wearable radio units, each radio unit comprising a plurality of radio subsystems operating at a plurality of different frequencies to provide two-way computer-controlled transmission and receipt of voice and/or data communication between the base station radio unit and the user-wearable radio units, and wherein the base station unit comprises a primary instance of an artificial intelligence software; and each user-wearable radio unit comprises a secondary instance of the artificial intelligence software, and wherein collectively all instances of the artificial intelligence software communicate over the plurality of radio subsystems and comprise a distributed artificial intelligence communication system.
2. The wireless communication system of Claim 1 wherein the primary instance of the artificial intelligence software within the base station radio unit is configured for two-way voice and/or data communication with any of the secondary instances of the artificial intelligence software of the user-wearable radio units.
3. The wireless communication system of Claim 1 wherein the primary instance of the artificial intelligence software within the base station radio unit is configured for two-way voice and/or data communication with any selected group of the plurality of secondary instances of the artificial intelligence software in the user-wearable radio units.
4. The wireless communication system of Claim 1 wherein one secondary instance of the artificial intelligence software in the user-wearable radio unit may additionally provide two-way computer-controlled transmission and receipt of voice and/or data communication between at least one other secondary instance of the artificial intelligence software in at least one other user-wearable radio.
5. A wireless communications system comprising a computer-controlled base station radio unit and a plurality of computer-controlled user-wearable radio units, each radio unit comprising 3 or more radio subsystems capable of simultaneous operation at a plurality of different frequencies, the radio units configured for two-way computer-controlled transmission and receipt of voice and/or data communication between them.
6. The wireless communications system of Claim 1 or Claim 5 wherein one of the radio subsystems is configured to operate at a frequency between 10 kHz and 300 kHz utilizing a modulation method selected from the group consisting of amplitude modulation, single sideband modulation, phase modulation, frequency modulation, pulse modulation, or a combination thereof.
7. The wireless communications system of Claim 1 or Claim 5 wherein a second radio subsystem is configured to operate at a frequency above 1000 MHz utilizing the IEEE 802.15.4 Mesh Protocol.
8. The wireless communications system of Claim 1 or Claim 5 wherein at least one of the plurality of radio subsystems utilizes a Reed-Solomon error-correction digital packet protocol.
9. The wireless communications system of Claim 1 or Claim 5 wherein at least one of the plurality of radio subsystems comprises a software-defined radio receiver.
10. The wireless communications system of Claim 1 wherein each secondary instance of the artificial intelligence software in each user-wearable radio unit comprises a secondary virtual personal assistant, and the primary instance of the artificial intelligence software in the base station radio unit comprises a primary virtual personal assistant, and the primary virtual personal assistant and each secondary virtual personal assistant are configured to provide the respective user with a voice command mode and a voice response mode for operation of the respective radio units.
11. The wireless communication system of Claim 10 wherein the secondary virtual personal assistant in each user-wearable radio unit is configured to provide direct two-way computer-controlled voice and/or data communication with the primary virtual personal assistant of the base station radio unit.
12. The wireless communication system of Claim 11 wherein the secondary virtual personal assistant that is configured to provide two-way voice and/or data communication with the primary virtual personal assistant of the base station radio unit is configured as a secondary radio relay for two-way voice and/or data communication between the primary virtual personal assistant and another secondary virtual personal assistant.
13. The wireless communication system of Claim 10 further comprising an incident command radio unit comprising an incident instance of the artificial intelligence software and comprising an incident command virtual personal assistant that is configured to provide two-way voice and/or data communication with the primary virtual personal assistant, wherein the secondary virtual personal assistant in each user-wearable radio unit is configured to provide two-way voice and/or data communication with the incident command radio unit, and wherein the primary virtual personal assistant in the base station radio unit is configured as a primary radio relay for two-way voice and/or data communication between the secondary virtual personal assistant and the incident command virtual personal assistant.
14. The wireless communication system of Claim 13 wherein the secondary virtual personal assistant of one user-wearable radio unit is configured to provide two-way voice and/or data communication with the incident command virtual personal assistant of the incident command radio unit, using at least one other secondary virtual personal assistant and the respective user-wearable radio unit as a secondary radio relay and the primary virtual personal assistant of the base station radio unit as the primary radio relay.
15. The wireless communication system of Claim 10 wherein a mesh radio unit configured to provide two-way voice and/or data radio communications between a first user-wearable radio unit and one or a plurality of other user-wearable radio units is provided when the secondary virtual personal assistant of one respective user-wearable radio unit is configured to provide two-way voice and/or data communication with the secondary virtual personal assistant of at least one other respective user-wearable radio unit, each user-wearable radio unit acting as a mesh link in a mesh of the secondary radio relays.
16. The wireless communication system of Claim 1 wherein each user-wearable radio unit further comprises an enhanced GPS location system configured to provide a user location comprising a latitude position, a longitude position and an elevation.
17. The wireless communication system of Claim 16 wherein the enhanced GPS location system comprises a GNSS-2 receiver using GPS L2 and L5 frequencies.
18. The wireless communication system of Claim 16 wherein the elevation is reported relative to a reference elevation comprising an incident location ground-level zero-elevation correction, and the incident location ground-level zero-elevation correction is broadcast by the base station radio unit.
19. The wireless communication system of Claim 17 wherein the enhanced GPS location system further comprises a 3-axis inertial navigation system providing a 3-axis dead-reckoning position correction from the last known GNSS-2 location of the user.
20. The wireless communication system of Claim 19 wherein the reported user location comprises the last known good user location or the last physically known position of the user as corrected with a position offset from the 3-axis inertial navigation system.
21. The wireless communication system of Claim 1 wherein the secondary instance of the artificial intelligence software in each user-wearable radio unit is configured to: determine a user status comprising a plurality of user health vital signs comprising a heart rate, a heart rhythm, a blood pressure, a core body temperature, a respiration rate, and an %Oxygen saturation (PO2), and a user location comprising a latitude position, a longitude position and an elevation; determine a situational status comprising a remaining air supply amount, an ambient temperature, a toxic gas indication, a user body position, and a fall detection indication; and communicate the user status and the situational status to the primary instance of the artificial intelligence software of the base station radio unit.
22. The wireless communication system of Claim 21 further comprising an incident command radio unit comprising an incident command instance of the artificial intelligence software configured to provide two-way voice and/or data communication with the primary instance of the artificial intelligence software, wherein the secondary instance of the artificial intelligence software in each user-wearable radio unit is configured to: analyze the user status for a life-threatening condition of the user comprising a heart arrhythmia condition, a respiratory failure condition, a heart failure condition, a loss of consciousness condition, or a combination thereof; and if a life-threatening condition of the user is determined, communicate the life-threatening condition of the user as an emergency condition to the primary instance of the artificial intelligence software in the base station radio unit, wherein the primary instance of the artificial intelligence software is configured to communicate the life-threatening condition of the user and the emergency condition to the incident command instance of the artificial intelligence software in the incident command radio unit.
23. The wireless communication system of Claim 22 wherein the primary instance of the artificial intelligence software in the base station radio unit: determines a group of other user-wearable radio units and respective other users that are in close proximity to the user with the life-threatening condition; and communicates the group of other user-wearable radio units and respective other users to the incident command instance of the artificial intelligence software of the incident command radio unit.
24. The wireless communication system of Claim 21 further comprising an incident command radio unit comprising an incident command instance of the artificial intelligence software configured to provide two-way voice and/or data communication with the primary instance of the artificial intelligence software wherein the primary instance of the artificial intelligence software analyzes the user status and the situational status of a plurality of users and provides a recommendation to the incident command instance of the artificial intelligence software to deploy or redeploy the users to predetermined user locations.
PCT/US2021/055872 2020-10-20 2021-10-20 First responder radio communications system using multi-frequency radios voice operated by a distributed artificial intelligence system virtual personal assistant WO2022119659A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063093954P 2020-10-20 2020-10-20
US63/093,954 2020-10-20

Publications (3)

Publication Number Publication Date
WO2022119659A2 true WO2022119659A2 (en) 2022-06-09
WO2022119659A9 WO2022119659A9 (en) 2022-07-07
WO2022119659A3 WO2022119659A3 (en) 2022-08-18

Family

ID=81854902

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/055872 WO2022119659A2 (en) 2020-10-20 2021-10-20 First responder radio communications system using multi-frequency radios voice operated by a distributed artificial intelligence system virtual personal assistant

Country Status (1)

Country Link
WO (1) WO2022119659A2 (en)

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6934632B2 (en) * 2003-10-08 2005-08-23 Navcom Technology, Inc. Method for using three GPS frequencies to resolve carrier-phase integer ambiguities
US8751151B2 (en) * 2012-06-12 2014-06-10 Trx Systems, Inc. System and method for localizing a trackee at a location and mapping the location using inertial sensor information
US9172747B2 (en) * 2013-02-25 2015-10-27 Artificial Solutions Iberia SL System and methods for virtual assistant networks
US20160324442A1 (en) * 2015-05-08 2016-11-10 Proteus Digital Health, Inc. Loose wearable receiver systems
US10185513B1 (en) * 2015-06-05 2019-01-22 Life365, Inc. Device configured for dynamic software change
AU2017355691A1 (en) * 2016-11-07 2019-05-02 Whelen Engineering Company, Inc. Network and connected devices for emergency response and roadside operations
US11072405B2 (en) * 2017-11-01 2021-07-27 Tampa Deep-Sea X-Plorers Llc Autonomous underwater survey apparatus and system
US20190208051A1 (en) * 2017-12-29 2019-07-04 Motorola Mobility Llc Context detection with accelerated ai training and adaptive device engagement
US10820181B2 (en) * 2018-02-09 2020-10-27 Rapidsos, Inc. Emergency location analysis system
US10631129B1 (en) * 2018-10-01 2020-04-21 International Business Machines Corporation First responder feedback-based emergency response floor identification
WO2020085924A1 (en) * 2018-10-26 2020-04-30 Motorola Solutions, Inc Device, system and method for modifying actions associated with an emergency call
US20200264303A1 (en) * 2019-02-19 2020-08-20 Mayo Foundation For Medical Education And Research Systems, methods, and media for determining a three dimensional location of an object associated with a person at risk of falling down

Also Published As

Publication number Publication date
WO2022119659A9 (en) 2022-07-07
WO2022119659A3 (en) 2022-08-18

Similar Documents

Publication Publication Date Title
US10796396B2 (en) Emergency response augmented reality-based notification
US10070260B1 (en) Positioning information application system, gateway device, and wearable device
US11232702B2 (en) Automated sensing of firefighter teams
US7245216B2 (en) First responder communications system
US10028104B2 (en) System and method for guided emergency exit
CN108123728B (en) Cloud pipe end mode-based emergency rescue system and use method thereof
US20070103292A1 (en) Incident control system with multi-dimensional display
US20110130636A1 (en) Systems, Methods and Devices for the Rapid Assessment and Deployment of Appropriate Modular Aid Solutions in Response to Disasters.
US20180228448A1 (en) Information processing system, information processing device, terminal device, and information processing method
CN105547281A (en) Method and device for determination of rescue path
CN107708062A (en) A kind of fire evacuation system and method based on indoor positioning
CN109714747A (en) Intelligent lifesaving system based on NFC and LoRa communication and unmanned air vehicle technique
CN104548399A (en) Intelligent evacuation and rescue indicating device and method
CN110971261A (en) Wearable emergency alarm system based on NFC and LoRa communication technology
JP2017021559A (en) Terminal device, management device, radio communication system, and photographic image display method
JP2006237666A (en) Communication system, user wireless terminal and searcher wireless terminal
JP2017033495A (en) Wearable apparatus
KR20160050992A (en) Interaction apparatus for rescue information and method using the same
JP2017038205A (en) Information processing system, information processor, information processing method, and terminal unit
WO2022119659A9 (en) First responder radio communications system using multi-frequency radios voice operated by a distributed artificial intelligence system virtual personal assistant
JP2003010348A (en) Network system for collecting disaster information or the like in underground or in building structure
JP2017131319A (en) Biological sensor device, management device, communication system, and communication method
JP2008015659A (en) Situation detection terminal and situation reporting system using the same
US10841780B1 (en) System and method of automatically evaluating and communicating an emergency situation
CN110840421A (en) SOS mutual rescue system

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21901236

Country of ref document: EP

Kind code of ref document: A2