CA3075015A1 - Emergency alert system - Google Patents

Emergency alert system

Info

Publication number
CA3075015A1
Authority
CA
Canada
Prior art keywords
user
reports
data
user devices
incident
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CA3075015A
Other languages
French (fr)
Inventor
Peter Tanner
Colleen Smith
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Telus Corp
Original Assignee
Telus Communications Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telus Communications Inc filed Critical Telus Communications Inc
Priority to CA3075015A priority Critical patent/CA3075015A1/en
Publication of CA3075015A1 publication Critical patent/CA3075015A1/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 27/00 - Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations
    • G08B 27/005 - Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations with transmission via computer network
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/02 - Alarms for ensuring the safety of persons
    • G08B 21/10 - Alarms for ensuring the safety of persons responsive to calamitous events, e.g. tornados or earthquakes

Abstract

An alert system receives reports from user devices, for example in relation to incidents of unsafe conditions. The reports include location information of the user devices and user input selecting an option of a plurality of options. The options can relate to the user's mood in relation to the incident. A smoothing function is applied to the data and a heatmap displayed at the user devices according to the smoothed data.

Description

EMERGENCY ALERT SYSTEM
TECHNICAL FIELD
[0001] Emergency information.
BACKGROUND
[0002] Current emergency alert systems typically broadcast alerts to a large fixed receiving area. Alerts are often not relevant to a large proportion of the people in the receiving area. This can lead to people ignoring alerts.
[0003] Current emergency alerts typically include limited information. Most information received by the public is via the media, as well as social media.
This information can reach the public via many layers of intermediaries which can result in distortion of the information. It would be desirable to have more direct and more detailed information available to the public.
SUMMARY
[0004] There is provided a method of processing user reported data, the method including receiving at a server reports from a plurality of user devices, each report comprising an indication of a selection of an option from a plurality of options by a respective user of a respective user device, and a location of the respective user device. A
smoothing function is applied to the reports to obtain smoothed data. The smoothing function can be applied at the server and the smoothed data transmitted to each user device, or the reports transmitted to the user devices and the smoothing function applied at the user devices. A heatmap is generated at each user device based on the smoothed data. In an embodiment, the smoothed data may be sent to the user devices as graphical data representing the heatmap, the heatmap generated at each user device by display of the graphical data.
[0005] The following steps A-E may be applied by a plurality of user devices:
[0006] A display an input screen configured to receive user input selecting from a plurality of options;
[0007] B receive user input selecting an option of the plurality of options;
[0008] C transmit to a server location information of the respective user device and an indication of the received user input;
[0009] D receive data from the server, the data relating to the received user input and location information of plural user devices of the plurality of user devices; and
[0010] E display a heatmap based on the received data.
[0011] The heatmap may be displayed based on smoothed data, the received data being smoothed data obtained by applying a smoothing function to the reports, or the reports may be sent to the user devices and the smoothing function applied to the reports at the user devices to obtain the smoothed data.
[0012] These and other aspects of the device and method are set out in the claims.
BRIEF DESCRIPTION OF THE FIGURES
[0013] Embodiments will now be described with reference to the figures, in which like reference characters denote like elements, by way of example, and in which:
[0014] Fig. 1 is a flow chart showing steps of accumulating user data.
[0015] Fig. 2 is a flow chart showing steps of processing and displaying data.
[0016] Fig. 3 is a schematic diagram showing elements that may carry out the steps shown in Fig. 1 and Fig. 2.
[0017] Fig. 4 is a simplified depiction of a user device showing a reported incident map screen.
[0018] Fig. 5 is a representation of an exemplary user interface showing a reported incident map screen.
[0019] Fig. 6 is a simplified depiction of a user device showing a mood reporting screen.
[0020] Fig. 7 is a simplified depiction of a user device showing a preparation screen.
[0021] Fig. 8 is a representation of an exemplary user interface showing a location monitoring selection screen.
[0022] Fig. 9 is a representation of an exemplary user interface showing an alert screen.
[0023] Fig. 10 is a representation of the exemplary user interface showing the alert screen of Fig. 9, but with a toggle set to "official alerts only".
[0024] Fig. 11 is a representation of an exemplary user interface showing an alert type filter selection screen.
[0025] Fig. 12 is a representation of an exemplary user interface showing a location alert settings screen.
[0026] Fig. 13 is a representation of an exemplary user interface showing a settings screen.
[0027] Fig. 14 is a representation of an exemplary user interface showing an additional information screen.
[0028] Fig. 15 is a representation of an exemplary user interface showing a reported incidents screen.
DETAILED DESCRIPTION
[0029] Steps of accumulating data in an exemplary method of accumulating and displaying user data are shown in Fig. 1. The steps shown in Fig. 1 may be implemented on a user device, according to software instructions downloaded to the user device or pre-installed on the user device. The user device may be a mobile user device such as a cellphone, tablet, laptop or a computer installed in a vehicle. In an example, the software instructions may be implemented as an app downloaded by or pre-installed on the user device. In another example, the software instructions may be incorporated into a webpage downloaded by the user device and implemented by a browser of the user device.
A
download from a server by a user device is a transmission by the server to the user device.
[0030] In optional step 10 of Fig. 1, input is received from a user indicating that the user has observed an incident of an unsafe condition. For example, the input may be the user clicking on a mood reporting button 112 as described below. In another example, the user may be queried as to whether they are affected by an incident in response to the user entering an area including a known incident, or a known incident appearing in the user's area, and the input may be the user's response to the query in the case that the user responds affirmatively. In another example, the query may be combined with further options for response, e.g. the user is presented with options relating to their mood about an incident in step 16, with an additional option to respond that they are not affected.
This latter example is a case in which optional step 10 may be omitted as a separate step.
[0031] In optional step 12 of Fig. 1, the user device presents an input element or elements allowing the user to select from a plurality of options corresponding to possible incident types. In optional step 14 of Fig. 1, the user device receives input from the user selecting an option of the plurality of options corresponding to incident type. Steps 12 and 14 may be omitted, for example, if the incident type is already known. The incident type will be already known for example if in step 10 the user responded affirmatively to a query as to whether they were affected by a known incident.
[0032] In step 16 of Fig. 1, the user device presents an input element or elements allowing the user to select from a plurality of options relating to the user's reaction to the incident. For example, the options may relate to the user's mood about the incident.
Information on the user's mood will give first responders and the public an idea not only of what is occurring at a location but also of the mood and sense of urgency.
[0033] For example, the mood information, in combination with a selection of an incident type, might convey information amounting to 'I see there's a forest fire but I'm still at a safe distance so I'm not feeling threatened'. This would help let first responders know that while there's something happening, it may not yet be urgent, and they can send resources accordingly. This would also, when the information is incorporated into a map as described below, let the public know to avoid the area. This solution could help offset emergency calls as it's a way for the public to let the emergency services know something is happening even if they don't need immediate help or a response.
This functions dually as a way for the public to report non-emergency/developing situations to first responders and other members of the public. This functionality may reduce 911 calls for non-emergency situations and help increase public confidence and reduce panic.
[0034] In step 18 of Fig. 1, the user device receives input from the user selecting an option of the plurality of options relating to the user's reaction to the incident.
[0035] In optional step 20 of Fig. 1, the user device transmits to a server an indication of the selected option of the plurality of options corresponding to incident type. This step would be omitted in the event that steps 12 and 14 are omitted.
[0036] In step 22 of Fig. 1, the user device transmits to the server an indication of the selected option of the plurality of options corresponding to the user's reaction to the incident. In step 24 of Fig. 1, the user device transmits location information of the user device to the server. Typically, the location information will be obtained by the user device using satellite information such as from GPS. In some cases, location information may be obtained by an external source, e.g. a cellular network, without the user device transmitting location information. In such a case, the external source may transmit the location information to the server and step 24 may be omitted. Steps 20, 22 and 24 may be implemented using multiple transmissions or combined into a single transmission.
Regardless of whether they are transmitted in multiple transmissions or a single transmission, the data collected by the user device and transmitted to the server in whichever of these steps 20, 22 and 24 is present, is associated together and collectively referred to in this document as a "report".
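By way of illustration only, the following sketch shows how steps 20, 22 and 24 might be combined into a single transmission forming a "report". The field names, payload structure and endpoint are assumptions made for the sketch and are not taken from this description.

```python
import json
import time
import urllib.request

def submit_report(server_url, incident_type, mood_option, latitude, longitude):
    """Hypothetical single-transmission report combining steps 20, 22 and 24.
    All field names and the endpoint URL are illustrative assumptions."""
    report = {
        "incident_type": incident_type,                   # step 20: selected incident type
        "mood_option": mood_option,                       # step 22: selected reaction/mood option
        "location": {"lat": latitude, "lon": longitude},  # step 24: device location
        "timestamp": time.time(),
    }
    request = urllib.request.Request(
        server_url,
        data=json.dumps(report).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```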
[0037] Fig. 2 shows steps of processing and displaying data, such as data accumulated in Fig. 1. In step 30 of Fig. 2, a server receives reports from user devices.
The reports may be reports generated for example by the method steps illustrated in Fig.
1. In step 32 of Fig. 2, the server aggregates reports into events based on proximity. An event is a data record that represents a collection of reports that are aggregated together at least for display purposes. Optionally an event may also be created manually without a report, such as a government issued alert in respect of an area. Such a display is described below in relation to step 40 of Fig. 2. Typically, only reports of the same incident type will be aggregated into an event. The incident type may be obtained for example as part of the user input used to generate the report, or by the user device detecting that the user has entered an area having previous reports of an incident type and querying the user as to whether the user is affected by that incident type. The process for generating an event and assigning reports to an event may be designed so that all reports corresponding to an event will also typically correspond to the same incident. The converse, that all reports corresponding to the same incident will correspond to the same event, will often not be the case, as it is useful to distinguish between the effects of an incident at different locations, and thus, assign reports at sufficiently distant locations to different events even if they most likely correspond to the same incident. In an embodiment, each event may be mutually exclusive in that each report can only be assigned to one event at any one time.

In another example, reports may belong to multiple events. For example, events may also be defined at different levels of granularity. For example, reports may be assigned to events of a first set of events for display in a zoomed in map, and the same reports may also be assigned to events of a second set of fewer events for display in a less zoomed in map; further levels may also be added. These levels may also be displayed at the same time, e.g. by representing events of the second set with large icons and events of the first set with smaller icons. In an embodiment where the user device generates events from report data sent to the user device, the events may be generated dynamically based on the zoom level of the map, so that reports are aggregated at a greater distance the more zoomed out the map is. First responder information may be entered in relation to a report and linked to an event at the user device when the report including the first-responder-entered information is aggregated into the event. An example of how events may be displayed using icons is described below in relation to Fig 4.
[0038] In an example, each report is associated with a defined radius, and reports are aggregated into events based on overlap of these radii for reports of the same incident type. For example, where each report is associated with a contracting circle as described below, an event may be defined as a set of reports with contiguously overlapping areas or that were contiguously overlapping prior to contracting. When a report of a given incident type has an initial radius that does not overlap with an existing event of that incident type, the report may be classified as a new event of that incident type. If the report does overlap with an existing event of that incident type, the report and the area within the radius may be aggregated into that event. The event may be removed when the areas associated with the reports of the event disappear. Plural events may be merged if additional reports cause them to overlap. In another embodiment, each event may be assigned an event position and an event radius around the event position. Each report that is not within an existing event radius may be classified as a new event.
In further embodiments in which there is an event position and radius, the event position may correspond to the first report of the event, or the event position may dynamically be repositioned according to an average of the positions of the reports of the event. Report positions may be offset from the location of the user device on submitting the report, according to a location offset described below.
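A minimal sketch of the radius-overlap aggregation described above, assuming a fixed report radius and reports carried as dictionaries with latitude, longitude and incident type; events whose areas become contiguous are merged. This is an illustration only, not the claimed implementation.

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    # Haversine distance in metres between two lat/lon points.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def aggregate_into_events(reports, report_radius_m=500.0):
    """Group reports of the same incident type into events when their report
    radii overlap; overlapping events are merged into one."""
    events = []  # each event: {"incident_type": ..., "reports": [...]}
    for report in reports:
        matches = [
            e for e in events
            if e["incident_type"] == report["incident_type"]
            and any(
                distance_m(report["lat"], report["lon"], r["lat"], r["lon"]) <= 2 * report_radius_m
                for r in e["reports"]
            )
        ]
        if not matches:
            # No overlap with an existing event of this type: start a new event.
            events.append({"incident_type": report["incident_type"], "reports": [report]})
        else:
            # Merge all overlapping events into one and add the new report.
            merged = matches[0]
            for other in matches[1:]:
                merged["reports"].extend(other["reports"])
                events.remove(other)
            merged["reports"].append(report)
    return events
```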
[0039] In an embodiment, the event may be displayed only when it includes a number of reports exceeding a threshold. For example, the threshold may be 5, 10 or 20.
Events below the threshold may still be visible to certain users, for example first responders. Dispatch centers may receive notifications when an event reaches one or more thresholds. The one or more thresholds may be the same as or different from the threshold for display of the event to regular users. Individual reports may also be visible to first responders. Optionally, additional verification may be needed for submitting reports that will not be associated with an event that is already displayed.
For example, submitters of such reports may be required to submit a picture or short video, which may be verified manually or using Artificial Intelligence/Machine Learning (AI/ML).
[0040] Users may be presented with an option to verify an event when clicking on an icon representing it. In an embodiment, events that have been verified by trusted users such as first responders are displayed differently, such as with a "verified"
badge 110 as described below in relation to Fig. 4. Reports by first responders may also generate events as a single report, ignoring the threshold for the number of reports to display an event. Where other reports or events are present, a first responder report may be included in an event with non-first responder reports in the same way as other non-first responder reports. In an embodiment, the verification provided by a first responder applies to all events that include reports in a verification radius, for example 1000 m (1 km) from the point at which the first responder entered the verification, or in relation to which the first responder entered the verification. The above may apply to verification provided by a first responder verifying a pre-existing event, verification provided by a first responder entering a report, or both.
[0041] In step 34 of Fig. 2, the server performs smoothing on values representing users' selections from options relating to the users' reactions to generate smoothed data.
Typically, this step separately considers reports corresponding to each incident type for which there are reports to generate separate smoothed data for each incident type. Overall smoothed data for all reports may also be generated.
[0042] The smoothed data may be generated based on selections from predefined options of users reporting incidents, for example the selection of four options shown in Fig. 6 and described below. In an embodiment, the options have an ordering such that each option can be assigned a numerical score so that different selections can be averaged to produce a numerical score corresponding to or in between the numerical scores of the different options. In one example, the options relate to severity of mood, e.g. the following four options: "There's a problem, but I'm okay"; "There's a problem, and I'm nervous"; "I'm very worried, but can stay here"; and "I need to leave this area now".
Where options are assigned numerical scores to which the weighted averaging or other smoothing function is applied, the different options may be assigned equidistant numerical scores such as 1, 2, 3, 4, or may be assigned non-equidistant numbers, e.g. 1, 2, 4, 10. The numbers also do not have to be whole numbers. A value for any geographical location may be determined using a smoothing function applied to the numerical scores of recent nearby reports. In an example, the smoothing function is a weighted average of the numerical scores, in which at any given location the weight given to a numerical score for a report decays based on time elapsed since the report and horizontal distance of the given location from the report location. For example, a maximum time decay period of 24 hours and an initial radius of 500m from the center point may be used.
All numbers given here are examples only and may be adjusted. As the 24 hours counts down to 0 hours, the 500 m radius at which the weight reaches 0 may contract down to 0 m.
For example, the following formula may be used to obtain weights: weight = max{0, 1 - (distance from report location)/(initial radius) - (time ago that the report occurred)/(maximum time decay period)}. The final values obtained from applying the weighted average or other smoothing function to the input reports may then be depicted using the heatmap, with numerical scores converted into colour, shading or other heatmap output.
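As a sketch only, the example weight formula and weighted average above might be implemented as follows, using the example values of a 500 m initial radius and a 24-hour maximum decay period; the function and argument names are assumptions.

```python
def report_weight(distance_m, age_hours, initial_radius_m=500.0, max_decay_hours=24.0):
    # weight = max{0, 1 - distance/(initial radius) - elapsed time/(maximum decay period)}
    return max(0.0, 1.0 - distance_m / initial_radius_m - age_hours / max_decay_hours)

def smoothed_score(nearby_reports, now_hours):
    """Weighted average of numerical mood scores (e.g. 1-4) for one map location.
    Each report dict carries its horizontal distance to that location, its
    submission time in hours, and its numerical score."""
    total, weight_sum = 0.0, 0.0
    for r in nearby_reports:
        w = report_weight(r["distance_m"], now_hours - r["time_hours"])
        total += w * r["score"]
        weight_sum += w
    return total / weight_sum if weight_sum > 0 else None  # None: no recent nearby reports
```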
[0043] In an embodiment, if a user submits multiple reports within a time threshold, the subsequent reports overwrite the previous reports and reset a timer for the time threshold. Multiple reports separated by a time exceeding the time threshold may be included separately. In an example, the time threshold may be 15 minutes. In an example, the subsequent reports may only overwrite the previous reports in the event that the reports have some degree of similarity, for example corresponding to the same incident type or event.
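The overwrite rule of paragraph [0043] could look roughly like the following, assuming each report carries a user identifier, an incident type and a timestamp; the 15-minute value is the example threshold given above.

```python
def add_report(counted_reports, new_report, threshold_s=15 * 60):
    """counted_reports: list of reports currently included in the smoothed data.
    If the same user submits again within the threshold for the same incident
    type, the new report overwrites the old one and resets the timer;
    otherwise both reports are counted separately."""
    for i, old in enumerate(counted_reports):
        if (old["user_id"] == new_report["user_id"]
                and old["incident_type"] == new_report["incident_type"]
                and new_report["timestamp"] - old["timestamp"] < threshold_s):
            counted_reports[i] = new_report  # overwrite and reset the timer
            return counted_reports
    counted_reports.append(new_report)
    return counted_reports
```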
[0044] In an embodiment, a report radius used for the smoothing may also be used to determine overlap between reports to classify reports into events as described above. In another embodiment, different radii may be used.
[0045] In an embodiment, the same weighted averaging formula or other smoothing function is applied to all locations and to all report types.
However, the proximity weighting or other smoothing function may alternatively take into account other factors. Examples of other factors include location, population density, report density, variance of reports in an area, and incident type.
[0046] Where there are few reports, smoothed data may be omitted, or not included in a heat map displayed in step 38 of Fig. 2. For example, reports may be counted in the smoothed data to be displayed in the heat map only when aggregated into an event with enough reports for the event to be displayed.
[0047] Although described as occurring on a server, the smoothing may also occur on the user device using, e.g. anonymized reports sent from the server.
[0048] Optionally, additional information may be taken into account when generating the smoothed data, for example how quickly a user responds with their mood when the user is prompted to do so on entering a location including an existing event.
[0049] In step 36 of Fig. 2, the server sends the information on events and the smoothed data to user devices. Where the smoothed data is generated at the user device, the reports may be sent to user devices in this step, and the smoothed data may be generated at the user device in step 34 subsequently. In a further embodiment, the aggregation of reports into events in step 32 may also occur at the user device subsequent to the reports being sent to user devices in step 36.
[0050] In step 38 of Fig. 2, a heat map is displayed at a user device, the heat map representing the smoothed data generated in step 34 of Fig. 2. The heat map may be overlaid on a conventional map. In step 40 of Fig. 2, the events generated in step 32 of Fig. 2 are displayed on the heatmap, for example using icons overlaying the heatmap.
[0051] In step 42 of Fig. 2, notifications are sent to users that have submitted reports. Notifications may include, for example, updates on user reported moods corresponding to the event, or information on the corresponding incident obtained from, for example, first responders. In an embodiment, users who have submitted reports indicating a more nervous mood may be sent more notifications than users who have submitted reports indicating a less nervous mood.
[0052] Notifications can include, for example, notifications sent automatically to users on sending reports and notifications sent by official outlets (police, Environment Canada, etc). A notification may be sent immediately on the user sending a report. The content of the notification may be based on information in the report. For example, it may include information associated with the event, if any, into which the report is aggregated.
Examples can include:
[0053] 'Thank you for reporting this event. Please note that emergency response is aware and has been dispatched to help. Approximate time to arrival is x.'
[0054] 'Thank you for reporting this event. Please find a safe place to shelter or evacuate the area if you are uncomfortable, and able to. If you require immediate emergency response, call 911 now.'
[0055] 'Thank you for reporting this event. It has been logged in the system and further information will be sent to you as it becomes available.'
[0056] A notification from an official outlet can include, for example, further generic instructions to help ensure safety, or a message to users who have sent in the reports to give them more information on the situation.
[0057] Frequency of notifications may be adjusted to users with different severity of mood. For example, automatic notifications may be sent at a frequency depending on severity; each automatic notification may include updated information from a database including automatically collected or manually entered information on the event. In another example, notifications, either automatic or manually sent, may be filtered and users who indicated a greater severity may be by default given a less strict filter for notifications that they receive. The filter may be applied at the sending location or at the user device.
Notifications may also be sent to users based on any other characteristics of their reports.
Official sources may be given the option to select an event and give a notification to all users who sent reports that are aggregated into that event.
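Purely as an illustration of adjusting automatic notification frequency to reported severity, the mapping below pairs the four example mood options with notification intervals; the interval values are invented for the sketch and are not taken from this description.

```python
# Illustrative only: interval values are assumptions, not part of this description.
NOTIFICATION_INTERVAL_MINUTES = {
    1: 180,  # "There's a problem, but I'm okay"
    2: 120,  # "There's a problem, and I'm nervous"
    3: 60,   # "I'm very worried, but can stay here"
    4: 15,   # "I need to leave this area now"
}

def notification_due(minutes_since_last_notification, severity_score):
    """Return True when an automatic update should be sent to this user."""
    interval = NOTIFICATION_INTERVAL_MINUTES.get(severity_score, 180)
    return minutes_since_last_notification >= interval
```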
[0058] Staff and official viewers, e.g. at a first responder command centre, may be provided with additional views of the data in the system in addition to the views provided to regular users. Additional options may be provided in the additional views and/or additional options may be provided with respect to views provided to regular users. For example, a map with a point cloud display may be provided to allow the official viewers to zoom in to see individual reports. On selecting an individual report, a viewer may be provided with information on the report such as how long the reporting user took to fill out the report, any attachments such as a picture or short video, etc. A
map with a point cloud may be accompanied by a sortable list of the reports in the view area of the map. The additional views, along with conventional views, may be supplied via a web portal. The additional options may also allow the first responders to change the event definitions. For example, there may be an option to define an area on the map within which all reports meeting selected criteria are aggregated in one event, or within which all reports are aggregated only into events considered to be within that area, or within which all events can only aggregate reports from within the area, or any combination of these. Also, an option may be provided to combine multiple events into a single event manually.
[0059] Fig. 3 is a schematic diagram showing system components of an exemplary system for carrying out the methods disclosed here. Cell phone 50 and laptop 52 are examples of user devices. User devices 50 and 52 connect to network 54, in this example using wireless communications 56. Network 54 may be the internet or other network. Server 58 connects to network 54 using wired or wireless link 60. The word "server" in this document refers to any single computer or collection of computers that acts as a server, for example, a single computer, a collection of computers, or a virtual server implemented on one or many computers. The user devices 50 and 52 may carry out steps of the methods disclosed here by following instructions on the user devices, which may be downloaded or pre-loaded software, or e.g. instructions included in a webpage and implemented in a browser. Instructions may be downloaded from the same or a different server than server 58.
[0060] In this document, the word "click" is used to refer to a selection/pointing user interface event and can include clicking with a mouse, touch on a touchscreen, eye tracking, etc. It can also include non-pointing modes of entry such as keyboard entry or voice when used as a substitute for pointing/selection input. For the app described below in relation to Figs. 4-15, typically "click" would refer to a touch input on a cellphone.
Different modes of entry may be used on different user devices.
[0061] Fig. 4 shows a reported incident map screen 102 presented on a user device 100. The reported incident map screen 102 includes a map 104 which displays events 106A-106C in an area. The area shown may be for example an area that the user has chosen to follow or for example a local area around the user's current location.

Multiple maps 104 may also be shown simultaneously using a split screen. The map shows a heat map drawn using smoothed data as described above in relation to Fig. 2.
The heat mapping may be an overall heat map based on aggregation of all reports or may be based on specific events or incident types, for example, only reports associated with the particular events 106A-106C shown on the map 104 or only reports corresponding to incident types for which there are corresponding events shown on the map 104.
[0062] The events 106A-106C may each be represented on the map by a respective icon. These icons may be clicked on to gain further information about the event, for example information on the event location, number of reports, type of incident to which the event corresponds, etc. The icons may represent the type of incident, for example using simple pictorial representations as shown in Fig. 5. Clicking on an incident may also provide the user an option to submit information regarding the incident, for example to supply a comment on the incident or to verify the incident as further described below.
[0063] Each of the events 106A-106C may correspond to a single report, for example when reported by a trusted user such as a first responder, but more typically represent an event aggregating multiple reports, as described above in relation to Fig. 2.
[0064] The map 104 shows the heatmapping schematically using contour lines 108. The events 106A-106C are displayed in the map 104 using icons that are shown in Fig. 4 as being positioned on contour lines 108, but this is coincidental.
Also, the heatmap need not use contour lines 108 but may use any other form of heatmapping, such as shading or colour.
[0065] The area shown in the heatmap may be adjusted in any conventional manner, for example using pinch/zoom to move in and out, and dragging a finger to move around.
[0066] The heatmap may be accompanied by a legend 126, as shown in Fig. 5, relating heat map colour or shading to particular options of the plurality of options selected from by the users in the reports. The heatmapping may show a continuous gradient of colour or shading. In another embodiment, numerical values obtained by smoothing for each point on the map may be binned into ranges, for example one range corresponding to each of the options provided, and assigned a colour, shading or separated by a contour line accordingly. In a further embodiment, there may be more binned ranges than the options provided.
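One possible way to bin smoothed scores into ranges corresponding to the four example options, as this paragraph describes, is sketched below; the bin edges and colours are assumptions made for the sketch.

```python
# Illustrative binning of smoothed scores (1-4 scale) into heatmap colours.
# Bin edges and colour values are assumptions only.
HEATMAP_BINS = [
    (1.75, "#ffe08a"),          # roughly "There's a problem, but I'm okay"
    (2.50, "#ffb347"),          # roughly "There's a problem, and I'm nervous"
    (3.25, "#ff7f50"),          # roughly "I'm very worried, but can stay here"
    (float("inf"), "#e03131"),  # roughly "I need to leave this area now"
]

def heatmap_colour(smoothed_score):
    if smoothed_score is None:
        return None  # too few reports: omit this point from the heatmap
    for upper_edge, colour in HEATMAP_BINS:
        if smoothed_score <= upper_edge:
            return colour
    return HEATMAP_BINS[-1][1]
```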
[0067] The map may include an indication of what events are populated using crowd-sourced data and what events have been confirmed by official outlets.
For example, in Fig. 4, incident 106A is shown alongside a "verified" badge 110.
This "verified" badge 110 can be an icon, checkmark or timestamp that indicates that a first responder has verified the event, meaning that they have been or are currently on scene and have validated a public reporting date within the app. This information may also be represented in other ways such as by using a different icon to represent the incident. The first responder may be presented with an option to verify the incident on clicking on the incident on the reported incident map screen 102. In an embodiment, all users are presented with this option, but verifications by regular users may be treated differently than verifications by first responders. For example, a total number of verifications by regular users may be recorded and displayed, but not lead to display of a "verified" badge 110. In an embodiment, first responders may be presented with additional options to enter information on the incident. This additional information may include quantifiable input relevant to the incident type, and comments. This additional information may be displayed to users on clicking on the incident icon. This information may be displayed in addition to, or replacing, information displayed before this verification, which may include mood data. First responders may also be presented with the opportunity to enter verified information by sending a report using mood reporting button 112 or to send information to superiors or other first responders via "check in" button 194 shown in Fig.
8. Ordinary users may also be presented with the option to submit a comment on an incident. For example, user comments may be automatically entered into a forum thread that can be accessed via a screen relating to the incident. The "verified"
badge 110 may be, or may be associated with, a timestamp indicating the time of the verification. Such a timestamp may also, or alternatively, be presented to the user on clicking on the incident on the reported incident map screen 102. A first responder may also reject an event as a false alarm, clearing the event and invalidating the report(s) that it had included. This rejection option may be provided as an additional option available to the first responder on clicking on the icon.
[0068] Information added by first responders may be compared with mood data to obtain a model of the public's perception v. actual threat. In addition to first responder information, further information added by authorities after the fact may be used for this purpose. This model may then be applied to future events of a similar profile to better prepare information release timing and quality.
[0069] The map 104 may also mark on the map, or otherwise show information on, the positions of other users (not shown in the figures) that have opted to share information with the user, e.g. family members. If applicable, an indication may be provided about where the other users have checked in using "check in" button 194 shown in Fig. 8. Also, if applicable, an indication may be provided as to a location at which any reports from the other users may have been made. Other information from the reports, such as mood, may also be shown.
[0070] In addition to the map 104, reported incident map screen 102 in Fig. 4 may also include additional information or input options. Also shown in screen 102 in Fig. 4 is a mood reporting button 112 to allow the user to report their mood on the conditions they are currently experiencing. This mood reporting button 112 may bring up a data entry screen such as the mood reporting screen 130 shown in Fig. 6 and described below. This button 112 may optionally be provided on every screen in the app, so the user can find it easily from any point. Also, though in Fig. 4 the button 112 is shown in a different place than its counterparts in Figs. 6 and 8, in an embodiment it is shown consistently in the same location in every screen of the app.
[0071] Fig. 4 also shows an additional input area 114 that allows the user to select from a list of incident types. This may be used for example to report the occurrence of an incident of a particular type. For the convenience of the user, the incident types may be divided into categories, for example natural and human-caused incidents. Here and wherever a list of incident types is presented to select from, the incident types shown may be displayed based on the user's location so that only incident types that may plausibly occur at that location and time are shown. For example, an avalanche option might be shown only in areas and times of the year where snow may plausibly be present.
Incident types for which an event is occurring in the general area or city, or known to be happening and confirmed by emergency responders, may be for example, presented to the user first or highlighted; or such an incident type may be presented as a pre-selected option with the opportunity to confirm. The user may also be presented with an option, e.g. "more", which when clicked on may lead to additional options being presented, and may be presented with an option, e.g. "other", leading to an input screen to allow the users to enter a custom incident type using text input. Upon clicking on an incident type, the user may be presented with mood reporting screen 130 shown in Fig. 6.
Also, upon activating the mood reporting button 112, the list of incident types 116 may be shown initially as a pop up as shown in Fig. 5, and the mood reporting screen 130 of Fig. 6 may be shown after an incident type is selected. The user may also be presented with an option (not shown) to enter a location offset, indicating where the event is in relation to their own location. The user's own location is automatically collected using location tracking of the user device. The location offset, if it includes a distance, may be used for example by first responders to help determine urgency. The location offset may also be used, if it includes a direction, for the purpose of event aggregation. For example, a report may be counted for event aggregation purposes as having occurred at a location adjusted according to the location offset from the user's location. A location offset can also be inferred from a picture, where a picture is supplied with the report. The location offset may be inferred using AI/ML or manually, and may include distance (e.g.
inferred from size or perspective cues), direction (e.g. inferred from lighting combined with time information, or from orientation information collected by the device at the time of the picture and submitted with the report) or both.
[0072] Fig. 5 shows a reported incident map screen 102 including the additional input area 114 with list 116 of incident types as a pop up, for example on clicking the mood reporting button 112 shown in Fig. 4. The list 116 may represent incident types using icons 118. The icons 118 may be visually similar to the icons used in the map 104 to represent incidents. The map 104 in this example is not currently showing any incidents. If there is an incident in the user's area, the pop up may indicate that the user appears to be in an area impacted by an incident type (for example flooding) and ask the user to indicate whether they are affected or not, and/or whether the user is instead affected by something else. If the user indicates that they are affected by something else, a list 116 of incident types may be shown for the user to choose from. The user indicating that the user is affected may lead to the mood reporting screen 130 of Fig. 6 being shown.
The user indicating that the user is not affected may lead to the previous screen being shown. The user indicating that they are affected by something else may lead to the user being presented with a list of incident types (not shown) to select from.
[0073] As shown in Fig. 5, the reported incident map screen may also include links 120 to other screens, here shown in the form of a side-scrolling tab list 122 above map 104 that also includes a header for the current screen as underlined item 124. In the embodiment shown, this tab navigation bar 122 switches between different screens that pertain to a single area, here "Work".
[0074] Reported incident map screen 102 may also include information (not shown in the figures) showing the number of reports submitted that were used to create the heat map 104 and the time since the last report was added. For example, a text display on or adjacent to the map 104 saying, e.g. "3.5k reports - Last report submitted: 8 minutes ago". This data may be presented in respect of the area defined by the visible map area at a particular time, or in respect of a particular area shown in the map which the user may select for example by clicking on it.
[0075] Fig. 6 shows a mood reporting screen 130. This mood reporting screen 130 may for example be activated when a user clicks on the mood reporting button 112 shown on the reported incident map screen 102 in Fig. 4. When clicking on mood reporting button 112, a user may first be presented with an additional input screen or popup, such as additional input area 114 shown in Fig. 5, to allow the user to select an incident type.
[0076] The mood reporting screen 130 may include an event information display 132 which shows current information about an event, for example entered by first responders or including information based on user reports aggregated into the event. The event for which information is shown may be, for example, an event that is occurring at a location at or near the user's current location, or an event that the user's report will likely be included in based on an incident type the user has selected using input area 114 in Fig.
5. The user may also be presented with a button (not shown) to change the incident type, e.g. by bringing up additional input area 114 as a popup. Alternatively, the incident type selection may be integrated into mood reporting screen 130.
[0077] The mood reporting screen 130 of this app includes means to report the user's current state of mind about the incident. The mood may be represented using a customizable selection of discrete mood-indicating options for gaining mood-based feedback on the severity of the incident the user is currently witnessing.
[0078] For example, the mood reporting screen 130 may present a plurality of buttons 134 representing the discrete mood-indicating options. The mood reporting screen can also or alternatively present a mood slider 136 to allow the user to represent their mood about the incident. The slider 136 may be configured to allow the user to select from the same discrete selection of mood-indicating options as represented by the buttons 134. If both buttons and slider are present, motion of the slider may lead to the button corresponding to the user's current selected option being highlighted, and clicking a button may result in the slider being moved to the corresponding option.
Where the slider 136 is present and buttons 134 are not, motion of the slider may lead to the display of an icon corresponding to the user's current selected option.
[0079] The mood reporting screen 130 may also include a comment button 138 to bring up or bring focus to a text entry field (not shown) to allow the user to provide a comment. The mood reporting screen 130 may also include a picture button 140 to allow the user to take a picture showing the incident being reported. Wherever a picture is mentioned in this document, alternatively a short video could be used.
[0080] User comments and pictures may be analyzed by backend machine learning to conduct picture analysis and/or additional sentiment analysis, further increasing the accuracy of the display. Comments and pictures may also be made visible to first responders, and/or to all users clicking on an incident icon on reported incident map screen 102. Pictures may also be used to verify the accuracy of the report. For example, in an embodiment, all reports that will not be aggregated into a displayed event must be submitted with a picture and verified by a human or a machine until enough reports have been submitted and verified for the event to be displayed.
[0081] The mood reporting screen 130 may also include a submission button 142 to allow the user to submit the user's mood selection and other information.
Optionally, clicking this button may lead the app to present a preparation screen 150 as shown in Fig.
7.
[0082] Fig. 7 shows a preparation screen 150. This screen is shown in Fig. 7 as taking the form of a popup over the reported incident map screen 102. Each screen described in this document may be implemented for example as a standalone screen, tab or as a popup over another screen. The preparation screen 150 is shown as including checklists 152 of things that a user can do to prepare for an event. Such a checklist may include, for example, lists of items for a user to collect before evacuating if an evacuation is scheduled but the user has time to collect items. The user may be presented with multiple checklists. For example, List 1 may be a concise 5-minute warning evacuation items list, and List 2 a more complete 15-minute warning item checklist. In another example, List 1 is a list of suggestions to do in a flood and List 2 is a list of items to collect before evacuation. More or fewer lists may be provided. Optionally, different lists may be provided or not provided based on the circumstances, for example, a user may be presented with a more complete list of items to collect if they have more time to evacuate. A timer may also be presented counting down based on the time limit associated with the checklist. The user may check off items in the list.
Visual or auditory warnings may be provided when the timer counts down past particular thresholds, such as 1 minute.
[0083] The preparation screen 150 may also provide the user with a button 154 that opens the 911 phone dialer on the phone to make it easier for the user to call 911 if the user needs to contact emergency services.
[0084] A checklist may also be presented in the event of certain alert types, such as an evacuation order applicable to the user's location. In such a case, a checklist may be provided with a timer counting down based on a time provided by the authority issuing the evacuation order.
[0085] Fig. 8 shows a location monitoring selection screen 160. The screen may display a list 162 of location cards 164, 166, 168 corresponding to areas for which the user is monitoring emergencies. The list may include a preset "Home" card 164 corresponding to the user's home address and a preset "Work" card 166 corresponding to the user's work address. There may also be additional location cards.
[0086] The location monitoring selection screen 160 may also include a search box 170. The search box 170 may be used to find additional locations to monitor. An option may be provided, for example when the user clicks on the search box, to use the user's current location. Selecting an area via the search box may lead to a reported incident map screen 102 corresponding to the area searched for, with an option presented to add the area to the list 162. The search box or another button (not shown) may also provide the option to show a reported incident map screen 102 corresponding to the user's current area. In an embodiment, this option may be selected automatically when the user is in a location corresponding to a reported incident with severity (as measured e.g. by number of reports, reported moods, or by first responders) beyond a threshold.
[0087] The number of location cards shown may be variable, such that the list increases in size as more locations are added. Each location card 164-168 may include various information relating to the location. For example, a location card may include a user-customizable name 172 of the location. Optionally, the app may prompt the user to enter locations for pre-defined names, for example "Home" or "Work". The location card may include an official name or address 174 for the location, e.g. "City of Toronto, ON"
or "252 Adelaide St. East, Toronto, ON". A location card may show the number of alerts relating to the location in a portion 176 of the location card. Clicking on this portion, or optionally on the whole card, may bring up a list of alerts relating to the location, as shown in Figs. 9-10.
[0088] A location card may also show typical mood 178 of recent reports by users in the location. A location card may also show a list 180 of incident types of events present in the area, or official warnings or evacuations currently active in the area.
[0089] The information displayed on the location cards may be user-customizable, for example via an options menu accessed via an options button 182 on the location monitoring selection screen 160. The options menu accessed by options button 182 may also allow deletion of location cards. A refresh button 184 may also be provided.
[0090] In addition to or alternatively to the search box 170, a watchlist addition button 186 may be provided to add a new location to the list 162. The watchlist addition button 186 may also allow deletion of locations alternatively to or in addition to the options menu accessed by options button 182.
[0091] A reporting button 188 to report conditions may provide the same or similar functionality to reporting button 112 in Fig. 4. Optionally, a corresponding button may be provided on all screens in the same location in order to enable a user to report conditions easily regardless of app navigation. In the embodiment shown, a bottom bar 190 allows navigation between different screens. This bottom bar may be provided in addition to or instead of tab navigation bar 122 in Fig. 5. Bottom bar 190 may allow side scrolling, and an icon 192 representing the current screen is highlighted. In this exemplary bottom bar, options include a check in option 194, which may allow a user to indicate their status to a pre-selected set of other people. The preselected set may include, for example, one or more groups of people (e.g. a work group) which the user has joined and/or a list of friends. Also shown is a preparation option 196 which may for example bring up preparation screen 150 as shown in Fig. 7. A profile/settings option 198 may also be shown. The profile/settings option 198 may bring up settings screen 250 shown in Fig. 13. The profile/settings option 198 may also bring up profile information (not shown) that may include, for example, the user's name and address. In an embodiment, profile information may include a display of scannable information such as a QR Code to enable quick identification of the user, for example at evacuation centers.
[0092] Fig. 9 shows an example alert screen 200 showing a list 202 of alerts for an area. The list may be sorted in any suitable manner; in the example shown it is sorted by time received, with newer alerts on top. In the example shown, previously read alerts are shown below unread alerts. The alerts shown may be for example alerts matching criteria set by the user. An "alert" may include, for example an incident 204, warning 206, status change 208 of an incident or map area (e.g. sentiment change) or other alert (for example an AMBER alert 210) which meets criteria set by the user. Check-ins, as provided for example by clicking on check in option 194 in the bottom bar 190 of location monitoring selection screen 160, may also provide alerts, as shown by check-in alert 212. All, some or no alerts may also generate a notification to the user, depending on settings. In an embodiment, different alerts that generate notifications may result in different types of notification, e.g. more or less noticeable sound or vibration, depending on the settings.
[0093] In the embodiment shown, alert screen 200 includes both bottom bar 190 and tab navigation bar 122. The alert screen 200 may also show filters for the alerts. In this example, a toggle 214 is provided to select official alerts only, set to an off position in this figure. Criteria by which the alerts can be filtered can also include alert type and location. In the embodiment shown, an alert type button 216 brings up an alert type filter selection screen 230, shown in Fig. 11, to select different alert types and a recency button 218 brings up a screen (not shown) to select requirements for recency of alerts. Also shown on this screen is an alert number 220 which here indicates the number of unread alerts meeting the selected criteria. This alert number may in an example be the same as the number of alerts shown in portion 176 of a location card in Fig. 8. A
settings option 222 may bring up a location alert settings screen 240 as shown in Fig. 12.
[0094] Fig. 10 corresponds to Fig. 9 but with the toggle 214 for official alerts only set to on. As can be seen, the alerts shown and the alert number 220 change correspondingly in this example, but the number shown in the tab navigation bar 122 does not. The number of alerts shown in portion 176 of a location card in Fig.
8 may correspond to the number shown in the top navigation bar or to the alert number 220.
[0095] Fig. 11 shows a pop-up alert type filter selection screen 230 that may be used to select alert types to show. In this example, alert types are selected via checkboxes 232 and the screen includes a reset button 234 to set filters to a default setting and an update button 236 to implement changes.
[0096] Fig. 12 shows a location alert settings screen 240 providing alert settings specific to an area, here "Work". A default selection option 242, here set to "off", allows the user to choose whether to use the same settings as shown in their settings screen 250 shown in Fig. 13. In this example, settings are provided for each alert type, and the alert types are divided between official alerts 244 and other alerts 246. In the example shown, weather, natural and civil alerts are shown as "official alerts" only.
Alternatively, they may also be generated from reports via the app, for example when an event accumulates a sufficient number of reports to be displayed in the app, even before official confirmation. The settings 248 for each alert type may include whether an alert type is shown at all, whether push notifications are allowed for it, whether SMS
messages may be provided even when the phone is off, and timing for when notifications may be sent.
Settings can include the form of notification, e.g. text message sound for official alert from authorities, vibrate only for crowd-sourced alerts.
[0097] Fig. 13 shows a settings screen 250. Settings shown in this example screen include notification settings 252, including setting 254 as to whether to receive push notifications, setting 256 as to whether to receive text messages, and/or setting 258 as to whether to receive email notifications. Also shown are default alert settings 260 which apply to all areas for which default selection option 242 in Fig. 12 is set to "on".
Also shown are a list 262 of areas corresponding to areas shown in the watchlist 162 shown in Fig. 8. Selecting a particular area in this list may bring up the location alert settings screen 240 for that area.
[0098] Settings which may be used to filter alerts can include: type of alert, level of alert, distance from user's current location, distance from other locations such as home, work, kids' schools, parents' homes, etc., and alert source.
[0099] Fig. 14 shows an additional information screen 270 that may be for example part of the reported incident map screen 102, obtained by scrolling down in the reported incident map screen 102. This additional information screen 270 includes, in the embodiment shown, a list 272 of icons representing reported conditions within an area (here a radius around an address). Clicking on an icon in the list may link users to a pre-filtered list 292 in reported incidents screen 290 as shown in Fig. 15. The list 272 may scroll horizontally if there are more incident types than can be included in one screen width. Also included in the additional information screen 270 are a sentiment report 274 providing information on sentiment of people within the area, and suggestions 276 here including a customize alerts option 278 to create a customized alert for the situation, a preparation option 280, and a check-in option 282. The additional information screen 270 also here includes an FAQ 284. The preparation option 280 may for example bring up preparation screen 150. The check-in option 282 may for example have the same effect as clicking on check-in option 194 shown in Fig. 8.
[00100] Fig. 15 shows a reported incidents screen 290. Reported incidents screen 290 is here a tab sharing tab navigation bar 122 with reported incident map screen 102 and alert screen 200. The reported incidents screen 290 includes a list 292 of events which is here pre-filtered to show flooding events only, e.g. by the user clicking on a flooding icon. Here, there is shown a toggle 294 for filtering by "verified incidents only", which is depicted as on, but unverified events 296 are also shown in the figure. In the actual app, if this toggle were present and set as depicted, the verified event 298 would be shown and the unverified events 296 omitted.
[00101] Criteria for filtering events in the list 292 can include, for example, type of incident, source (e.g. government issued alerts only), or location.
[00102] Events on this list may be sorted according to a priority value which may depend on, for example, distance from the user, severity, and distance from other users who have opted to share location information with the user, such as family members.
This priority value may be assigned by AI/ML.
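One way such a priority value could be computed, before or instead of an AI/ML model, is a weighted score over the factors listed above. The weights, the linear form and the helper names below are assumptions made for illustration; the specification does not prescribe a particular formula.

```python
import math

def distance_km(a, b):
    """Approximate distance in km between two (lat, lon) points (equirectangular)."""
    lat = math.radians((a[0] + b[0]) / 2)
    dx = math.radians(b[1] - a[1]) * math.cos(lat)
    dy = math.radians(b[0] - a[0])
    return 6371.0 * math.hypot(dx, dy)

def event_priority(event, user_location, shared_locations, weights=(1.0, 2.0, 1.0)):
    """Heuristic priority score: higher means shown earlier in list 292.

    event: dict with "location" = (lat, lon) and "severity" (e.g. 1-5).
    shared_locations: locations of users (e.g. family members) who share location with this user.
    The weighting is an illustrative assumption, not taken from the specification.
    """
    w_near, w_sev, w_family = weights
    d_user = distance_km(event["location"], user_location)
    d_family = min(
        (distance_km(event["location"], loc) for loc in shared_locations),
        default=float("inf"),
    )
    # Closer and more severe events score higher; 1/(1+d) keeps distance terms bounded.
    return (w_near / (1.0 + d_user)
            + w_sev * event["severity"]
            + w_family / (1.0 + d_family))

events = [
    {"id": "flood-1", "location": (53.55, -113.50), "severity": 4},
    {"id": "fire-7", "location": (53.30, -113.90), "severity": 2},
]
events.sort(key=lambda e: event_priority(e, (53.5461, -113.4938), [(53.52, -113.52)]),
            reverse=True)
print([e["id"] for e in events])  # flood-1 first under these assumed weights
```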
[00103] An analogous list (not shown) may be presented to select users such as dispatchers, first responders, administrators, etc. to show all events occurring in an area of interest such as a city. The area of interest may be customizable by the user. The list may be sorted according to a priority value which may depend on, for example, the number of reports, the prevalence of the most worried moods in an area, and proximity to vulnerable areas deemed highest risk and most in need. The priority value may be assigned by AI/ML. Clicking on events on this list may result in the user being presented information on the event. Information can include the current status entered by first responders in respect of events to which resources have been assigned. Events which have not yet been attended to may be shown differently, such as by flashing in red. An option to send communications may be available for "top tier" users such as dispatch centres. When sending communications, an option may be provided to allow sending a second communication after a fixed time; for example, a message may be sent 30 minutes later, such as "Please let us know your current situation, has it improved? Are you feeling better/worse?"
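The deferred second communication described above could be implemented with a simple scheduler on the server side. The sketch below uses a background timer purely for illustration; the delay handling, message text and stand-in send function are assumptions rather than details from the specification.

```python
import threading

def send_message(recipients, text):
    """Stand-in for the real dispatch-centre send path (assumed for this sketch)."""
    print(f"to {len(recipients)} devices: {text}")

def send_with_follow_up(recipients, first_text, follow_up_text, delay_s=30 * 60):
    """Send a communication now and a follow-up after a fixed delay (default 30 minutes)."""
    send_message(recipients, first_text)
    timer = threading.Timer(delay_s, send_message, args=(recipients, follow_up_text))
    timer.daemon = True   # don't block process exit in this illustrative sketch
    timer.start()
    return timer          # caller may cancel() if the event is resolved earlier

send_with_follow_up(
    ["device-1", "device-2"],
    "Flooding reported in your area. Please move to higher ground if you are affected.",
    "Please let us know your current situation, has it improved? Are you feeling better/worse?",
    delay_s=5,            # shortened delay so the example completes quickly
)
```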
[00104] All data may be made available to officials after the incident to debrief and improve future communications. AI/ML decisions may always be checked by staff or dispatchers.
[00105] Immaterial modifications may be made to the embodiments described here without departing from what is covered by the claims.
[00106] In the claims, the word "comprising" is used in its inclusive sense and does not exclude other elements being present. The indefinite articles "a" and "an" before a claim feature do not exclude more than one of the feature being present.
Each one of the individual features described here may be used in one or more embodiments and is not, by virtue only of being described here, to be construed as essential to all embodiments as defined by the claims.

Claims (18)

THE EMBODIMENTS OF THE INVENTION IN WHICH AN EXCLUSIVE
PROPERTY OR PRIVILEGE IS CLAIMED ARE DEFINED AS FOLLOWS:
1. A method of accumulating and displaying user data, comprising:
transmitting instructions to a plurality of user devices to, for each user device, carry out steps A-E:
A. display an input screen configured to receive user input selecting from a plurality of options;
B. receive user input selecting an option of the plurality of options;
C. transmit to a server location information of the respective user device and an indication of the received user input;
D. receive data from the server, the data relating to the received user input and location information of plural user devices of the plurality of user devices; and
E. display a heatmap based on the received data.
2. The method of claim 1 in which the received data at each user device includes anonymized reports, each report including the received user input selecting an option of the plurality of options and the transmitted location information for a respective user device of the plural user devices of the plurality of user devices; the instructions including instructions to apply a smoothing function to the reports to obtain smoothed data, the heatmap being displayed based on the smoothed data.
3. The method of claim 1 in which the received data at each user device includes aggregated data based on the received user input selecting an option of the plurality of options and the transmitted location information of the plural user devices of the plurality of user devices.
4. The method of claim 3 in which the aggregated data is smoothed data obtained by applying a smoothing function to reports, each report including information on the option selected by the received user input and the transmitted location information for a respective user device of the plural user devices of the plurality of user devices.
5. The method of claim 2 or claim 4 in which the received data includes events, each event corresponding to one or more of the reports, the reports aggregated into the events at least in part according to geographical proximity, and the instructions include instructions to visually represent the events on the heatmap.
6. The method of claim 2 in which the instructions include instructions to aggregate the reports into events, at least in part according to geographical proximity, and instructions to visually represent the events on the heatmap.
7. The method of claim 5 in which the received data includes additional information describing an event of the events, and the instructions transmitted to the plurality of user devices including instructions to display the additional information.
8. The method of claim 7 further comprising transmitting further instructions to at least one user device of the plurality of user devices to allow a corresponding user of the at least one user device to enter information regarding the event of the events and to transmit the entered information, the additional information including the entered information.
9. The method of any one of claims 1-8 in which the plurality of options correspond to user moods.
10. The method of claim 9 in which the instructions include an instruction to display the input screen in response to receiving user input indicating an incident of an unsafe condition.
11. The method of claim 9 in which the instructions include an instruction to display the input screen in response to the received data indicating an incident of an unsafe condition.
12. The method of claim 9 in which the instructions include an instruction to display the input screen in response to receiving user input indicating an incident of an unsafe condition, and an instruction to display the input screen in response to the received data indicating the incident of an unsafe condition.
13. The method of any one of claims 10-12 in which the instructions include an instruction to receive a communication concerning the incident of the unsafe condition and to display a notification concerning the communication.
14. The method of claim 13 in which the communication is sent to user devices of the plurality of user devices based in part on the indication of the selected option transmitted in the reports from the respective user devices.
15. The method of claim 13 in which the instructions include an instruction to display the notification based in part on the indication of the selected option transmitted in the reports from the respective user device.
16. A non-transitory computer readable medium having instructions to cause a user device of a plurality of user devices to carry out steps A-E:
A. display an input screen configured to receive user input selecting from a plurality of options;
B. receive user input selecting an option of the plurality of options;
C. transmit to a server location information of the respective user device and an indication of the received user input;
D. receive data from the server, the data relating to received user input and location information of plural user devices of the plurality of user devices; and
E. display a heatmap based on the received data.
17. A method of processing user reported data, the method comprising:
receiving at a server reports from a plurality of user devices, each report comprising an indication of a selection of an option from a plurality of options by a respective user of a respective user device, and a location of the respective user device;
applying a smoothing function to the reports to obtain smoothed data; and
instructing user devices of the plurality of user devices to each display a heatmap according to the smoothed data.
18. A method of processing user reported data, the method comprising:
receiving at a server reports from a plurality of user devices, each report comprising an indication of a selection of an option from a plurality of options by a respective user of a respective user device, and a location of the respective user device;
sending the reports to the plurality of user devices; and
instructing the user devices to:
apply a smoothing function to the reports to obtain smoothed data; and
each display a heatmap according to the smoothed data.
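The smoothing function recited in claims 2, 4, 17 and 18 is not tied to any particular algorithm. As one illustrative possibility only, a Gaussian kernel smoothing of report locations could produce the smoothed grid from which a heatmap is rendered; the grid size, bandwidth and helper names below are assumptions, not features of the claimed methods.

```python
import math

def smooth_reports(reports, bounds, grid=32, bandwidth_km=1.0):
    """Gaussian kernel smoothing of report locations onto a grid (one illustrative choice).

    reports: iterable of dicts with "location" = (lat, lon).
    bounds:  (lat_min, lat_max, lon_min, lon_max) of the map view.
    Returns a grid x grid list of lists of intensities suitable for heatmap rendering.
    """
    lat_min, lat_max, lon_min, lon_max = bounds
    km_per_deg_lat = 111.0
    km_per_deg_lon = 111.0 * math.cos(math.radians((lat_min + lat_max) / 2))
    heat = [[0.0] * grid for _ in range(grid)]
    for i in range(grid):
        for j in range(grid):
            cell_lat = lat_min + (i + 0.5) * (lat_max - lat_min) / grid
            cell_lon = lon_min + (j + 0.5) * (lon_max - lon_min) / grid
            for r in reports:
                lat, lon = r["location"]
                d2 = ((lat - cell_lat) * km_per_deg_lat) ** 2 + ((lon - cell_lon) * km_per_deg_lon) ** 2
                heat[i][j] += math.exp(-d2 / (2.0 * bandwidth_km ** 2))
    return heat

reports = [{"location": (53.546, -113.494)}, {"location": (53.548, -113.492)}]
grid = smooth_reports(reports, bounds=(53.50, 53.60, -113.55, -113.45), grid=16)
print(max(max(row) for row in grid))  # peak intensity near the two clustered reports
```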

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA3075015A CA3075015A1 (en) 2020-03-10 2020-03-10 Emergency alert system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CA3075015A CA3075015A1 (en) 2020-03-10 2020-03-10 Emergency alert system

Publications (1)

Publication Number Publication Date
CA3075015A1 (en) 2021-09-10

Family

ID=77663227

Family Applications (1)

Application Number Title Priority Date Filing Date
CA3075015A Pending CA3075015A1 (en) 2020-03-10 2020-03-10 Emergency alert system

Country Status (1)

Country Link
CA (1) CA3075015A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11633112B2 (en) 2021-03-08 2023-04-25 Medtronic, Inc. Automatic alert control for acute health event

Legal Events

Date Code Title Description
EEER Examination request

Effective date: 20240327
