US20200301658A9 - Dispatch-Based Responder Camera Activation - Google Patents
- Publication number
- US20200301658A9
- Authority
- US
- United States
- Prior art keywords
- responder
- camera
- computing device
- video
- dispatch
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/73—Querying
- G06F16/735—Filtering based on additional data, e.g. user or group profiles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19654—Details concerning communication with a camera
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/223—Execution procedure of a spoken command
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W76/00—Connection management
- H04W76/50—Connection management for emergency connections
Definitions
- a method of activating a responder camera includes a computing device receiving an indication of a location or an event, determining a geographic area associated with the location or event, receiving a dispatch acknowledgement from a responder, and automatically sending a camera activation signal to a responder camera associated with the responder in response to receiving the dispatch acknowledgement from the responder.
- the dispatch acknowledgement indicates that the responder is at the geographic area or that the responder is en route to the geographic area.
- the responder camera is configured to begin capturing a video in response to receiving the camera activation signal.
- the method further includes the computing device sending a dispatch request to the responder prior to receiving the dispatch acknowledgement from the responder.
- the method further includes the computing device receiving the video from the responder camera.
- the video is received from the responder camera after the responder camera has finished capturing the video.
- the video is received from the responder camera as a streaming video.
- at least a portion of the streaming video is received from the responder camera before the responder camera finishes capturing the video.
- the method further includes the computing device storing the video received from the responder camera.
- the dispatch acknowledgement is received from the responder via a responder computing device associated with the responder.
- automatically sending the camera activation signal to the responder camera associated with the responder comprises automatically sending, by the computing device, the camera activation signal to the responder computing device, where the responder computing device is configured to relay the camera activation signal to the responder camera.
- receiving the dispatch acknowledgement from the responder comprises receiving, by the computing device, the dispatch acknowledgement from the responder via a device other than the responder computing device.
- automatically sending the camera activation signal to the responder camera associated with the responder comprises automatically sending, by the computing device, the camera activation signal directly to the responder camera via one or more communication networks.
- the dispatch acknowledgement indicates that the responder is en route to the geographic area, and wherein automatically sending the camera activation signal to the responder camera causes the responder camera to begin capturing the video prior to the responder arriving at the geographic area.
- the method further includes the computing device receiving the video from the responder camera as a streaming video, where at least a portion of the streaming video is received from the responder camera prior to the responder arriving at the geographic area.
- the dispatch acknowledgement includes geospatial information of one or more of the responder camera or a responder computing device associated with the responder.
- the method further includes the computing device sending camera activation signals to a plurality of responder cameras in response to determining that each of a plurality of responders associated with the plurality of responder cameras is at the geographic area or en route to the geographic area.
- the method further includes the computing device receiving a streaming video from each of the plurality of responder cameras and simultaneously displaying at least two streaming videos received from the plurality of responder cameras.
- a non-transitory computer-readable medium has instructions embodied thereon for activating a responder camera, where the instructions, in response to execution by a computing device, cause the computing device to receive an indication of a location or an event, determine a geographic area associated with the location or event, receive a dispatch acknowledgement from a responder, and automatically send a camera activation signal to a responder camera associated with the responder in response to receiving the dispatch acknowledgement from the responder.
- the dispatch acknowledgement indicates that the responder is at the geographic area or that the responder is en route to the geographic area.
- the responder camera is configured to begin capturing a video in response to receiving the camera activation signal.
- a computing device for activating a responder camera includes a processor and a computer-readable medium having instructions embodied thereon. The instructions, in response to execution by the processor, cause the computing device to receive an indication of a location or an event, determine a geographic area associated with the location or event, receive a dispatch acknowledgement from a responder, and automatically send a camera activation signal to a responder camera associated with the responder in response to receiving the dispatch acknowledgement from the responder.
- the dispatch acknowledgement indicates that the responder is at the geographic area or that the responder is en route to the geographic area.
- the responder camera is configured to begin capturing a video in response to receiving the camera activation signal.
- the computing device further includes at least one display device and the instructions, in response to execution by the processor, further cause the computing device to receive the video from the responder camera and display the received video on the at least one display device.
- the computing device further includes at least one memory and the instructions, in response to execution by the processor, further cause the computing device to receive the video from the responder camera and store the received video in the at least one memory.
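The claimed sequence — receive an indication of a location or event, determine an associated geographic area, receive a dispatch acknowledgement, and automatically send a camera activation signal — can be sketched as follows. This is a minimal illustration under assumed names (`DispatchService`, `GeographicArea`, `send_camera_activation` are hypothetical), not the patented implementation; the area is modeled as a simple radius from a location, one of the area types the disclosure mentions.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class GeographicArea:
    # Simplified area type: a radius from a location (hypothetical model).
    lat: float
    lon: float
    radius_m: float

@dataclass
class DispatchService:
    """Hypothetical stand-in for the claimed dispatch computing device."""
    area: Optional[GeographicArea] = None
    activated_cameras: List[str] = field(default_factory=list)

    def receive_indication(self, lat: float, lon: float, radius_m: float = 500.0):
        # Steps 1-2: receive an indication of a location or event and
        # determine the geographic area associated with it.
        self.area = GeographicArea(lat, lon, radius_m)

    def receive_dispatch_acknowledgement(self, responder_id: str, responding: bool):
        # Steps 3-4: on an acknowledgement that the responder is at, or en
        # route to, the area, automatically send the activation signal.
        if self.area is not None and responding:
            self.send_camera_activation(responder_id)

    def send_camera_activation(self, responder_id: str):
        # Stand-in for transmitting the signal to the responder camera,
        # which begins capturing video on receipt.
        self.activated_cameras.append(responder_id)
```

The key property of the claim is visible in the control flow: the activation signal is sent automatically as a side effect of the acknowledgement arriving, with no manual camera-activation step by the responder.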
- FIG. 1 depicts an embodiment of a system for communication between computing devices of responders via a network, in accordance with the embodiments disclosed herein;
- FIG. 2 depicts an embodiment of a system and examples of communication capabilities of a responder computing device, in accordance with the embodiments disclosed herein;
- FIG. 3 depicts an embodiment of a system and examples of communication capabilities of responder devices, in accordance with the embodiments disclosed herein;
- FIG. 4 depicts a system in which a dispatch computing system receives a dispatch acknowledgement from a responder, in accordance with the embodiments disclosed herein;
- FIGS. 5A and 5B depict embodiments of automatically sending a camera activation signal to a responder camera associated with the responder in the system depicted in FIG. 4 , in accordance with the embodiments disclosed herein;
- FIGS. 6A and 6B depict embodiments of sending video information from a responder camera associated with a responder to a dispatch computing system in the system depicted in FIG. 4 , in accordance with the embodiments disclosed herein;
- FIG. 6C depicts an embodiment of a dispatch computing system receiving and displaying video information, in accordance with the embodiments disclosed herein;
- FIG. 7 depicts an embodiment of a dispatch computing device determining a geographic area associated with a location, in accordance with the embodiments disclosed herein;
- FIG. 8 depicts another embodiment of a dispatch computing device determining a geographic area associated with an event, in accordance with the embodiments disclosed herein;
- FIG. 9 depicts an embodiment of a method performed by a computing device to automatically activate a responder camera, in accordance with the embodiments disclosed herein;
- FIG. 10 depicts a block diagram that illustrates aspects of an illustrative computing device appropriate for use in accordance with embodiments of the present disclosure.
- Video recordings are important records for responders and responder agencies.
- a responder is any individual that is part of an agency that responds to particular situations. Examples of responders include law enforcement officials, firefighting officials, paramedics, private security personnel, private responders (e.g., tow truck drivers and roadside assistance personnel), and the like. Law enforcement officials include police officers, sheriffs and sheriff deputies, state patrol officers, federal agency officers (e.g., Federal Bureau of Investigation agents, Central Intelligence Agency agents, Transportation Security Administration officers, etc.), members of the National Guard, members of the armed forces, and the like. Examples of responders also include supervisors and dispatchers of other responders. Examples of responder agencies include police departments, sheriff offices, fire departments, federal agencies, private companies of private security personnel, private responder organizations, and the like.
- videos associated with responder activities have a number of uses. For example, videos have evidentiary value, such as evidence of criminal activities that occur near responders, evidence of proper actions by responders, evidence of responder malfeasance, evidence of actions of individuals interacting with responders, and the like. Because of the evidentiary value of video, the number of responder cameras (e.g., cameras under the control of a responder or a responder agency) has increased. Examples of responder cameras include dashboard cameras in responder vehicles, body cameras worn by responders, and the like.
- One difficulty associated with the use of responder cameras is controlling the times at which the responder cameras capture video.
- a responder camera captures video constantly while the associated responder is on duty. While this approach would capture any video of potential evidentiary value while the responder is on duty, constant capture of video by a responder camera has a number of disadvantages. Constant use of a responder camera results in a constant draw of power (which is especially problematic for responder cameras powered by batteries), generates a significant amount of video data that must be stored and maintained securely, and creates a number of other issues.
- a responder camera captures video when activated manually by a responder.
- the responder activates the responder camera to begin recording video when the responder determines to activate the responder camera, such as when the responder is pursuing a suspect, the responder is responding to a call, and the like.
- manual activation of a responder camera leaves significant room for human error, such as in situations where a responder forgets to activate the responder camera, in situations where a responder does not deem a situation to be appropriate for video recording but video evidence would have been helpful, in situations where a responder intentionally fails to activate the responder camera to avoid evidence of malfeasance, and the like.
- Embodiments of the present disclosure are generally directed to automatically activating responder cameras in response to particular situations.
- a computing device receives an indication of a location or event.
- a dispatcher enters the indication of a location (e.g., an address, an intersection, etc.) or an event (e.g., a gathering of people, a demonstration route, an ongoing vehicle chase of a suspect, etc.).
- the computing device determines a geographic area associated with the location or event (e.g., a radius from a location, an area surrounding the event, an expected future path of the event, etc.).
- the computing device receives a dispatch acknowledgement from a responder indicating that the responder is at the geographic area or that the responder is en route to the geographic area.
- Such a determination may be made based on geolocation coordinates of a present or intended future location of the responder contained in the dispatch acknowledgement from the responder.
- In response to receiving the dispatch acknowledgement from the responder, the computing device automatically sends a camera activation signal to a responder camera associated with the responder.
- the responder camera is configured to begin capturing a video in response to receiving the camera activation signal. In this way, the receipt of the dispatch acknowledgement from the responder indicating that the responder is at the geographic area or that the responder is en route to the geographic area triggers the computing device to automatically activate the responder camera.
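The determination that a responder is at the geographic area can be made from geolocation coordinates carried in the dispatch acknowledgement. The disclosure does not prescribe a distance computation; a minimal sketch, assuming a radius geofence and the standard haversine great-circle formula, might look like this.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_area(responder_lat, responder_lon, area_lat, area_lon, radius_m):
    # "At the geographic area" modeled as falling inside a radius from
    # the dispatched location (one of the area types mentioned above).
    return haversine_m(responder_lat, responder_lon, area_lat, area_lon) <= radius_m
```

The same test can be applied to a present location (responder is at the area) or to an intended destination (responder is en route to the area).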
- FIG. 1 depicts an embodiment of a system 100 for communication between computing devices of responders via a network 102 .
- the system 100 includes a responder 110 that has a computing device 112 .
- the computing device 112 is capable of communicating via the network 102 .
- the network 102 is a wireless communication network using one or more wireless communication protocols, such as WiFi, 2G, 3G, 4G, LTE, WiMAX, Bluetooth, and the like.
- the network 102 includes a number of different communication networks, such as a wireless network and a local area network (LAN).
- the computing device 112 includes a communication application 114 that includes instructions that cause the computing device 112 to communicate with other computing devices via the network 102 .
- the system 100 also includes responders 120 , 130 , 140 , and 150 .
- Each of the responders 120 , 130 , 140 , and 150 has one of computing devices 122 , 132 , 142 , and 152 that is capable of communicating via the network 102 .
- Each of the computing devices 122 , 132 , 142 , and 152 includes one of the communication applications 124 , 134 , 144 , 154 that includes instructions that cause the computing devices 122 , 132 , 142 , and 152 to establish a communication link between computing devices of other responders via the network 102 .
- the system 100 also includes a dispatch unit 160 that includes a computing device 162 .
- the computing device 162 includes one or more of a server, a desktop computer, a laptop computer, a tablet computer, and the like.
- the computing device 162 is capable of communicating via the network 102 .
- the computing device 162 includes a communication application that includes instructions that cause the computing device 162 to establish a communication link between computing devices of other responders via the network 102 .
- the computing device 162 is used by a responder, such as a dispatcher, a supervisory responder, or any other type of responder.
- each of the computing devices 112 , 122 , 132 , 142 , 152 , and 162 includes one or more of a cell phone, tablet computer, smart wearable (e.g., a smart watch), a laptop computer, a desktop computer, and the like.
- the computing devices 112 , 122 , 132 , 142 , and 152 are personal devices of the responders 110 , 120 , 130 , 140 , and 150 and are not issued by any responder agency of the responders 110 , 120 , 130 , 140 , and 150 .
- the communication applications 114 , 124 , 134 , 144 , and 154 are configured to enable communication between the personal computing devices 112 , 122 , 132 , 142 , and 152 of the responders 110 , 120 , 130 , 140 , and 150 with each other and with computing devices of one or more responder agencies, such as computing device 162 .
- the computing devices 112 , 122 , 132 , 142 , 152 , and 162 are capable of sending communications directly to another of the computing devices 112 , 122 , 132 , 142 , 152 , and 162 (i.e., direct communication), to a subset of the computing devices 112 , 122 , 132 , 142 , 152 , and 162 (i.e., selective communication), or to all of the computing devices 112 , 122 , 132 , 142 , 152 , and 162 (i.e., broadcast communication).
- communications are sent between one or more of the computing devices 112 , 122 , 132 , 142 , 152 , and 162 via a communication link based on a priority rank among at least two of the responders 110 , 120 , 130 , 140 , and 150 .
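The three delivery modes above (direct, selective, broadcast) amount to selecting a recipient set from the connected devices. A small sketch, with hypothetical names and a `None` sentinel standing in for "broadcast":

```python
from typing import Iterable, Optional, Set

def route(sender: str, recipients: Optional[Iterable[str]], all_devices: Set[str]) -> Set[str]:
    """Return the delivery set for one communication.

    recipients=None          -> broadcast to every other device
    recipients=[one id]      -> direct communication
    recipients=[several ids] -> selective communication
    """
    if recipients is None:
        return {d for d in all_devices if d != sender}
    wanted = set(recipients)
    return {d for d in all_devices if d in wanted and d != sender}

# e.g., dispatch device 162 addressing the responder devices of FIG. 1
devices = {"112", "122", "132", "142", "152", "162"}
broadcast = route("162", None, devices)       # all five responder devices
direct = route("162", ["112"], devices)       # only device 112
selective = route("162", ["112", "122"], devices)
```

A priority rank among responders, as mentioned above, could then be applied by ordering or filtering the returned set before transmission.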
- the responders 110 , 120 , 130 , 140 , and 150 are all associated with the same responder agency. Examples of responders from the same responder agency include police officers from the same police department, firefighters from the same fire department, private security personnel from the same organization, and the like. In other embodiments, at least some of the responders 110 , 120 , 130 , 140 , and 150 are associated with different responder agencies. Examples of responders from different responder agencies include police officers from one police department and police officers from another police department, state patrol officers and sheriff's deputies, federal agency agents and members of the armed forces, and the like.
- the system 170 includes the computing device 112 , the network 102 , the dispatch unit 160 , and the dispatch computing device 162 .
- the computing device 112 includes the communications application 114 and is capable of communicating via the network 102 .
- the computing device 112 is also capable of communicating with any number of responder devices 116 a - n .
- Examples of the responder devices 116 a - n include devices worn or carried by the responder 110 , such as a responder camera (e.g., an on-body camera, a dashboard camera, etc.), a conducted electrical weapon (CEW), a firearm holster, an on-body microphone, a radio, and the like.
- Other examples of the responder devices 116 a - n include devices associated with a vehicle of the responder 110 , such as a light bar, a dashboard camera, a microphone, an in-vehicle sensor, and the like.
- the responder devices 116 a - n can include any other device associated with the responder 110 .
- the communications application 114 includes instructions that, when executed, cause the computing device 112 to communicate, via the network 102 , with the dispatch computing device 162 or any other computing device in communication with the network 102 .
- the computing device 112 communicates via the network 102 using one or more wireless communication protocols, such as 2G, 3G, 4G, LTE, or WiMAX.
- the communications application 114 includes instructions that, when executed, cause the computing device 112 to communicate directly with one or more of the responder devices 116 a - n .
- the computing device 112 communicates directly with one or more of the responder devices 116 a - n using one or more wireless communication protocols, such as WiFi, Bluetooth, or near field communication (NFC).
- the communications application 114 includes instructions that, when executed, cause the computing device 112 to send communications to the dispatch computing device 162 via the network 102 .
- the communications sent by the computing device 112 to the dispatch computing device 162 via the network 102 include information obtained or generated by the computing device 112 .
- communications from the computing device 112 may include audio recorded by the computing device 112 , geolocation data determined by the computing device 112 , environmental data (e.g., temperature, atmospheric pressure, etc.), and the like.
- the communications application 114 includes instructions that, when executed, cause the computing device 112 to relay communications between the dispatch computing device 162 and the responder devices 116 a - n .
- the communications can include video from an on body camera, audio from an on-body microphone, an indication from a light bar of a vehicle that the light bar has been activated, an indication from a holster that the holster has been unlocked to allow removal of a firearm, an indication from a biometric sensor (e.g., heart rate monitor, body temperature sensor, blood pressure sensor, etc.) of biometric data about the responder 110 , and the like.
- the computing device 112 communicates with one or more of the responder devices 116 a - n using a first wireless communication protocol (e.g., WiFi, Bluetooth, etc.) and the computing device 112 communicates via the network 102 using a second wireless communication protocol (e.g., 2G, 3G, 4G, LTE, WiMAX, etc.).
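In this arrangement the responder's computing device bridges two links: a short-range protocol to the on-body devices (e.g., WiFi or Bluetooth) and a wide-area protocol to the network (e.g., LTE). A minimal relay sketch, where `shortrange_send` and `widearea_send` are hypothetical callables standing in for the two transports:

```python
class RelayDevice:
    """Sketch of computing device 112 relaying between two links."""

    def __init__(self, shortrange_send, widearea_send):
        # Callables standing in for the Bluetooth/WiFi link to responder
        # devices and the cellular link to the network, respectively.
        self.shortrange_send = shortrange_send
        self.widearea_send = widearea_send

    def from_device(self, payload):
        # e.g., video, audio, or a holster-unlocked status from a
        # responder device, relayed up to the dispatch computing device.
        self.widearea_send(payload)

    def from_network(self, payload):
        # e.g., a camera activation signal from dispatch, relayed down
        # to the responder device over the short-range link.
        self.shortrange_send(payload)
```

Keeping the two transports behind plain callables mirrors the separation of the two protocols described above without committing to any particular radio stack.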
- the communications application 114 includes instructions that, when executed, cause the computing device 112 to process information prior to sending it via the network 102 .
- the communications application 114 causes the computing device 112 to reduce a resolution of the information (e.g., pictures, recorded video, video streams, etc.) prior to sending the information via the network 102 .
- the communications application 114 causes the computing device 112 to tag the information with metadata (e.g., a time of capture of the information, a location of capture of the information, etc.) prior to sending the information via the network 102 .
- the communications application 114 causes the computing device 112 to compile multiple forms of information (e.g., text and images) into a single transmission via the network 102 .
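Two of the processing steps above — tagging information with capture metadata and compiling multiple forms of information into a single transmission — can be sketched as follows (resolution reduction is omitted; field names are assumptions, not the application's actual format):

```python
import json
import time

def prepare_transmission(items, captured_at=None, location=None):
    """Tag items with metadata and compile them into one payload.

    `items` is a list of heterogeneous pieces of information (e.g., text
    and image references); the result is a single JSON transmission.
    """
    meta = {
        "captured_at": captured_at if captured_at is not None else time.time(),
        "location": location,  # e.g., [lat, lon] where the info was captured
    }
    return json.dumps({"metadata": meta, "items": items})

payload = prepare_transmission(
    [{"kind": "text", "body": "suspect heading north"},
     {"kind": "image", "ref": "frame_0042.jpg"}],
    captured_at=1718000000,
    location=[47.6062, -122.3321],
)
```

Bundling text and images into one payload in this way illustrates the "single transmission" behavior described above.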
- FIG. 2 includes an embodiment of the computing device 112 .
- this embodiment is not limited only to the computing device 112 .
- Any of the other computing devices described herein, such as computing devices 122 , 132 , 142 , 152 , and 162 may have similar capabilities to communicate via the network 102 and to communicate with responder devices associated with the computing devices.
- An embodiment of a system 180 and examples of communication capabilities of the responder devices 116 a - n are depicted in FIG. 3 .
- Each of the responder devices 116 a - n is configured to communicate, via the network 102 , with the dispatch computing device 162 or any other computing device in communication with the network 102 .
- the responder devices 116 a - n communicate via the network 102 using one or more wireless communication protocols, such as 2G, 3G, 4G, LTE, or WiMAX.
- the communication between the responder devices 116 a - n and the dispatch computing device 162 includes either or both of communication from the responder devices 116 a - n to the dispatch computing device 162 (e.g., video information, audio information, responder device status information, biometric data, geolocation information, etc.) and communication from the dispatch computing device 162 to the responder devices 116 a - n (e.g., an activation signal to activate one of the responder devices 116 a - n ).
- the responder devices 116 a - n and the dispatch computing device 162 are capable of communicating with each other.
- the responder devices 116 a - n and the dispatch computing device 162 are capable of communicating with each other using a hybrid of the systems 170 and 180 depicted in FIGS. 2 and 3 .
- one of the responder devices 116 a - n and the dispatch computing device 162 communicate via the network 102 , as shown in FIG. 3 .
- An example of automatic activation of a responder camera 116 m using a system 200 is depicted in FIGS. 4 to 6B .
- the system 200 includes the responder, the network 102 , and the dispatch unit 160 .
- the responder 110 has the computing device 112 with the communications application 114 and the responder 110 has the responder camera 116 m .
- although the responder camera 116 m depicted in FIGS. 4 to 6B is an on-body camera, the responder camera 116 m can be any type of responder camera, such as a dashboard camera in a responder vehicle.
- the dispatch unit 160 includes the dispatch computing device 162 .
- the embodiments of methods shown in FIGS. 4 to 6B depict examples of a dispatch computing system 162 automatically activating the responder camera 116 m.
- the dispatch computing system 162 receives a dispatch acknowledgement 202 .
- the dispatch acknowledgement 202 indicates that the responder 110 is at a geographic area or en route to a geographic area.
- the dispatch computing system 162 previously received an indication of a location or event and determined the geographic area associated with the location or event.
- the dispatch computing system 162 receives the dispatch acknowledgement 202 from the computing device 112 via the network 102 .
- the computing device 112 may send the dispatch acknowledgement 202 in response to an input from the responder 110 accepting dispatch instructions to respond to the location or event or to respond to the geographic area.
- the computing device 112 adds information to the dispatch acknowledgement 202 before sending the dispatch acknowledgement 202 , such as geospatial information indicating the current location of the computing device 112 or an intended destination of the responder 110 .
- the dispatch computing system 162 is capable of receiving the dispatch acknowledgement 202 from the responder in other ways.
- the responder 110 communicates with a dispatcher, such as via a two-way radio system, to accept a dispatch request and to provide the dispatcher with a current location and/or intended destination of the responder 110 .
- the dispatcher enters the dispatch acknowledgement 202 into the dispatch computing system 162 with an indication that the responder 110 is at the geographic area or that the responder 110 is en route to the geographic area.
- the responder 110 provides information that is received by the dispatch computing system 162 as the dispatch acknowledgement 202 .
- In response to receiving the dispatch acknowledgement 202 , the dispatch computing system 162 automatically sends a camera activation signal to the responder camera 116 m associated with the responder 110 .
- the responder camera 116 m is configured to begin capturing a video in response to receiving the camera activation signal.
- Because the dispatch computing system 162 sends the camera activation signal in response to receiving the dispatch acknowledgement 202 indicating that the responder 110 is at the geographic area or that the responder is en route to the geographic area,
- the responder camera 116 m is automatically activated to capture video of events happening at and/or en route to the geographic area.
- Two embodiments of the dispatch computing system 162 automatically sending a camera activation signal to the responder camera 116 m associated with the responder 110 are depicted in FIGS. 5A and 5B .
- the dispatch computing system 162 automatically sends a camera activation signal 204 a to the computing device 112 via the network 102 and the computing device 112 sends a camera activation signal 204 b to the responder camera 116 m .
- the responder camera 116 m begins capturing a video.
- the automatic sending of the camera activation signal 204 a by the dispatch computing system 162 causes the responder camera 116 m to begin capturing a video.
- the camera activation signal 204 b may be the same as the camera activation signal 204 a or the camera activation signal 204 b may be a modified version of the activation signal 204 a (e.g., in the case where the computing device 112 processes the camera activation signal 204 a to generate the camera activation signal 204 b ).
- the dispatch computing system 162 automatically sends a camera activation signal 204 c to the responder camera 116 m via the network 102 .
- the camera activation signal 204 c does not pass through the computing device 112 .
- Sending the camera activation signal 204 c from the dispatch computing system 162 to the responder camera 116 m without the camera activation signal 204 c passing through the computing device 112 avoids any issues that may be associated with the computing device 112 , such as the computing device 112 being without power (e.g., the battery of the computing device 112 is not charged), the computing device 112 not being in communication with the responder camera 116 m (e.g., the computing device 112 is not paired with the responder camera 116 m ), and the like.
- In response to receiving either the camera activation signal 204 b or the camera activation signal 204 c , the responder camera 116 m begins capturing video. In some embodiments, video information from the video captured by the responder camera 116 m is sent to the dispatch computing system 162 . Examples of sending video information from the responder camera 116 m to the dispatch computing system 162 are depicted in FIGS. 6A and 6B .
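The two delivery paths depicted in FIGS. 5A and 5B — relaying the activation signal through the responder's computing device versus sending it directly to the camera — can be sketched as follows. This is a minimal illustration using in-memory objects in place of the network 102; the class and method names are our own assumptions, not taken from the disclosure.

```python
class ResponderCamera:
    """Begins capturing video in response to receiving a camera activation signal."""
    def __init__(self):
        self.capturing = False

    def receive(self, signal):
        if signal.get("type") == "camera_activation":
            self.capturing = True


class ResponderDevice:
    """The responder computing device relays the signal (204a -> 204b),
    here producing a modified version of it before forwarding."""
    def __init__(self, camera):
        self.camera = camera

    def receive(self, signal):
        relayed = dict(signal, relayed=True)  # modified version (204b)
        self.camera.receive(relayed)


def send_activation(target):
    # The dispatch computing system sends the signal either to the
    # responder device (FIG. 5A) or directly to the camera (FIG. 5B).
    target.receive({"type": "camera_activation"})
```

Either path results in the camera capturing: `send_activation(device)` and `send_activation(camera)` both leave `camera.capturing` set to `True`, which mirrors why the direct path (FIG. 5B) is useful when the responder device is unavailable.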
- video information 206 a is sent from the responder camera 116 m to the computing device 112 .
- the video information 206 a is sent via a short range communication protocol, such as Bluetooth or WiFi.
- the computing device 112 sends the video information 206 b to the dispatch computing system 162 via the network 102 .
- the video information 206 b is sent via a long range communication protocol, such as 2G, 3G, 4G, LTE, or WiMAX.
- the computing device 112 acts as a relay between the responder camera 116 m and the dispatch computing system 162 .
- video information 206 c is sent from the responder camera 116 m to the dispatch computing system 162 via the network 102 .
- the video information 206 c is sent via a long range communication protocol, such as 2G, 3G, 4G, LTE, or WiMAX.
- the video information 206 c is sent from the responder camera 116 m to the dispatch computing system 162 without the video information 206 c passing through the computing device 112 .
- This embodiment avoids any issues that may be associated with the computing device 112 , such as the computing device 112 being without power, the computing device 112 not being in communication with the responder camera 116 m , and the like.
- video information is sent from the responder camera 116 m to the dispatch computing system 162 after completion of recording a video.
- the responder camera 116 m may record a video for a particular amount of time and then send the video information for the video after the recording is completed.
- at least a portion of the video information is sent from the responder camera 116 m to the dispatch computing system 162 such that the dispatch computing system 162 receives the portion of the video information before the responder camera 116 m finishes recording the video.
- This action by the responder camera 116 m is sometimes referred to as sending a “live stream” of the video or as sending a “streaming video,” even though the dispatch computing system 162 may not receive the video information instantaneously as it is recorded by the responder camera 116 m . Delays in receiving a live stream of a video may be due to delays in processing by the responder camera 116 m , the computing device 112 , and/or the dispatch computing system 162 , latency in the network 102 , or any other cause of delay. Both of the embodiments of sending video information from the responder camera 116 m to the dispatch computing system 162 depicted in FIGS. 6A and 6B are usable to send completed videos or to send live streams of videos.
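The distinction drawn above between sending a completed video and sending a live stream can be sketched as a camera that either yields the whole recording at the end or yields fixed-size chunks while recording continues. This is a simplified model under our own assumptions; the chunk size and function names are illustrative, not from the disclosure.

```python
def record_completed(frames):
    # Completed-video transfer: the whole recording is sent only after
    # the responder camera finishes capturing it.
    return [list(frames)]


def record_streaming(frames, chunk_size=2):
    # Live-stream transfer: portions of the video are sent while recording
    # continues, so the dispatch computing system receives part of the
    # video before the camera finishes capturing it.
    for i in range(0, len(frames), chunk_size):
        yield frames[i:i + chunk_size]
```

For a five-frame recording, the completed transfer produces a single batch at the end, while the streaming transfer produces three chunks, the first of which is available well before recording finishes — consistent with both FIG. 6A and FIG. 6B being usable for either mode.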
- video information recorded by the responder camera 116 m can be conveyed by wired mechanisms, such as by placing the responder camera 116 m in a cradle coupled to the dispatch computing system 162 (e.g., at the end of a responder's shift upon returning to the dispatch unit 160 ).
- An example of the dispatch computing system 162 receiving video information 206 d is depicted in FIG. 6C .
- the dispatch computing system 162 receives the video information 206 d .
- the video information 206 d may include video information from a single responder camera or a number of different responder cameras.
- the dispatch computing system 162 includes at least one memory and the dispatch computing system 162 stores the received video information 206 d .
- the dispatch computing system 162 is coupled to one or more display devices 164 and the dispatch computing system 162 displays the received video information 206 d on the one or more display devices 164 .
- When the video information 206 d includes live streams from a number of different responder cameras, the dispatch computing system 162 may display the live streams on different portions 166 of the one or more display devices 164 . In this way, a dispatcher or other user viewing the one or more display devices 164 is able to see the live streams from the different responder cameras at the same time.
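Assigning each live stream to a portion 166 of a display can be sketched as tiling the streams on a simple grid. The grid math below is an assumption about one reasonable layout, not a layout specified in the disclosure.

```python
import math

def layout_streams(stream_ids, display_w, display_h):
    # Assign each live stream a rectangular portion of the display so a
    # dispatcher can view all streams at the same time.
    n = len(stream_ids)
    cols = math.ceil(math.sqrt(n))
    rows = math.ceil(n / cols)
    w, h = display_w // cols, display_h // rows
    portions = {}
    for i, sid in enumerate(stream_ids):
        r, c = divmod(i, cols)
        portions[sid] = (c * w, r * h, w, h)  # (x, y, width, height)
    return portions
```

Four streams on a 1920x1080 display, for example, yield a 2x2 grid of 960x540 portions.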
- Examples of a dispatch computing device determining a geographic area associated with a location or event are depicted in FIGS. 7 and 8 .
- a map 300 is depicted in FIG. 7 with indications of a location or event 302 and responders 304 a - g .
- the location or event 302 may be a car accident, a reported crime, a person in need of medical attention, a disabled vehicle, a business, a residence, or any other location or event.
- a dispatch computing device receives the indication of the location or event 302 .
- the dispatch computing device receives the indication of the location or event 302 in the form of an address, an intersection, geospatial information, or any other information that identifies the location or event.
- the indication of the location or event 302 is received by the dispatch computing device from a dispatcher in communication with a responder, from a computing device of a responder, or from any other source.
- the computing device determines a geographic area 306 associated with the location or event 302 .
- the geographic area 306 is a substantially circular area within a predetermined distance of the location or event 302 .
- a geographic area associated with a location or event can have any other shape (e.g., rectangle, oval, etc.), have an irregular shape, be generated based on user preferences (e.g., a default geographic area shape or size), be based on a user input (e.g., an area drawn on a map by a user), or be generated in any other manner.
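The circular case — a geographic area within a predetermined distance of the location or event — can be sketched with a great-circle distance test. The radius default and function names here are illustrative assumptions; the disclosure does not fix a particular distance computation.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two latitude/longitude points.
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def in_geographic_area(event, responder, radius_m=500.0):
    # A substantially circular geographic area: the responder is inside if
    # within a predetermined distance of the location or event.
    return haversine_m(event[0], event[1], responder[0], responder[1]) <= radius_m
```

Other shapes (rectangles, user-drawn areas) would substitute a different containment test while leaving the activation logic unchanged.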
- the geographic area represents an area of interest.
- In an effort to capture evidence about the geographic area 306 , the dispatch computing device automatically activates responder cameras associated with one or more of the responders 304 a - g in response to receiving one or more dispatch acknowledgements from the one or more of the responders 304 a - g.
- the dispatch computing device automatically sends a camera activation signal to a responder camera associated with a responder that is at the location or event 302 .
- the responder 304 b may have reported the location or event 302 and sent a dispatch acknowledgement that the responder 304 b is at the location or event 302 .
- the dispatch computing device sends a dispatch request to the responder 304 b prior to receiving the dispatch acknowledgement from the responder 304 b .
- the dispatch computing device automatically sends a camera activation signal to a responder camera associated with the responder 304 b in response to receiving the dispatch acknowledgement from the responder 304 b .
- the responder camera of the responder 304 b is configured to begin capturing video in response to receiving the camera activation signal.
- the dispatch computing device automatically sends a camera activation signal to one or more responder cameras associated with one or more responders that are within the geographic area 306 .
- the responders 304 a and 304 c - d may have sent dispatch acknowledgements that the responders 304 a and 304 c - d are at their respective locations within the geographic area 306 shown in FIG. 7 .
- the dispatch computing device automatically sends camera activation signals to responder cameras associated with the responders 304 a and 304 c - d in response to receiving the dispatch acknowledgements from the responders 304 a and 304 c - d .
- the responder cameras of the responders 304 a and 304 c - d are configured to begin capturing video in response to receiving the camera activation signals.
- in some embodiments, the dispatch acknowledgements from the responders 304 a and 304 c - 304 d are received before the dispatch computing device determines the geographic area 306 . In those embodiments, the dispatch computing device sends the camera activation signals to the responder cameras associated with the responders 304 a and 304 c - 304 d in response to both receiving the dispatch acknowledgements from the responders 304 a and 304 c - 304 d and determining the geographic area 306 .
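The ordering described above — acknowledgements arriving before the geographic area is determined — can be sketched as a dispatcher that queues pending acknowledgements and releases the activation signals once the area is known. This is a simplified sketch; the class and member names are assumptions of our own.

```python
class DispatchQueue:
    """Buffers dispatch acknowledgements that arrive before a geographic
    area is determined; activation is released once the area is known."""
    def __init__(self):
        self.pending = []      # camera ids awaiting an area determination
        self.activated = []    # camera ids that were sent activation signals
        self.area_known = False

    def receive_ack(self, camera_id):
        if self.area_known:
            self.activated.append(camera_id)   # activate immediately
        else:
            self.pending.append(camera_id)     # hold until area is known

    def set_geographic_area(self):
        # Determining the area releases all buffered acknowledgements.
        self.area_known = True
        self.activated.extend(self.pending)
        self.pending.clear()
```

Acknowledgements received after the area is determined are activated immediately, matching the simpler case described for responders 304 e - f.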
- the dispatch computing device automatically sends a camera activation signal to a responder camera associated with a responder that is en route to the location or event 302 .
- the responder 304 e may have sent a dispatch acknowledgement that the responder 304 e is en route to the location or event 302 as backup for the responder 304 b .
- the dispatch computing device automatically sends a camera activation signal to a responder camera associated with the responder 304 e in response to receiving the dispatch acknowledgement from the responder 304 e .
- the responder camera of the responder 304 e is configured to begin capturing video in response to receiving the camera activation signal.
- the responder camera of the responder 304 e begins capturing video before the responder 304 e enters the geographic area 306 . In this way, video is recorded while the responder 304 e is en route to the geographic area 306 .
- the dispatch computing device receives the video from the responder camera associated with the responder 304 e as a streaming video, and at least a portion of the streaming video is received by the dispatch computing device from the responder camera associated with the responder 304 e prior to the responder 304 e arriving at the geographic area 306 .
- the dispatch computing device automatically sends a camera activation signal to one or more responder cameras associated with one or more responders that are en route to the geographic area 306 .
- the responder 304 f may have sent a dispatch acknowledgement that the responder 304 f is en route to the location where responder 304 d is located within the geographic area 306 as backup for responder 304 d .
- the dispatch computing device automatically sends a camera activation signal to the responder camera associated with the responder 304 f in response to receiving the dispatch acknowledgement from the responder 304 f .
- the responder camera of the responder 304 f is configured to begin capturing video in response to receiving the camera activation signal.
- the dispatch computing device receives the video from the responder camera associated with the responder 304 f as a streaming video, and at least a portion of the streaming video is received by the dispatch computing device from the responder camera associated with the responder 304 f prior to the responder 304 f arriving at the geographic area 306 .
- a map 400 is depicted in FIG. 8 with indications of an event in the form of a parade 402 a , a parade route 402 b , and responders 404 a - j .
- a dispatch computing device receives the indication of the parade route 402 b .
- the dispatch computing device receives the indication of the parade route 402 b in the form of a series of addresses, a series of intersections, geospatial information data of the parade route 402 b , or any other information that identifies the parade route 402 b .
- the indication of the parade route 402 b is received by the dispatch computing device from a staff person at a responder agency (e.g., police department staff) or from any other source.
- the computing device determines a geographic area 406 associated with the parade route 402 b .
- the geographic area 406 parallels the parade route 402 b within a predetermined distance of the parade route 402 b .
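A geographic area that parallels a route within a predetermined distance, as described for the parade route 402 b, can be sketched as a corridor test: the distance from a point to the nearest segment of the route polyline. The planar approximation and names below are illustrative assumptions, suitable only over short distances.

```python
import math

def point_segment_dist(p, a, b):
    # Distance from point p to segment a-b in a planar (x, y) approximation.
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))


def in_route_corridor(point, route, width):
    # The geographic area parallels the route: a point is inside if it is
    # within the predetermined distance of any segment of the route.
    return min(point_segment_dist(point, route[i], route[i + 1])
               for i in range(len(route) - 1)) <= width
```

The same test adapts to other routed events (a chase route, a procession route) by substituting the polyline.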
- the geographic area 406 represents an area of interest.
- the dispatch computing device automatically activates responder cameras associated with one or more of the responders 404 a - j in response to receiving one or more dispatch acknowledgements from the one or more of the responders 404 a - j.
- the dispatch computing device automatically sends a camera activation signal to one or more responder cameras associated with one or more responders that are within the geographic area 406 .
- the responders 404 a - g may have sent dispatch acknowledgements that the responders 404 a - g are at their respective locations within the geographic area 406 shown in FIG. 8 .
- the dispatch computing device automatically sends camera activation signals to responder cameras associated with the responders 404 a - g in response to receiving the dispatch acknowledgements from the responders 404 a - g .
- the responder cameras of the responders 404 a - g are configured to begin capturing video in response to receiving the camera activation signals.
- in some embodiments, the dispatch acknowledgements from the responders 404 a - g are received before the dispatch computing device determines the geographic area 406 . In those embodiments, the dispatch computing device sends the camera activation signals to the responder cameras associated with the responders 404 a - g in response to both receiving the dispatch acknowledgements from the responders 404 a - g and determining the geographic area 406 .
- the dispatch computing device automatically sends a camera activation signal to one or more responder cameras associated with one or more responders that are en route to the geographic area 406 .
- the responder 404 j may have been informed that responder 404 d requires assistance and the responder 404 j may have sent a dispatch acknowledgement that the responder 404 j is en route to the location where responder 404 d is located within the geographic area 406 .
- the dispatch computing device automatically sends a camera activation signal to the responder camera associated with the responder 404 j in response to receiving the dispatch acknowledgement from the responder 404 j .
- the responder camera of the responder 404 j is configured to begin capturing video in response to receiving the camera activation signal.
- while FIGS. 7 and 8 include particular scenarios, such as the parade 402 a and the parade route 402 b depicted in FIG. 8 ,
- the concepts discussed above with respect to FIGS. 7 and 8 apply to any number of other situations.
- the embodiment discussed above with respect to FIG. 7 can be adapted for larger events, such as a sporting event, a fair, a demonstration, or any other event.
- an event may take place over a larger area (e.g., over one or more city blocks) and the dispatch computing device determines a geographic area associated with the event based on the larger area of the event itself.
- the embodiment discussed above with respect to FIG. 8 can be adapted for other events that have a route, such as a vehicle chase with an anticipated chase route, a funeral procession with an expected route, or any other event that occurs over a route.
- the dispatch computing device determines the geographic area based on a probability of an event taking a particular route (e.g., determining a geographic area associated with a vehicle chase based on a probability of the vehicle taking a particular route).
- the computing device receives an indication of a location or an event.
- the indication can be received from a responder computing device via a network, from a dispatcher in communication with a responder, from a responder agency staff member, or from any other source.
- the computing device determines a geographic area associated with the location or event. As described above, the geographic area can be a regular shape, an irregular shape, a user-defined area, a predetermined distance from the location or event, or any other geographic area associated with the location or event.
- the computing device receives a dispatch acknowledgement from a responder, where the dispatch acknowledgement indicates that the responder is at the geographic area or that the responder is en route to the geographic area.
- the dispatch acknowledgement can be received from a responder computing device via a network, from a dispatcher entering the dispatch acknowledgement in response to a communication between the responder and the dispatcher, or from any other source conveying the dispatch acknowledgement from the responder.
- the computing device automatically sends a camera activation signal to a responder camera associated with the responder in response to receiving the dispatch acknowledgement from the responder.
- the responder camera is configured to begin capturing a video in response to receiving the camera activation signal.
- the camera activation signal may be sent from the computing device to the responder camera via a network, from the computing device to a responder computing device where the responder computing device is configured to relay the camera activation signal to the responder camera, or from the computing device to the responder camera in any other way.
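The method steps above — receive acknowledgements, then automatically send activation signals for responders at or en route to the geographic area — can be sketched end to end. The status strings and dictionary shape are our own assumptions for illustration.

```python
def handle_dispatch(acks):
    # For each dispatch acknowledgement indicating that the responder is at
    # the geographic area or en route to it, automatically emit a camera
    # activation signal for that responder's camera (the camera begins
    # capturing video on receipt of the signal).
    return [{"type": "camera_activation", "camera_id": a["camera_id"]}
            for a in acks
            if a["status"] in ("at_area", "en_route")]
```

Acknowledgements with any other status produce no activation signal, reflecting that activation is tied specifically to presence at, or travel toward, the geographic area.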
- described techniques and tools may be implemented by any suitable computing device or set of computing devices.
- a data store contains data as described herein and may be hosted, for example, by a database management system (DBMS) to allow a high level of data throughput between the data store and other components of a described system.
- the DBMS may also allow the data store to be reliably backed up and to maintain a high level of availability.
- a data store may be accessed by other system components via a network, such as a private network in the vicinity of the system, a secured transmission channel over the public Internet, a combination of private and public networks, and the like.
- a data store may include structured data stored as files in a traditional file system. Data stores may reside on computing devices that are part of or separate from components of systems described herein. Separate data stores may be combined into a single data store, or a single data store may be split into two or more separate data stores.
- server devices may include suitable computing devices configured to provide information and/or services described herein.
- Server devices may include any suitable computing devices, such as dedicated server devices.
- Server functionality provided by server devices may, in some cases, be provided by software (e.g., virtualized computing instances or application objects) executing on a computing device that is not a dedicated server device.
- the term “client” can be used to refer to a computing device that obtains information and/or accesses services provided by a server over a communication link. However, the designation of a particular device as a client device does not necessarily require the presence of a server.
- a single device may act as a server, a client, or both a server and a client, depending on context and configuration.
- Actual physical locations of clients and servers are not necessarily important, but the locations can be described as “local” for a client and “remote” for a server to illustrate a common usage scenario in which a client is receiving information provided by a server at a remote location.
- FIG. 10 depicts a block diagram that illustrates aspects of an illustrative computing device 600 appropriate for use in accordance with embodiments of the present disclosure.
- the description below is applicable to servers, personal computers, mobile phones, smart phones, tablet computers, embedded computing devices, and other currently available or yet to be developed devices that may be used in accordance with embodiments of the present disclosure.
- the computing device 600 includes at least one processor 602 and a system memory 604 connected by a communication bus 606 .
- the system memory 604 may be volatile or nonvolatile memory, such as read only memory (“ROM”), random access memory (“RAM”), EEPROM, flash memory, or other memory technology.
- system memory 604 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 602 .
- the processor 602 may serve as a computational center of the computing device 600 by supporting the execution of instructions.
- the computing device 600 may include a network interface 610 comprising one or more components for communicating with other devices over a network.
- Embodiments of the present disclosure may access basic services that utilize the network interface 610 to perform communications using common network protocols.
- the network interface 610 may also include a wireless network interface configured to communicate via one or more wireless communication protocols, such as WiFi, 2G, 3G, 4G, LTE, WiMAX, Bluetooth, and/or the like.
- the computing device 600 also includes a storage medium 608 .
- services may be accessed using a computing device that does not include means for persisting data to a local storage medium. Therefore, the storage medium 608 depicted in FIG. 10 is optional.
- the storage medium 608 may be volatile or nonvolatile, removable or non-removable, implemented using any technology capable of storing information such as, but not limited to, a hard drive, solid state drive, CD ROM, DVD, or other disk storage, magnetic tape, magnetic disk storage, and/or the like.
- computer-readable medium includes volatile and nonvolatile and removable and non-removable media implemented in any method or technology capable of storing information, such as computer-readable instructions, data structures, program modules, or other data.
- system memory 604 and storage medium 608 depicted in FIG. 10 are examples of computer-readable media.
- FIG. 10 does not show some of the typical components of many computing devices.
- the computing device 600 may include input devices, such as a keyboard, keypad, mouse, trackball, microphone, video camera, touchpad, touchscreen, electronic pen, stylus, and/or the like.
- Such input devices may be coupled to the computing device 600 by wired or wireless connections including RF, infrared, serial, parallel, Bluetooth, USB, or other suitable connection protocols using wireless or physical connections.
- data can be captured by input devices and transmitted or stored for future processing.
- the processing may include encoding data streams, which can be subsequently decoded for presentation by output devices.
- Media data can be captured by multimedia input devices and stored by saving media data streams as files on a computer-readable storage medium (e.g., in memory or persistent storage on a client device, server, administrator device, or some other device).
- Input devices can be separate from and communicatively coupled to computing device 600 (e.g., a client device), or can be integral components of the computing device 600 .
- multiple input devices may be combined into a single, multifunction input device (e.g., a video camera with an integrated microphone).
- the computing device 600 may also include output devices such as a display, speakers, printer, etc.
- the output devices may include video output devices such as a display or touchscreen.
- the output devices also may include audio output devices such as external speakers or earphones.
- the output devices can be separate from and communicatively coupled to the computing device 600 , or can be integral components of the computing device 600 .
- Input functionality and output functionality may be integrated into the same input/output device (e.g., a touchscreen). Any suitable input device, output device, or combined input/output device either currently known or developed in the future may be used with described systems.
- functionality of computing devices described herein may be implemented in computing logic embodied in hardware or software instructions, which can be written in a programming language, such as C, C++, COBOL, Java™, PHP, Perl, Python, Ruby, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft .NET™ languages such as C#, and/or the like.
- Computing logic may be compiled into executable programs or written in interpreted programming languages.
- functionality described herein can be implemented as logic modules that can be duplicated to provide greater processing capability, merged with other modules, or divided into sub modules.
- the computing logic can be stored in any type of computer-readable medium (e.g., a non-transitory medium such as a memory or storage medium) or computer storage device and be stored on and executed by one or more general purpose or special purpose processors, thus creating a special purpose computing device configured to provide functionality described herein.
- modules or subsystems can be separated into additional modules or subsystems or combined into fewer modules or subsystems.
- modules or subsystems can be omitted or supplemented with other modules or subsystems.
- functions that are indicated as being performed by a particular device, module, or subsystem may instead be performed by one or more other devices, modules, or subsystems.
- processing stages in the various techniques can be separated into additional stages or combined into fewer stages.
- processing stages in the various techniques can be omitted or supplemented with other techniques or processing stages.
- processing stages that are described as occurring in a particular order can instead occur in a different order.
- processing stages that are described as being performed in a series of steps may instead be handled in a parallel fashion, with multiple modules or software processes concurrently handling one or more of the illustrated processing stages.
- processing stages that are indicated as being performed by a particular device or module may instead be performed by one or more other devices or modules.
- Embodiments disclosed herein include a computer-implemented method for performing one or more of the above-described techniques; a computing device comprising a processor and computer-readable storage media having stored thereon computer executable instructions configured to cause the computing device to perform one or more of the above-described techniques; a computer-readable storage medium having stored thereon computer executable instructions configured to cause a computing device to perform one or more of the above-described techniques; and a computing system comprising a server that provides one or more of the above-described services.
- the computer system may further comprise plural client computing devices; and a client computing device in communication with a server that provides one or more of the above-described services, the client computing device comprising a processing unit and computer-readable storage media having stored thereon computer executable instructions configured to cause the client computing device to perform one or more of the above described techniques.
Abstract
Description
- This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- In one embodiment, a method of activating a responder camera includes a computing device receiving an indication of a location or an event, determining a geographic area associated with the location or event, receiving a dispatch acknowledgement from a responder, and automatically sending a camera activation signal to a responder camera associated with the responder in response to receiving the dispatch acknowledgement from the responder. The dispatch acknowledgement indicates that the responder is at the geographic area or that the responder is en route to the geographic area. The responder camera is configured to begin capturing a video in response to receiving the camera activation signal.
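The claimed flow, in which receipt of a dispatch acknowledgement automatically triggers the activation signal, can be illustrated with a minimal sketch. This is hypothetical Python for illustration only; the class, field, and status names are assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DispatchAcknowledgement:
    responder_id: str
    status: str  # "at_area" or "en_route", per the claimed indication

@dataclass
class DispatchDevice:
    # Maps each responder to the responder camera associated with that responder.
    camera_for_responder: Dict[str, str]
    activated: List[str] = field(default_factory=list)

    def send_activation_signal(self, camera_id: str) -> None:
        # Stand-in for transmitting a camera activation signal over a network.
        self.activated.append(camera_id)

    def on_acknowledgement(self, ack: DispatchAcknowledgement) -> None:
        # The acknowledgement itself is the trigger: the activation signal is
        # sent automatically, with no manual step by the responder.
        if ack.status in ("at_area", "en_route"):
            self.send_activation_signal(self.camera_for_responder[ack.responder_id])

device = DispatchDevice({"unit-7": "camera-7"})
device.on_acknowledgement(DispatchAcknowledgement("unit-7", "en_route"))
print(device.activated)  # ['camera-7']
```

The key design point mirrored here is that activation keys off the acknowledgement, not off arrival: an "en_route" status activates the camera just as an "at_area" status does.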
- In one example, the method further includes the computing device sending a dispatch request to the responder prior to receiving the dispatch acknowledgement from the responder. In another example, the method further includes the computing device receiving the video from the responder camera. In another example, the video is received from the responder camera after the responder camera has finished capturing the video. In another example, the video is received from the responder camera as a streaming video. In another example, at least a portion of the streaming video is received from the responder camera before the responder camera finishes capturing the video. In another example, the method further includes the computing device storing the video received from the responder camera.
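The difference between receiving the video after capture finishes and receiving it as a streaming video can be sketched as follows. This is an illustrative Python sketch; the chunk size and function names are assumptions.

```python
def stream_video(frames, send, chunk_size=3):
    # Deliver video in chunks while capture is still in progress, so the
    # receiver holds part of the video before the recording completes.
    buffer = []
    for frame in frames:
        buffer.append(frame)
        if len(buffer) >= chunk_size:
            send(list(buffer))  # delivered before capture has finished
            buffer.clear()
    if buffer:
        send(list(buffer))      # flush the final partial chunk

received = []
stream_video(range(7), received.append)
print(received)  # [[0, 1, 2], [3, 4, 5], [6]]
```

With a batch upload, `send` would instead be called once with all seven frames after capture completed; the chunked variant is what lets the dispatch side view video before recording ends.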
- In another example, the dispatch acknowledgement is received from the responder via a responder computing device associated with the responder. In another example, automatically sending the camera activation signal to the responder camera associated with the responder comprises automatically sending, by the computing device, the camera activation signal to the responder computing device, where the responder computing device is configured to relay the camera activation signal to the responder camera. In another example, receiving the dispatch acknowledgement from the responder comprises receiving, by the computing device, the dispatch acknowledgement from the responder via a device other than the responder computing device. In another example, automatically sending the camera activation signal to the responder camera associated with the responder comprises automatically sending, by the computing device, the camera activation signal directly to the responder camera via one or more communication networks.
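The relayed and direct signal paths described above can be combined in software, for example as a fallback policy. The ordering (try direct delivery first, relay through the responder computing device second) is an assumption made for illustration, as are all names in this sketch; the disclosure describes the two paths as alternatives rather than prescribing an order.

```python
class Camera:
    def __init__(self, directly_reachable):
        self.directly_reachable = directly_reachable
        self.recording = False

    def activate(self):
        if not self.directly_reachable:
            raise ConnectionError("no direct network path to camera")
        self.recording = True

class ResponderDevice:
    # The responder computing device, paired with the camera over a
    # short-range link, relays the activation signal when asked.
    def relay_activation(self, camera):
        camera.directly_reachable = True  # e.g., forwarded over Bluetooth
        camera.activate()

def send_activation(camera, responder_device=None):
    try:
        camera.activate()                 # direct path via the network
        return "direct"
    except ConnectionError:
        if responder_device is None:
            raise
        responder_device.relay_activation(camera)
        return "relayed"

cam = Camera(directly_reachable=False)
path = send_activation(cam, ResponderDevice())
print(path, cam.recording)  # relayed True
```

A fallback like this captures why the disclosure treats the direct path as valuable: the relay fails if the responder computing device is unpowered or unpaired, whereas the direct path does not depend on it at all.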
- In another example, the dispatch acknowledgement indicates that the responder is en route to the geographic area, and wherein automatically sending the camera activation signal to the responder camera causes the responder camera to begin capturing the video prior to the responder arriving at the geographic area. In another example, the method further includes the computing device receiving the video from the responder camera as a streaming video, where at least a portion of the streaming video is received from the responder camera prior to the responder arriving at the geographic area.
- In another example, the dispatch acknowledgement includes geospatial information of one or more of the responder camera or a responder computing device associated with the responder. In another example, the method further includes the computing device sending camera activation signals to a plurality of responder cameras in response to determining that each of a plurality of responders associated with the plurality of responder cameras is at the geographic area or en route to the geographic area. In another example, the method further includes the computing device receiving a streaming video from each of the plurality of responder cameras and simultaneously displaying at least two streaming videos received from the plurality of responder cameras.
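Activating a plurality of responder cameras, as in the example above, amounts to applying the same trigger once per responder. A minimal sketch, with hypothetical identifiers and status strings:

```python
def activate_cameras(acknowledgements, camera_for_responder):
    # Return the cameras to activate: one for every responder whose
    # acknowledgement places them at, or en route to, the geographic area.
    return [
        camera_for_responder[responder_id]
        for responder_id, status in acknowledgements
        if status in ("at_area", "en_route")
    ]

cameras = {"unit-1": "cam-1", "unit-2": "cam-2", "unit-3": "cam-3"}
acks = [("unit-1", "en_route"), ("unit-2", "unavailable"), ("unit-3", "at_area")]
print(activate_cameras(acks, cameras))  # ['cam-1', 'cam-3']
```

Each activated camera can then supply a stream, and the dispatch side displays the streams side by side, as the example above describes.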
- In another embodiment, a non-transitory computer-readable medium has instructions embodied thereon for activating a responder camera, where the instructions, in response to execution by a computing device, cause the computing device to receive an indication of a location or an event, determine a geographic area associated with the location or event, receive a dispatch acknowledgement from a responder, and automatically send a camera activation signal to a responder camera associated with the responder in response to receiving the dispatch acknowledgement from the responder. The dispatch acknowledgement indicates that the responder is at the geographic area or that the responder is en route to the geographic area. The responder camera is configured to begin capturing a video in response to receiving the camera activation signal.
- In another embodiment, a computing device for activating a responder camera includes a processor and a computer-readable medium having instructions embodied thereon. The instructions, in response to execution by the processor, cause the computing device to receive an indication of a location or an event, determine a geographic area associated with the location or event, receive a dispatch acknowledgement from a responder, and automatically send a camera activation signal to a responder camera associated with the responder in response to receiving the dispatch acknowledgement from the responder. The dispatch acknowledgement indicates that the responder is at the geographic area or that the responder is en route to the geographic area. The responder camera is configured to begin capturing a video in response to receiving the camera activation signal.
- In another example, the computing device further includes at least one display device and the instructions, in response to execution by the processor, further cause the computing device to receive the video from the responder camera and display the received video on the at least one display device. In another example, the computing device further includes at least one memory and the instructions, in response to execution by the processor, further cause the computing device to receive the video from the responder camera and store the received video in the at least one memory.
- The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
- FIG. 1 depicts an embodiment of a system for communication between computing devices of responders via a network, in accordance with the embodiments disclosed herein;
- FIG. 2 depicts an embodiment of a system and examples of communication capabilities of a responder computing device, in accordance with the embodiments disclosed herein;
- FIG. 3 depicts an embodiment of a system and examples of communication capabilities of responder devices, in accordance with the embodiments disclosed herein;
- FIG. 4 depicts a system in which a dispatch computing system receives a dispatch acknowledgement from a responder, in accordance with the embodiments disclosed herein;
- FIGS. 5A and 5B depict embodiments of automatically sending a camera activation signal to a responder camera associated with the responder in the system depicted in FIG. 4, in accordance with the embodiments disclosed herein;
- FIGS. 6A and 6B depict embodiments of sending video information from a responder camera associated with a responder to a dispatch computing system in the system depicted in FIG. 4, in accordance with the embodiments disclosed herein;
- FIG. 6C depicts an embodiment of a dispatch computing system receiving and displaying video information, in accordance with the embodiments disclosed herein;
- FIG. 7 depicts an embodiment of a dispatch computing device determining a geographic area associated with a location, in accordance with the embodiments disclosed herein;
- FIG. 8 depicts another embodiment of a dispatch computing device determining a geographic area associated with an event, in accordance with the embodiments disclosed herein;
- FIG. 9 depicts an embodiment of a method performed by a computing device to automatically activate a responder camera, in accordance with the embodiments disclosed herein; and
- FIG. 10 depicts a block diagram that illustrates aspects of an illustrative computing device appropriate for use in accordance with embodiments of the present disclosure.
- Video recordings are important records for responders and responder agencies. A responder is any individual who is part of an agency that responds to particular situations. Examples of responders include law enforcement officials, firefighting officials, paramedics, private security personnel, private responders (e.g., tow truck drivers and roadside assistance personnel), and the like. Law enforcement officials include police officers, sheriffs and sheriff deputies, state patrol officers, federal agency officers (e.g., Federal Bureau of Investigation agents, Central Intelligence Agency agents, Transportation Security Administration officers, etc.), members of the National Guard, members of the armed forces, and the like. Examples of responders also include supervisors and dispatchers of other responders. Examples of responder agencies include police departments, sheriff offices, fire departments, federal agencies, private companies of private security personnel, private responder organizations, and the like.
- Videos associated with responder activities have a number of uses. For example, videos have evidentiary value, such as evidence of criminal activities that occur near responders, evidence of proper actions by responders, evidence of responder malfeasance, evidence of actions of individuals interacting with responders, and the like. Because of the evidentiary value of video, the number of responder cameras (e.g., cameras under the control of a responder or a responder agency) has increased. Examples of responder cameras include dashboard cameras in responder vehicles, body cameras worn by responders, and the like.
- One difficulty associated with the use of responder cameras is controlling the times at which the responder cameras capture video. In one example, a responder camera captures video constantly while the associated responder is on duty. While this approach would capture any video of potential evidentiary value while the responder is on duty, constant capture of video by a responder camera has a number of disadvantages: it results in a constant draw of power, which is especially problematic for battery-powered responder cameras; it generates a significant amount of video data that must be stored and maintained securely; and it creates a number of other issues.
- In another example of controlling the times at which responder cameras capture video, a responder camera captures video only when activated manually by a responder. In this example, the responder activates the responder camera to begin recording video when the responder determines that recording is warranted, such as when the responder is pursuing a suspect, when the responder is responding to a call, and the like. However, relying on manual activation of a responder camera leaves significant room for human error, such as in situations where a responder forgets to activate the responder camera, where a responder does not deem a situation appropriate for video recording but video evidence would have been helpful, where a responder intentionally fails to activate the responder camera to avoid evidence of malfeasance, and the like.
- Due to the disadvantages described above, an automated method of activating responder cameras is needed that avoids both constant video capture and reliance on manual activation by responders. Embodiments of the present disclosure are generally directed to automatically activating responder cameras in response to particular situations.
- In one or more embodiments disclosed herein, a computing device (e.g., a dispatch computing device) receives an indication of a location or event. In some examples, a dispatcher enters the indication of a location (e.g., an address, an intersection, etc.) or an event (e.g., a gathering of people, a demonstration route, an ongoing vehicle chase of a suspect, etc.). The computing device determines a geographic area associated with the location or event (e.g., a radius from a location, an area surrounding the event, an expected future path of the event, etc.). The computing device receives a dispatch acknowledgement from a responder indicating that the responder is at the geographic area or that the responder is en route to the geographic area. Such a determination may be made based on geolocation coordinates of a present or intended future location of the responder contained in the dispatch acknowledgement from the responder. In response to receiving the dispatch acknowledgement from the responder, the computing device automatically sends a camera activation signal to a responder camera associated with the responder. The responder camera is configured to begin capturing a video in response to receiving the camera activation signal. In this way, the receipt of the dispatch acknowledgement from the responder indicating that the responder is at the geographic area or that the responder is en route to the geographic area triggers the computing device to automatically activate the responder camera.
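Whether the geolocation coordinates carried in a dispatch acknowledgement place the responder within the determined geographic area could be decided with a distance test such as the following. This sketch assumes the simplest area mentioned above, a fixed radius around the incident location; the disclosure also contemplates other shapes (areas surrounding an event, expected future paths), and all names here are illustrative.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two latitude/longitude points.
    r = 6371000.0
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def in_geographic_area(responder, incident, radius_m=500.0):
    # "At the geographic area" here means within radius_m of the incident.
    return haversine_m(*responder, *incident) <= radius_m

# A responder roughly 140 m from the incident is inside a 500 m radius.
print(in_geographic_area((47.6205, -122.3493), (47.6210, -122.3510)))  # True
```

The same check can run against either the current coordinates or the intended destination in the acknowledgement, matching the "at the geographic area" and "en route to the geographic area" cases.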
- In the following description, numerous specific details are set forth in order to provide a thorough understanding of illustrative embodiments of the present disclosure. It will be apparent to one skilled in the art, however, that many embodiments of the present disclosure may be practiced without some or all of the specific details. In some instances, well-known process steps have not been described in detail in order not to unnecessarily obscure various aspects of the present disclosure. Further, it will be appreciated that embodiments of the present disclosure may employ any combination of features described herein. The illustrative examples provided herein are not intended to be exhaustive or to limit the claimed subject matter to the precise forms disclosed.
- FIG. 1 depicts an embodiment of a system 100 for communication between computing devices of responders via a network 102. The system 100 includes a responder 110 that has a computing device 112. The computing device 112 is capable of communicating via the network 102. In some embodiments, the network 102 is a wireless communication network using one or more wireless communication protocols, such as WiFi, 2G, 3G, 4G, LTE, WiMAX, Bluetooth, and the like. In some embodiments, the network 102 includes a number of different communication networks, such as a wireless network and a local area network (LAN). In the depicted embodiment, the computing device 112 includes a communication application 114 that includes instructions that cause the computing device 112 to communicate with other computing devices via the network 102. - The
system 100 also includes additional responders, each of whom has a computing device capable of communicating via the network 102. Each of the computing devices includes a communication application that includes instructions that cause the computing device to communicate with other computing devices via the network 102. - In the depicted embodiment, the
system 100 also includes a dispatch unit 160 that includes a computing device 162. In some examples, the computing device 162 includes one or more of a server, a desktop computer, a laptop computer, a tablet computer, and the like. The computing device 162 is capable of communicating via the network 102. The computing device 162 includes a communication application that includes instructions that cause the computing device 162 to establish a communication link between computing devices of other responders via the network 102. In one embodiment, the computing device 162 is used by a responder, such as a dispatcher, a supervisory responder, or any other type of responder. - In some embodiments, each of the
computing devices is a personal computing device of one of the responders, and the communication applications enable the personal computing devices of the responders to communicate with the computing device 162. - In another example, when communicating via the
network 102, the computing devices of the responders are capable of establishing communication links with one another. - In some embodiments, the
responders are associated with the same responder agency or with different responder agencies. - An embodiment of a
system 170 and examples of communication capabilities of the computing device 112 are depicted in FIG. 2. The system 170 includes the computing device 112, the network 102, the dispatch unit 160, and the dispatch computing device 162. As described above, the computing device 112 includes the communications application 114 and is capable of communicating via the network 102. The computing device 112 is also capable of communicating with any number of responder devices 116 a-n. Examples of the responder devices 116 a-n include devices worn or carried by the responder 110, such as a responder camera (e.g., an on-body camera, a dashboard camera, etc.), a conducted electrical weapon (CEW), a firearm holster, an on-body microphone, a radio, and the like. Other examples of the responder devices 116 a-n include devices associated with a vehicle of the responder 110, such as a light bar, a dashboard camera, a microphone, an in-vehicle sensor, and the like. The responder devices 116 a-n can include any other device associated with the responder 110. - In some embodiments, the
communications application 114 includes instructions that, when executed, cause the computing device 112 to communicate, via the network 102, with the dispatch computing device 162 or any other computing device in communication with the network 102. In some examples, the computing device 112 communicates via the network 102 using one or more wireless communication protocols, such as 2G, 3G, 4G, LTE, or WiMAX. In some embodiments, the communications application 114 includes instructions that, when executed, cause the computing device 112 to communicate directly with one or more of the responder devices 116 a-n. In some examples, the computing device 112 communicates directly with one or more of the responder devices 116 a-n using one or more wireless communication protocols, such as WiFi, Bluetooth, or near field communication (NFC). - In some embodiments, the
communications application 114 includes instructions that, when executed, cause the computing device 112 to send communications to the dispatch computing device 162 via the network 102. In some examples, the communications sent by the computing device 112 to the dispatch computing device 162 via the network 102 include information obtained or generated by the computing device 112. For example, communications from the computing device 112 may include audio recorded by the computing device 112, geolocation data determined by the computing device 112, environmental data (e.g., temperature, atmospheric pressure, etc.), and the like. - In some embodiments, the
communications application 114 includes instructions that, when executed, cause the computing device 112 to relay communications between the dispatch computing device 162 and the responder devices 116 a-n. In some examples, the communications can include video from an on-body camera, audio from an on-body microphone, an indication from a light bar of a vehicle that the light bar has been activated, an indication from a holster that the holster has been unlocked to allow removal of a firearm, an indication from a biometric sensor (e.g., heart rate monitor, body temperature sensor, blood pressure sensor, etc.) of biometric data about the responder 110, and the like. In some embodiments, the computing device 112 communicates with one or more of the responder devices 116 a-n using a first wireless communication protocol (e.g., WiFi, Bluetooth, etc.) and the computing device 112 communicates via the network 102 using a second wireless communication protocol (e.g., 2G, 3G, 4G, LTE, WiMAX, etc.). - In another embodiment, the
communications application 114 includes instructions that, when executed, cause the computing device 112 to process information prior to sending it via the network 102. In one example, the communications application 114 causes the computing device 112 to reduce a resolution of the information (e.g., pictures, recorded video, video streams, etc.) prior to sending the information via the network 102. In another example, the communications application 114 causes the computing device 112 to tag the information with metadata (e.g., a time of capture of the information, a location of capture of the information, etc.) prior to sending the information via the network 102. In another example, the communications application 114 causes the computing device 112 to compile multiple forms of information (e.g., text and images) into a single transmission via the network 102. - The depiction in
FIG. 2 includes an embodiment of the computing device 112. However, this embodiment is not limited only to the computing device 112. Any of the other computing devices described herein is similarly capable of being configured to communicate via the network 102 and to communicate with responder devices associated with the computing devices. - An embodiment of a
system 180 and examples of communication capabilities of the responder devices 116 a-n are depicted in FIG. 3. Each of the responder devices 116 a-n is configured to communicate, via the network 102, with the dispatch computing device 162 or any other computing device in communication with the network 102. In some examples, the responder devices 116 a-n communicate via the network 102 using one or more wireless communication protocols, such as 2G, 3G, 4G, LTE, or WiMAX. In some embodiments, the communication between the responder devices 116 a-n and the dispatch computing device 162 includes either or both of communication from the responder devices 116 a-n to the dispatch computing device 162 (e.g., video information, audio information, responder device status information, biometric data, geolocation information, etc.) and communication from the dispatch computing device 162 to the responder devices 116 a-n (e.g., an activation signal to activate one of the responder devices 116 a-n). - In both the embodiments of
systems 170 and 180 depicted in FIGS. 2 and 3, the responder devices 116 a-n and the dispatch computing device 162 are capable of communicating with each other. In another embodiment, the responder devices 116 a-n and the dispatch computing device 162 are capable of communicating with each other using a hybrid of the systems 170 and 180 depicted in FIGS. 2 and 3. In one embodiment, one of the responder devices 116 a-n and the dispatch computing device 162 communicate via the network 102, as shown in FIG. 3, to communicate low-bandwidth messages (e.g., a sensor reading from one of the responder devices 116 a-n to the dispatch computing device 162, an activation message from the dispatch computing device 162 to the one of the responder devices 116 a-n, etc.) and the responder devices 116 a-n and the dispatch computing device 162 communicate via the computing device 112 and the network 102, as shown in FIG. 2, to communicate high-bandwidth messages (e.g., video information from one of the responder devices 116 a-n to the dispatch computing device 162, etc.). Any other combination of using the two systems 170 and 180 depicted in FIGS. 2 and 3 is possible. - Communications between a responder device and a dispatch computing system can be used to automatically activate the responder device. An example of automatic activation of a
responder camera 116 m using a system 200 is depicted in FIGS. 4 to 6B. The system 200 includes the responder 110, the network 102, and the dispatch unit 160. The responder 110 has the computing device 112 with the communications application 114, and the responder 110 has the responder camera 116 m. While the responder camera 116 m depicted in FIGS. 4 to 6B is an on-body camera, the responder camera 116 m can be any type of responder camera, such as a dashboard camera in a responder vehicle. The dispatch unit 160 includes the dispatch computing device 162. The embodiments of methods shown in FIGS. 4 to 6B depict examples of a dispatch computing system 162 automatically activating the responder camera 116 m. - In
FIG. 4, the dispatch computing system 162 receives a dispatch acknowledgement 202. The dispatch acknowledgement 202 indicates that the responder 110 is at a geographic area or en route to a geographic area. In one embodiment, the dispatch computing system 162 previously received an indication of a location or event and determined the geographic area associated with the location or event. - In the embodiment shown in
FIG. 4, the dispatch computing system 162 receives the dispatch acknowledgement 202 from the computing device 112 via the network 102. In this embodiment, the computing device 112 may send the dispatch acknowledgement 202 in response to an input from the responder 110 accepting dispatch instructions to respond to the location or event or to respond to the geographic area. In one embodiment, the computing device 112 adds information to the dispatch acknowledgement 202 before sending the dispatch acknowledgement 202, such as geospatial information indicating the current location of the computing device 112 or an intended destination of the responder 110. - In some embodiments not shown in
FIG. 4, the dispatch computing system 162 is capable of receiving the dispatch acknowledgement 202 from the responder in other ways. In one embodiment, the responder 110 communicates with a dispatcher, such as via a two-way radio system, to accept a dispatch request and to provide the dispatcher with a current location and/or intended destination of the responder 110. The dispatcher enters the dispatch acknowledgement 202 into the dispatch computing system 162 with an indication that the responder 110 is at the geographic area or that the responder 110 is en route to the geographic area. There are any number of other ways by which the responder 110 provides information that is received by the dispatch computing system 162 as the dispatch acknowledgement 202. - In response to receiving the
dispatch acknowledgement 202, the dispatch computing system 162 automatically sends a camera activation signal to the responder camera 116 m associated with the responder 110. The responder camera 116 m is configured to begin capturing a video in response to receiving the camera activation signal. In this way, by the dispatch computing system 162 sending the camera activation signal in response to receiving the dispatch acknowledgement 202 indicating that the responder 110 is at the geographic area or that the responder is en route to the geographic area, the responder camera 116 m is automatically activated to capture video of events happening at and/or en route to the geographic area. Two embodiments of the dispatch computing system 162 automatically sending a camera activation signal to the responder camera 116 m associated with the responder 110 are depicted in FIGS. 5A and 5B. - In the embodiment shown in
FIG. 5A, the dispatch computing system 162 automatically sends a camera activation signal 204 a to the computing device 112 via the network 102 and the computing device 112 sends a camera activation signal 204 b to the responder camera 116 m. In response to receiving the camera activation signal 204 b, the responder camera 116 m begins capturing a video. In this way, the automatic sending of the camera activation signal 204 a by the dispatch computing system 162 causes the responder camera 116 m to begin capturing a video. The camera activation signal 204 b may be the same as the camera activation signal 204 a, or the camera activation signal 204 b may be a modified version of the activation signal 204 a (e.g., in the case where the computing device 112 processes the camera activation signal 204 a to generate the camera activation signal 204 b). - In the embodiment shown in
FIG. 5B, the dispatch computing system 162 automatically sends a camera activation signal 204 c to the responder camera 116 m via the network 102. The camera activation signal 204 c does not pass through the computing device 112. Sending the camera activation signal 204 c from the dispatch computing system 162 to the responder camera 116 m without the camera activation signal 204 c passing through the computing device 112 avoids any issues that may be associated with the computing device 112, such as the computing device 112 being without power (e.g., the battery of the computing device 112 is not charged), the computing device 112 not being in communication with the responder camera 116 m (e.g., the computing device 112 is not paired with the responder camera 116 m), and the like. - In response to receiving either the
camera activation signal 204 b or the camera activation signal 204 c, the responder camera 116 m begins capturing video. In some embodiments, video information from the video captured by the responder camera 116 m is sent to the dispatch computing system 162. Examples of sending video information from the responder camera 116 m to the dispatch computing system 162 are depicted in FIGS. 6A and 6B. - In
FIG. 6A, video information 206 a is sent from the responder camera 116 m to the computing device 112. In some embodiments, the video information 206 a is sent via a short-range communication protocol, such as Bluetooth or WiFi. The computing device 112 sends the video information 206 b to the dispatch computing system 162 via the network 102. In some embodiments, the video information 206 b is sent via a long-range communication protocol, such as 2G, 3G, 4G, LTE, or WiMAX. In the embodiment shown in FIG. 6A, the computing device 112 acts as a relay between the responder camera 116 m and the dispatch computing system 162. - In
FIG. 6B, video information 206 c is sent from the responder camera 116 m to the dispatch computing system 162 via the network 102. In some embodiments, the video information 206 c is sent via a long-range communication protocol, such as 2G, 3G, 4G, LTE, or WiMAX. In the embodiment shown in FIG. 6B, the video information 206 c is sent from the responder camera 116 m to the dispatch computing system 162 without the video information 206 c passing through the computing device 112. This embodiment avoids any issues that may be associated with the computing device 112, such as the computing device 112 being without power, the computing device 112 not being in communication with the responder camera 116 m, and the like. - In some embodiments, video information is sent from the
responder camera 116 m to the dispatch computing system 162 after completion of recording a video. For example, the responder camera 116 m may record a video for a particular amount of time and then send the video information for the video after the recording is completed. In other embodiments, at least a portion of the video information is sent from the responder camera 116 m to the dispatch computing system 162 such that the dispatch computing system 162 receives the portion of the video information before the responder camera 116 m finishes recording the video. This action by the responder camera 116 m is sometimes referred to as sending a "live stream" of the video or as sending a "streaming video," even though the dispatch computing system 162 may not receive the video information instantaneously as it is recorded by the responder camera 116 m. Delays in receiving a live stream of a video may be due to delays in processing by the responder camera 116 m, the computing device 112, and/or the dispatch computing system 162, latency in the network 102, or any other cause of delay. Both of the embodiments of sending video information from the responder camera 116 m to the dispatch computing system 162 depicted in FIGS. 6A and 6B are usable to send completed videos or to send live streams of videos. In other embodiments, video information recorded by the responder camera 116 m can be conveyed by wired mechanisms, such as by placing the responder camera 116 m in a cradle coupled to the dispatch computing system 162 (e.g., at the end of a responder's shift upon returning to the dispatch unit 160). - An example of the
dispatch computing system 162 receiving video information 206 d is depicted in FIG. 6C. As shown in FIG. 6C, the dispatch computing system 162 receives the video information 206 d. The video information 206 d may include video information from a single responder camera or a number of different responder cameras. In some embodiments, the dispatch computing system 162 includes at least one memory and the dispatch computing system 162 stores the received video information 206 d. In some embodiments, including the embodiment depicted in FIG. 6C, the dispatch computing system 162 is coupled to one or more display devices 164 and the dispatch computing system 162 displays the received video information 206 d on the one or more display devices 164. In the case where the video information 206 d includes live streams of video information from a number of different responder cameras, the dispatch computing system 162 may display the live streams on different portions 166 of the one or more display devices 164. In this way, a dispatcher or other user viewing the one or more display devices 164 will be able to see the live streams from the different responder cameras at the same time. - Examples of a dispatch computing device determining a geographic area associated with a location or event are depicted in
FIGS. 7 and 8. A map 300 is depicted in FIG. 7 with indications of a location or event 302 and responders 304 a-g. In some embodiments, the location or event 302 may be a car accident, a reported crime, a person in need of medical attention, a disabled vehicle, a business, a residence, or any other location or event. In one embodiment, a dispatch computing device receives the indication of the location or event 302. In some examples, the dispatch computing device receives the indication of the location or event 302 in the form of an address, an intersection, geospatial information, or any other information that identifies the location or event. In some examples, the indication of the location or event 302 is received by the dispatch computing device from a dispatcher in communication with a responder, from a computing device of a responder, or from any other source. - The computing device determines a
geographic area 306 associated with the location or event 302. In the embodiment shown in FIG. 7, the geographic area 306 is a substantially circular area within a predetermined distance of the location or event 302. In other embodiments, a geographic area associated with a location or event can have any other shape (e.g., rectangle, oval, etc.), have an irregular shape, be generated based on user preferences (e.g., a default geographic area shape or size), be based on a user input (e.g., an area drawn on a map by a user), or be generated in any other manner. In some embodiments, the geographic area represents an area of interest. In an effort to capture evidence about the geographic area 306, the dispatch computing device automatically activates responder cameras associated with one or more of the responders 304 a-g in response to receiving one or more dispatch acknowledgements from the one or more of the responders 304 a-g. - In one embodiment, the dispatch computing device automatically sends a camera activation signal to a responder camera associated with a responder that is at the location or
event 302. For example, the responder 304 b may have reported the location or event 302 and sent a dispatch acknowledgement that the responder 304 b is at the location or event 302. In another embodiment, the dispatch computing device sends a dispatch request to the responder 304 b prior to receiving the dispatch acknowledgement from the responder 304 b. The dispatch computing device automatically sends a camera activation signal to a responder camera associated with the responder 304 b in response to receiving the dispatch acknowledgement from the responder 304 b. The responder camera of the responder 304 b is configured to begin capturing video in response to receiving the camera activation signal. - In another embodiment, the dispatch computing device automatically sends a camera activation signal to one or more responder cameras associated with one or more responders that are within the
geographic area 306. For example, the responders 304 c and 304 d may have sent dispatch acknowledgements that the responders 304 c and 304 d are at their respective locations within the geographic area 306 shown in FIG. 7. The dispatch computing device automatically sends camera activation signals to responder cameras associated with the responders 304 c and 304 d in response to receiving the dispatch acknowledgements from the responders 304 c and 304 d. The responder cameras of the responders 304 c and 304 d are configured to begin capturing video in response to receiving the camera activation signals. In some embodiments, the dispatch acknowledgements from the responders 304 c and 304 d are received before the dispatch computing device determines the geographic area 306; in that case, sending the camera activation signals in response to receiving the dispatch acknowledgements includes sending the camera activation signals in response to both receiving the dispatch acknowledgements from the responders 304 c and 304 d and determining the geographic area 306. - In another embodiment, the dispatch computing device automatically sends a camera activation signal to a responder camera associated with a responder that is en route to the location or
event 302. For example, the responder 304 e may have sent a dispatch acknowledgement that the responder 304 e is en route to the location or event 302 as backup for the responder 304 b. The dispatch computing device automatically sends a camera activation signal to a responder camera associated with the responder 304 e in response to receiving the dispatch acknowledgement from the responder 304 e. The responder camera of the responder 304 e is configured to begin capturing video in response to receiving the camera activation signal. In some embodiments, the responder camera of the responder 304 e begins capturing video before the responder 304 e enters the geographic area 306. In this way, video is recorded while the responder 304 e is en route to the geographic area 306. In some embodiments, the dispatch computing device receives the video from the responder camera associated with the responder 304 e as a streaming video, and at least a portion of the streaming video is received by the dispatch computing device from the responder camera associated with the responder 304 e prior to the responder 304 e arriving at the geographic area 306. - In another embodiment, the dispatch computing device automatically sends a camera activation signal to one or more responder cameras associated with one or more responders that are en route to the
geographic area 306. For example, the responder 304 f may have sent a dispatch acknowledgement that the responder 304 f is en route to the location where responder 304 d is located within the geographic area 306 as backup for responder 304 d. The dispatch computing device automatically sends a camera activation signal to the responder camera associated with the responder 304 f in response to receiving the dispatch acknowledgement from the responder 304 f. The responder camera of the responder 304 f is configured to begin capturing video in response to receiving the camera activation signal. Even though the responder 304 f is not en route to the location or event 302, the responder 304 f is en route to the geographic area 306, and video captured by the responder camera of the responder 304 f may assist with the efforts of responders responding to the location or event 302. In some embodiments, the dispatch computing device receives the video from the responder camera associated with the responder 304 f as a streaming video, and at least a portion of the streaming video is received by the dispatch computing device from the responder camera associated with the responder 304 f prior to the responder 304 f arriving at the geographic area 306. - A
map 400 is depicted in FIG. 8 with indications of an event in the form of a parade 402 a, a parade route 402 b, and responders 404 a-j. In one embodiment, a dispatch computing device receives the indication of the parade route 402 b. In some examples, the dispatch computing device receives the indication of the parade route 402 b in the form of a series of addresses, a series of intersections, geospatial data of the parade route 402 b, or any other information that identifies the parade route 402 b. In some examples, the indication of the parade route 402 b is received by the dispatch computing device from a staff person at a responder agency (e.g., police department staff) or from any other source. - The computing device determines a
geographic area 406 associated with the parade route 402 b. In the embodiment shown in FIG. 8, the geographic area 406 parallels the parade route 402 b within a predetermined distance of the parade route 402 b. In some embodiments, the geographic area 406 represents an area of interest. In an effort to capture evidence about the geographic area 406, the dispatch computing device automatically activates responder cameras associated with one or more of the responders 404 a-j in response to receiving one or more dispatch acknowledgements from the one or more of the responders 404 a-j. - In one embodiment, the dispatch computing device automatically sends a camera activation signal to one or more responder cameras associated with one or more responders that are within the
geographic area 406. For example, the responders 404 a-g may have sent dispatch acknowledgements that the responders 404 a-g are at their respective locations within the geographic area 406 shown in FIG. 8. The dispatch computing device automatically sends camera activation signals to responder cameras associated with the responders 404 a-g in response to receiving the dispatch acknowledgements from the responders 404 a-g. The responder cameras of the responders 404 a-g are configured to begin capturing video in response to receiving the camera activation signals. In some embodiments, the dispatch acknowledgements from the responders 404 a-g are received before the dispatch computing device determines the geographic area 406; in that case, sending the camera activation signals in response to receiving the dispatch acknowledgements includes sending the camera activation signals in response to both receiving the dispatch acknowledgements from the responders 404 a-g and determining the geographic area 406. - In another embodiment, the dispatch computing device automatically sends a camera activation signal to one or more responder cameras associated with one or more responders that are en route to the
geographic area 406. For example, the responder 404 j may have been informed that responder 404 d requires assistance, and the responder 404 j may have sent a dispatch acknowledgement that the responder 404 j is en route to the location where responder 404 d is located within the geographic area 406. The dispatch computing device automatically sends a camera activation signal to the responder camera associated with the responder 404 j in response to receiving the dispatch acknowledgement from the responder 404 j. The responder camera of the responder 404 j is configured to begin capturing video in response to receiving the camera activation signal. - While the embodiments depicted in
FIGS. 7 and 8 include particular scenarios, such as the parade 402 a and the parade route 402 b depicted in FIG. 8, the concepts discussed above with respect to FIGS. 7 and 8 apply to any number of other situations. In one example, the embodiment discussed above with respect to FIG. 7 can be adapted for larger events, such as a sporting event, a fair, a demonstration, or any other event. In some embodiments, an event may take place over a larger area (e.g., over one or more city blocks) and the dispatch computing device determines a geographic area associated with the event based on the larger area of the event itself. In another example, the embodiment discussed above with respect to FIG. 8 can be adapted for other events that have a route, such as a vehicle chase with an anticipated chase route, a funeral procession with an expected route, or any other event that occurs over a route. In some embodiments, the dispatch computing device determines the geographic area based on a probability of an event taking a particular route (e.g., determining a geographic area associated with a vehicle chase based on a probability of the vehicle taking a particular route). - An embodiment of a
method 500 performed by a computing device to automatically activate a responder camera is depicted in FIG. 9. At block 502, the computing device receives an indication of a location or an event. As described above, the indication can be received from a responder computing device via a network, from a dispatcher in communication with a responder, from a responder agency staff member, or from any other source. At block 504, the computing device determines a geographic area associated with the location or event. As described above, the geographic area can be a regular shape, an irregular shape, a user-defined area, a predetermined distance from the location or event, or any other geographic area associated with the location or event. - At
block 506, the computing device receives a dispatch acknowledgement from a responder, where the dispatch acknowledgement indicates that the responder is at the geographic area or that the responder is en route to the geographic area. As described above, the dispatch acknowledgement can be received from a responder computing device via a network, from a dispatcher entering the dispatch acknowledgement in response to a communication between the responder and the dispatcher, or from any other source conveying the dispatch acknowledgement from the responder. At block 508, the computing device automatically sends a camera activation signal to a responder camera associated with the responder in response to receiving the dispatch acknowledgement from the responder. The responder camera is configured to begin capturing a video in response to receiving the camera activation signal. As described above, the camera activation signal may be sent from the computing device to the responder camera via a network, from the computing device to a responder computing device where the responder computing device is configured to relay the camera activation signal to the responder camera, or from the computing device to the responder camera in any other way. - Unless otherwise specified in the context of specific examples, described techniques and tools may be implemented by any suitable computing device or set of computing devices.
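- The two transfer modes described for FIGS. 6A and 6B — sending a completed video versus live streaming portions while recording continues — can be sketched as follows. This is a minimal illustration, not the disclosed implementation; the chunk list and `send` callback are hypothetical stand-ins for the responder camera's encoder output and its long range network link.

```python
def upload_after_completion(chunks, send):
    """Mode 1: buffer the entire recording, then send it as one video."""
    send(b"".join(chunks))

def upload_as_live_stream(chunks, send):
    """Mode 2: forward each chunk as soon as it is encoded, so the
    dispatch computing system receives a portion of the video before
    the responder camera finishes recording."""
    for chunk in chunks:
        send(chunk)

# Simulated three-segment recording.
recording = [b"seg0", b"seg1", b"seg2"]
completed, streamed = [], []
upload_after_completion(recording, completed.append)
upload_as_live_stream(recording, streamed.append)
```

In the streaming mode the receiver sees three separate arrivals, the first of which is available while the "recording" is still being iterated, which is why the text above hedges that a "live stream" is not received instantaneously.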
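- The two kinds of geographic areas described for FIGS. 7 and 8 — a substantially circular area within a predetermined distance of a location, and a corridor paralleling a route — both reduce to distance tests. A sketch under simplifying assumptions (spherical Earth for the circle, a local planar approximation for the corridor; the 500 m radius and 200 m width defaults are arbitrary, not taken from the disclosure):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, metres

def haversine_m(a, b):
    """Great-circle distance in metres between (lat, lon) points a and b."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(h))

def within_area(event, responder, radius_m=500.0):
    """FIG. 7 style: circular area within a predetermined distance."""
    return haversine_m(event, responder) <= radius_m

def _to_xy(point, lat0):
    """Degrees -> metres in a local planar frame near latitude lat0."""
    lat, lon = point
    return (lon * 111_320 * math.cos(math.radians(lat0)), lat * 110_540)

def _seg_dist(p, a, b):
    """Distance from point p to segment ab in the planar frame."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - ax - t * dx, py - ay - t * dy)

def within_route_corridor(point, route, width_m=200.0):
    """FIG. 8 style: corridor paralleling a route of (lat, lon) points."""
    lat0 = route[0][0]
    p = _to_xy(point, lat0)
    xy = [_to_xy(q, lat0) for q in route]
    return any(_seg_dist(p, a, b) <= width_m for a, b in zip(xy, xy[1:]))
```

A responder roughly 90 m from an event passes the circular test, while one several kilometres away fails it; likewise for points near versus far from a route polyline.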
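- Blocks 502 through 508 can then be sketched end to end. The acknowledgement fields, the area representation (a containment predicate), and the signalling callback are illustrative assumptions rather than the claimed implementation:

```python
def method_500(indication, determine_area, acknowledgements, send_activation):
    """Receive an indication (block 502), determine its geographic area
    (block 504), and for each dispatch acknowledgement from a responder
    at or en route to that area (block 506), automatically send a camera
    activation signal to that responder's camera (block 508)."""
    area_contains = determine_area(indication)
    activated = []
    for ack in acknowledgements:
        if ack["status"] == "en_route" or area_contains(ack["position"]):
            send_activation(ack["camera_id"])  # camera begins capturing
            activated.append(ack["camera_id"])
    return activated

# Hypothetical area: a small box of ~0.01 degrees around the event,
# standing in for a real geofence.
def determine_area(event_pos):
    return lambda p: (abs(p[0] - event_pos[0]) + abs(p[1] - event_pos[1])) < 0.01

sent = []
activated = method_500(
    (47.6062, -122.3321),
    determine_area,
    [
        {"status": "on_scene", "position": (47.6065, -122.3320), "camera_id": "cam-304b"},
        {"status": "en_route", "position": (47.6500, -122.3500), "camera_id": "cam-304e"},
        {"status": "unavailable", "position": (47.9000, -122.9000), "camera_id": "cam-304g"},
    ],
    sent.append,
)
```

Here the on-scene responder inside the area and the en-route responder both receive activation signals, while the unavailable responder outside the area does not, mirroring the dispatch-acknowledgement behaviour described above.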
- In any of the described examples, a data store contains data as described herein and may be hosted, for example, by a database management system (DBMS) to allow a high level of data throughput between the data store and other components of a described system. The DBMS may also allow the data store to be reliably backed up and to maintain a high level of availability. For example, a data store may be accessed by other system components via a network, such as a private network in the vicinity of the system, a secured transmission channel over the public Internet, a combination of private and public networks, and the like. Instead of, or in addition to a DBMS, a data store may include structured data stored as files in a traditional file system. Data stores may reside on computing devices that are part of or separate from components of systems described herein. Separate data stores may be combined into a single data store, or a single data store may be split into two or more separate data stores.
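- As a purely illustrative data store of the kind described above, received video information could be kept in a structured store. A minimal sketch using an in-memory SQLite database with an invented schema (a production deployment would more likely use a networked DBMS with backup and high availability, as noted):

```python
import sqlite3

# Invented schema for records describing received video information.
db = sqlite3.connect(":memory:")
db.execute(
    """CREATE TABLE video_information (
           camera_id   TEXT    NOT NULL,
           received_at TEXT    NOT NULL,
           num_bytes   INTEGER NOT NULL
       )"""
)
db.executemany(
    "INSERT INTO video_information VALUES (?, ?, ?)",
    [
        ("cam-304b", "2015-11-12T10:00:00Z", 1_048_576),
        ("cam-304e", "2015-11-12T10:00:07Z", 2_097_152),
    ],
)
# Aggregate query of the kind a dispatch console might run.
total_bytes = db.execute(
    "SELECT SUM(num_bytes) FROM video_information"
).fetchone()[0]
```

Swapping `":memory:"` for a file path, or the `sqlite3` module for a client of a server-based DBMS, matches the note above that a data store may reside on the same device or be reached over a network.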
- Some of the functionality described herein may be implemented in the context of a client-server relationship. In this context, server devices may include suitable computing devices configured to provide information and/or services described herein. Server devices may include any suitable computing devices, such as dedicated server devices. Server functionality provided by server devices may, in some cases, be provided by software (e.g., virtualized computing instances or application objects) executing on a computing device that is not a dedicated server device. The term “client” can be used to refer to a computing device that obtains information and/or accesses services provided by a server over a communication link. However, the designation of a particular device as a client device does not necessarily require the presence of a server. At various times, a single device may act as a server, a client, or both a server and a client, depending on context and configuration. Actual physical locations of clients and servers are not necessarily important, but the locations can be described as “local” for a client and “remote” for a server to illustrate a common usage scenario in which a client is receiving information provided by a server at a remote location.
-
FIG. 10 depicts a block diagram that illustrates aspects of an illustrative computing device 600 appropriate for use in accordance with embodiments of the present disclosure. The description below is applicable to servers, personal computers, mobile phones, smart phones, tablet computers, embedded computing devices, and other currently available or yet to be developed devices that may be used in accordance with embodiments of the present disclosure. - In its most basic configuration, the
computing device 600 includes at least one processor 602 and a system memory 604 connected by a communication bus 606. Depending on the exact configuration and type of device, the system memory 604 may be volatile or nonvolatile memory, such as read only memory ("ROM"), random access memory ("RAM"), EEPROM, flash memory, or other memory technology. Those of ordinary skill in the art and others will recognize that system memory 604 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 602. In this regard, the processor 602 may serve as a computational center of the computing device 600 by supporting the execution of instructions. - As further illustrated in
FIG. 10, the computing device 600 may include a network interface 610 comprising one or more components for communicating with other devices over a network. Embodiments of the present disclosure may access basic services that utilize the network interface 610 to perform communications using common network protocols. The network interface 610 may also include a wireless network interface configured to communicate via one or more wireless communication protocols, such as WiFi, 2G, 3G, 4G, LTE, WiMAX, Bluetooth, and/or the like. - In the illustrative embodiment depicted in
FIG. 10, the computing device 600 also includes a storage medium 608. However, services may be accessed using a computing device that does not include means for persisting data to a local storage medium. Therefore, the storage medium 608 depicted in FIG. 10 is optional. In any event, the storage medium 608 may be volatile or nonvolatile, removable or non-removable, implemented using any technology capable of storing information such as, but not limited to, a hard drive, solid state drive, CD ROM, DVD, or other disk storage, magnetic tape, magnetic disk storage, and/or the like. - As used herein, the term "computer-readable medium" includes volatile and nonvolatile and removable and non-removable media implemented in any method or technology capable of storing information, such as computer-readable instructions, data structures, program modules, or other data. In this regard, the
system memory 604 and storage medium 608 depicted in FIG. 10 are examples of computer-readable media. - For ease of illustration and because it is not important for an understanding of the claimed subject matter,
FIG. 10 does not show some of the typical components of many computing devices. In this regard, the computing device 600 may include input devices, such as a keyboard, keypad, mouse, trackball, microphone, video camera, touchpad, touchscreen, electronic pen, stylus, and/or the like. Such input devices may be coupled to the computing device 600 by wired or wireless connections including RF, infrared, serial, parallel, Bluetooth, USB, or other suitable connection protocols using wireless or physical connections. - In any of the described examples, data can be captured by input devices and transmitted or stored for future processing. The processing may include encoding data streams, which can be subsequently decoded for presentation by output devices. Media data can be captured by multimedia input devices and stored by saving media data streams as files on a computer-readable storage medium (e.g., in memory or persistent storage on a client device, server, administrator device, or some other device). Input devices can be separate from and communicatively coupled to computing device 600 (e.g., a client device), or can be integral components of the
computing device 600. In some embodiments, multiple input devices may be combined into a single, multifunction input device (e.g., a video camera with an integrated microphone). The computing device 600 may also include output devices such as a display, speakers, printer, etc. The output devices may include video output devices such as a display or touchscreen. The output devices also may include audio output devices such as external speakers or earphones. The output devices can be separate from and communicatively coupled to the computing device 600, or can be integral components of the computing device 600. Input functionality and output functionality may be integrated into the same input/output device (e.g., a touchscreen). Any suitable input device, output device, or combined input/output device either currently known or developed in the future may be used with described systems. - In general, functionality of computing devices described herein may be implemented in computing logic embodied in hardware or software instructions, which can be written in a programming language, such as C, C++, COBOL, JAVA™, PHP, Perl, Python, Ruby, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft.NET™ languages such as C#, and/or the like. Computing logic may be compiled into executable programs or written in interpreted programming languages. Generally, functionality described herein can be implemented as logic modules that can be duplicated to provide greater processing capability, merged with other modules, or divided into sub modules. The computing logic can be stored in any type of computer-readable medium (e.g., a non-transitory medium such as a memory or storage medium) or computer storage device and be stored on and executed by one or more general purpose or special purpose processors, thus creating a special purpose computing device configured to provide functionality described herein.
- Many alternatives to the systems and devices described herein are possible. For example, individual modules or subsystems can be separated into additional modules or subsystems or combined into fewer modules or subsystems. As another example, modules or subsystems can be omitted or supplemented with other modules or subsystems. As another example, functions that are indicated as being performed by a particular device, module, or subsystem may instead be performed by one or more other devices, modules, or subsystems. Although some examples in the present disclosure include descriptions of devices comprising specific hardware components in specific arrangements, techniques and tools described herein can be modified to accommodate different hardware components, combinations, or arrangements. Further, although some examples in the present disclosure include descriptions of specific usage scenarios, techniques and tools described herein can be modified to accommodate different usage scenarios. Functionality that is described as being implemented in software can instead be implemented in hardware, or vice versa.
- Many alternatives to the techniques described herein are possible. For example, processing stages in the various techniques can be separated into additional stages or combined into fewer stages. As another example, processing stages in the various techniques can be omitted or supplemented with other techniques or processing stages. As another example, processing stages that are described as occurring in a particular order can instead occur in a different order. As another example, processing stages that are described as being performed in a series of steps may instead be handled in a parallel fashion, with multiple modules or software processes concurrently handling one or more of the illustrated processing stages. As another example, processing stages that are indicated as being performed by a particular device or module may instead be performed by one or more other devices or modules.
- Embodiments disclosed herein include a computer-implemented method for performing one or more of the above-described techniques; a computing device comprising a processor and computer-readable storage media having stored thereon computer executable instructions configured to cause the computing device to perform one or more of the above-described techniques; a computer-readable storage medium having stored thereon computer executable instructions configured to cause a computing device to perform one or more of the above-described techniques; and a computing system comprising a server that provides one or more of the above-described services. The computing system may further comprise plural client computing devices, including a client computing device in communication with a server that provides one or more of the above-described services, the client computing device comprising a processing unit and computer-readable storage media having stored thereon computer executable instructions configured to cause the client computing device to perform one or more of the above-described techniques.
- The principles, representative embodiments, and modes of operation of the present disclosure have been described in the foregoing description. However, aspects of the present disclosure which are intended to be protected are not to be construed as limited to the particular embodiments disclosed. Further, the embodiments described herein are to be regarded as illustrative rather than restrictive. It will be appreciated that variations and changes may be made by others, and equivalents employed, without departing from the spirit of the present disclosure. Accordingly, it is expressly intended that all such variations, changes, and equivalents fall within the spirit and scope of the claimed subject matter.
Claims (20)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/436,781 US11356591B2 (en) | 2015-11-12 | 2019-06-10 | Dispatch-based responder camera activation |
US17/834,861 US11902654B2 (en) | 2015-11-12 | 2022-06-07 | Dispatch-based responder camera activation |
US18/440,814 US20240187729A1 (en) | 2015-11-12 | 2024-02-13 | Dispatch-Based Responder Camera Activation |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/939,543 US10321039B2 (en) | 2015-11-12 | 2015-11-12 | Dispatch-based responder camera activation |
US16/436,781 US11356591B2 (en) | 2015-11-12 | 2019-06-10 | Dispatch-based responder camera activation |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US201514939453A Continuation | 2007-10-04 | 2015-11-12 | |
US14/939,543 Continuation US10321039B2 (en) | 2015-11-12 | 2015-11-12 | Dispatch-based responder camera activation |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/834,861 Continuation US11902654B2 (en) | 2015-11-12 | 2022-06-07 | Dispatch-based responder camera activation |
Publications (3)
Publication Number | Publication Date |
---|---|
US20190294411A1 US20190294411A1 (en) | 2019-09-26 |
US20200301658A9 true US20200301658A9 (en) | 2020-09-24 |
US11356591B2 US11356591B2 (en) | 2022-06-07 |
Family
ID=58692220
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/939,543 Active 2035-12-30 US10321039B2 (en) | 2015-11-12 | 2015-11-12 | Dispatch-based responder camera activation |
US16/436,781 Active 2036-03-17 US11356591B2 (en) | 2015-11-12 | 2019-06-10 | Dispatch-based responder camera activation |
US17/834,861 Active US11902654B2 (en) | 2015-11-12 | 2022-06-07 | Dispatch-based responder camera activation |
US18/440,814 Pending US20240187729A1 (en) | 2015-11-12 | 2024-02-13 | Dispatch-Based Responder Camera Activation |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/939,543 Active 2035-12-30 US10321039B2 (en) | 2015-11-12 | 2015-11-12 | Dispatch-based responder camera activation |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/834,861 Active US11902654B2 (en) | 2015-11-12 | 2022-06-07 | Dispatch-based responder camera activation |
US18/440,814 Pending US20240187729A1 (en) | 2015-11-12 | 2024-02-13 | Dispatch-Based Responder Camera Activation |
Country Status (1)
Country | Link |
---|---|
US (4) | US10321039B2 (en) |
US8314683B2 (en) * | 2008-05-09 | 2012-11-20 | The Israelife Foundation | Incident response system |
US9509867B2 (en) | 2008-07-08 | 2016-11-29 | Sony Corporation | Methods and apparatus for collecting image data |
US8275404B2 (en) | 2008-10-29 | 2012-09-25 | Google Inc. | Managing and monitoring emergency services sector resources |
US8368754B2 (en) | 2009-03-12 | 2013-02-05 | International Business Machines Corporation | Video pattern recognition for automating emergency service incident awareness and response |
US10419722B2 (en) | 2009-04-28 | 2019-09-17 | Whp Workflow Solutions, Inc. | Correlated media source management and response control |
US8311983B2 (en) | 2009-04-28 | 2012-11-13 | Whp Workflow Solutions, Llc | Correlated media for distributed sources |
CN102770046B (en) | 2009-12-24 | 2015-04-29 | 诺罗克咖啡桌私人有限公司 | Stabilization of objects |
US8504090B2 (en) | 2010-03-29 | 2013-08-06 | Motorola Solutions, Inc. | Enhanced public safety communication system |
US8923799B2 (en) * | 2010-05-14 | 2014-12-30 | The Cordero Group | Method and system for an automated dispatch protocol |
US8412254B2 (en) | 2010-06-02 | 2013-04-02 | R&L Carriers, Inc. | Intelligent wireless dispatch systems |
US8600338B2 (en) | 2011-01-28 | 2013-12-03 | Brent Perrott | Fire alarm text response system |
US9154740B2 (en) * | 2011-06-29 | 2015-10-06 | Zap Group Llc | System and method for real time video streaming from a mobile device or other sources through a server to a designated group and to enable responses from those recipients |
KR101882442B1 (en) * | 2011-12-21 | 2018-07-26 | 엘지전자 주식회사 | Mobile terminal, server, method for controlling of the mobile terminal, mehtod for controlling of the server |
US9019431B2 (en) * | 2012-09-28 | 2015-04-28 | Digital Ally, Inc. | Portable video and imaging system |
US8873719B2 (en) | 2013-01-31 | 2014-10-28 | Jeffrey J. Clawson | Active assailant protocol for emergency dispatch |
US20140243036A1 (en) | 2013-02-27 | 2014-08-28 | Phillip A. Kouwe | System and method for emergency response |
US20140274152A1 (en) * | 2013-03-14 | 2014-09-18 | Tim Lichti | System and Method for Tracking of Mobile Resources |
US9467662B2 (en) | 2013-03-21 | 2016-10-11 | Jeffrey Childers | Emergency response system and method |
US9253452B2 (en) * | 2013-08-14 | 2016-02-02 | Digital Ally, Inc. | Computer program, method, and system for managing multiple data recording devices |
US20150086175A1 (en) * | 2013-09-25 | 2015-03-26 | Mobile-Vision, Inc. | Integrated video and audio recording and transmission |
US9767677B1 (en) * | 2014-01-26 | 2017-09-19 | Prescott N Paulin | Data driven emergency notification application and system |
US10171980B2 (en) | 2014-04-08 | 2019-01-01 | Jason Friesen | Systems and methods for emergency response dispatch |
US20150347079A1 (en) | 2014-05-29 | 2015-12-03 | LifeSaver Int'l Inc | Electronic device for determining when an officer is in a foot pursuit, a fight, has been incapacitated, or shots have been fired |
JP6212446B2 (en) * | 2014-07-16 | 2017-10-11 | 本田技研工業株式会社 | Automatic transmission hydraulic pressure abnormality detection device |
US20160042767A1 (en) | 2014-08-08 | 2016-02-11 | Utility Associates, Inc. | Integrating data from multiple devices |
US20160094773A1 (en) * | 2014-09-29 | 2016-03-31 | Apple Inc. | Collaborative Image Collection And Processing Using Portable Cameras |
US10025829B2 (en) * | 2014-12-19 | 2018-07-17 | Conduent Business Services, Llc | Computer-implemented system and method for analyzing organizational performance from episodic data |
US20160366327A1 (en) * | 2015-06-09 | 2016-12-15 | Collateral Opportunities, Llc | Method and system for determining whether a law enforcement instrument has been removed and concurrently activating a body camera |
US20170124505A1 (en) * | 2015-11-03 | 2017-05-04 | Motorola Solutions, Inc. | Dispatch controller and method for assigning a role of pursuit vehicle |
- 2015-11-12: US application US14/939,543 issued as patent US10321039B2 (en), status Active
- 2019-06-10: US application US16/436,781 issued as patent US11356591B2 (en), status Active
- 2022-06-07: US application US17/834,861 issued as patent US11902654B2 (en), status Active
- 2024-02-13: US application US18/440,814 published as US20240187729A1 (en), status Pending
Also Published As
Publication number | Publication date |
---|---|
US20170142316A1 (en) | 2017-05-18 |
US10321039B2 (en) | 2019-06-11 |
US11902654B2 (en) | 2024-02-13 |
US20190294411A1 (en) | 2019-09-26 |
US20240187729A1 (en) | 2024-06-06 |
US11356591B2 (en) | 2022-06-07 |
US20220303449A1 (en) | 2022-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11638124B2 (en) | | Event-based responder dispatch |
US11902654B2 (en) | | Dispatch-based responder camera activation |
US11510044B2 (en) | | Communication between responders |
US20200042945A1 (en) | | Collaborative work environment for computer-aided dispatch |
US9641965B1 (en) | | Method, system and computer program product for law enforcement |
JP2020064451A (en) | | Information processing apparatus, information processing system, and information processing method |
EP3375183B1 (en) | | Dispatch-based responder camera activation |
US20200184738A1 (en) | | Rideshare safety system |
CA3037619C (en) | | Event-based responder dispatch |
WO2023133348A1 (en) | | Using multiple geofences to initiate recording devices |
WO2017034554A1 (en) | | Communication between responders |
US20240037947A1 (en) | | System and method for identifying human interaction limitations based on historical information |
EP4454315A1 (en) | | System and method for redaction of data that is incidentally recorded |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| AS | Assignment | Owner name: AXON ENTERPRISE, INC., ARIZONA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BOHLANDER, MICHAEL J.;FORTNA, RAYMOND T.;HUANG, ANTHONY G.;AND OTHERS;REEL/FRAME:051392/0572; Effective date: 20151111 |
| FEPP | Fee payment procedure | Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PTGR); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: ALLOWED -- NOTICE OF ALLOWANCE NOT YET MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |