US20210200826A1 - Appearance search for incident response - Google Patents

Appearance search for incident response

Info

Publication number
US20210200826A1
US20210200826A1 (application US16/727,327)
Authority
US
United States
Prior art keywords
update
search criteria
search
command server
appearance
Prior art date
Legal status
Abandoned
Application number
US16/727,327
Inventor
Francesca Schuler
Larry D. Freeman
Christine K. Rapala
Current Assignee
Motorola Solutions Inc
Original Assignee
Motorola Solutions Inc
Priority date
Filing date
Publication date
Application filed by Motorola Solutions Inc
Priority to US16/727,327
Assigned to MOTOROLA SOLUTIONS, INC. (assignors: Christine K. Rapala, Francesca Schuler, Larry D. Freeman)
Priority to EP20834071.1A (published as EP4081913A1)
Priority to PCT/US2020/063817 (published as WO2021133545A1)
Publication of US20210200826A1
Priority to US18/465,543 (published as US20240004942A1)

Classifications

    • G: Physics
    • G06: Computing; calculating or counting
    • G06F: Electric digital data processing
    • G06F16/00: Information retrieval; database structures therefor; file system structures therefor
    • G06F16/43: Querying (information retrieval of multimedia data, e.g. slideshows comprising image and additional audio data)
    • G06F16/9536: Search customisation based on social or collaborative filtering (retrieval from the web; querying, e.g. by the use of web search engines)
    • G06F16/9537: Spatial or temporal dependent retrieval, e.g. spatiotemporal queries (retrieval from the web; querying)
    • G06F16/9538: Presentation of query results (retrieval from the web; querying)
    • G06K9/00671
    • G06K9/00771
    • G06V: Image or video recognition or understanding
    • G06V20/20: Scenes; scene-specific elements in augmented reality scenes
    • G06V20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects (context or environment of the image)

Definitions

  • FIG. 5 illustrates one example chart 500 for determining a confidence level of an update.
  • In this example, the command server 110 determines the confidence level of an update requested by a responding officer based on an observation by multiple responding officers.
  • The responding officer may provide the new information to the command server 110 (for example, by using an application on the two-way radio or smart telephone communicating with the command server 110), which may automatically trigger the update based on the new information.
  • The responding officer may not specifically request an update, but may provide a report of an observation. The report may turn into a request for an update or may trigger the update when the calculated weight number reaches the threshold confidence level.
  • The chart 500 includes a first column showing the plurality of parameters, a second column showing predetermined parameter weights assigned to the plurality of parameters, a third column showing whether and/or the amount to which the update satisfies a parameter, and a fourth column showing the calculated weight number for the update for the plurality of parameters. Since the update was based on manual observation by a responding officer, the command server 110 assigns zero calculated weight numbers for the parameters: (i) number of instances of update match in history; (ii) number of instances of update match in active search; and (iii) whether the update was derived via data capture. The command server 110 assigns weight numbers of 2, 2, and 4 respectively based on the update being manually observed, observed by multiple responding officers, and observed by responding officers with different roles. The command server 110 adds the calculated weight numbers to determine that the confidence level is 8, or 50% of the maximum of 16.
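  • The arithmetic of chart 500 is simple enough to restate as code. The following minimal sketch reproduces the worked example; the 2, 2, and 4 weights come from the example itself, the split of the remaining eight points across the first three parameters is an assumption chosen only so that the maximum total is 16, and all identifiers are illustrative.

```python
# Sketch of the chart 500 calculation; names and the first three weights
# are assumptions, constrained only by the stated maximum of 16.
WEIGHTS = {
    "matches_in_history": 2,
    "matches_in_active_search": 2,
    "derived_via_data_capture": 4,
    "manually_observed_by_role": 2,           # from the worked example
    "observed_by_multiple_of_single_role_type": 2,
    "observed_across_multiple_role_types": 4,
}
MAX_SCORE = sum(WEIGHTS.values())             # 16

def confidence(satisfied: set) -> tuple[int, float]:
    """Sum the weights of the satisfied parameters; also return a percentage."""
    score = sum(w for p, w in WEIGHTS.items() if p in satisfied)
    return score, 100 * score / MAX_SCORE

# The update of chart 500: manually observed by multiple officers of
# different role types, with no data-capture or search-history support.
score, pct = confidence({"manually_observed_by_role",
                         "observed_by_multiple_of_single_role_type",
                         "observed_across_multiple_role_types"})
print(score, pct)                             # 8 50.0
```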
  • The method 400 includes creating, using the command server 110, an updated search criteria based on the search criteria and the update when the confidence level of the update exceeds a predetermined confidence threshold (at block 450).
  • The predetermined confidence threshold may be preset in the command server 110.
  • A different predetermined confidence threshold may be stored for each of several types of incidents.
  • Alternatively, the predetermined confidence threshold may be programmatically calculated based on predetermined criteria (for example, incident type, incident location, and the like) stored in the command server 110.
  • For example, the predetermined confidence threshold may be lower for an amber alert and higher for a prisoner escape incident, as in the sketch below.
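  • A minimal sketch of such per-incident thresholds follows; the specific numbers and names are assumptions for illustration, not values from the specification.

```python
# Illustrative per-incident thresholds, expressed as percentages of the
# maximum confidence score; the values themselves are assumptions.
INCIDENT_THRESHOLDS = {"amber alert": 40, "prisoner escape": 75}
DEFAULT_THRESHOLD = 50

def threshold_for(incident_type: str) -> int:
    """Return the confidence threshold configured for an incident type."""
    return INCIDENT_THRESHOLDS.get(incident_type, DEFAULT_THRESHOLD)

print(threshold_for("amber alert"), threshold_for("prisoner escape"))  # 40 75
```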
  • When the command server 110 determines that the confidence level meets or exceeds the predetermined confidence threshold, the command server 110 generates the updated search criteria based on the original search criteria and the update.
  • For example, when a suspect changes the license plates on a vehicle, the update may include removing the previous license plate number and adding the new license plate number.
  • The updated search criteria may then include the previous search criteria with the license plate number updated to the new license plate number. Accordingly, the update may include one or more of an amendment, a deletion, and an addition of a characteristic provided in the original or previously updated search criteria, as in the sketch below.
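  • The following sketch illustrates one way block 450 could merge an update into the criteria, assuming a simple key-value representation of characteristics; the representation, the function name, and the use of None to mark a deletion are assumptions of this sketch.

```python
# Sketch of creating the updated search criteria: an update may amend,
# delete, or add characteristics. None marking a deletion is an assumption.
def create_updated_criteria(criteria: dict, update: dict) -> dict:
    updated = dict(criteria)
    for key, value in update.items():
        if value is None:
            updated.pop(key, None)     # deletion of a characteristic
        else:
            updated[key] = value       # amendment or addition
    return updated

current = {"make": "Ford", "color": "red", "plate": "ABC-1234"}
print(create_updated_criteria(current, {"plate": "XYZ-9876"}))
# {'make': 'Ford', 'color': 'red', 'plate': 'XYZ-9876'}
```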
  • When the confidence level of the update does not meet or exceed the predetermined confidence threshold, the command server 110 may discard the update and not create the updated search criteria, or may cause a prompt to be provided to a dispatcher or other person in the manner already set forth above.
  • In some embodiments, the command server 110 saves the update for future implementation when the confidence level does not meet or exceed the predetermined confidence threshold.
  • The saved or shelved update may be used as a parameter to weigh the confidence level of a future update as described above. For example, if a similar update to the search criteria is received in the future, the future update may result in a higher confidence level based on the saved or shelved update.
  • The appearance search is conducted based on the original search criteria or a previously updated search criteria when the confidence level of the current update does not meet or exceed the predetermined confidence threshold.
  • The method 400 also includes pushing the updated search criteria for the appearance search to the plurality of responding devices 130 (at block 460).
  • The command server 110 executes the dispatch application 270 to push the updated search criteria to the plurality of responding devices 130.
  • The plurality of responding devices 130 then conduct the appearance search based on the updated search criteria and may disregard the original search criteria.
  • In some embodiments, the command server 110 receives, via the transceiver 230, an acknowledgement from the plurality of responding devices 130 in response to the plurality of responding devices 130 receiving the updated search criteria.
  • The method 400 repeats for any additional update received to the search criteria.
  • In some embodiments, the command server 110 generates a historic report for the appearance search.
  • The command server 110 stores the initial search criteria and every update received for the search criteria.
  • The updates may be stored with the time at which each update was received and the reason for receiving the update to the search criteria. For example, the command server 110 stores whether the update was received based on a tip from a citizen to a 911 emergency call taker or from a responding police officer informing a CAD dispatcher.
  • The command server 110 may also store whether the update was implemented, discarded, or saved for future implementation.
  • The command server 110 may store the confidence level, including the weight calculated for each parameter of the update to the search criteria.
  • The historic report generated by the command server 110 may include the timestamp for each update to the search criteria and the reason for receiving the update.
  • The historic report may also include the weights for each parameter for each of the updates and the confidence level for each of the updates, as in the sketch below.
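  • One plausible shape for an entry in such a historic report is sketched below; the field names are assumptions that mirror the items the text says are stored per update.

```python
import time
from dataclasses import dataclass

# Illustrative historic-report entry: timestamp, reason, per-parameter
# weights, confidence level, and disposition. Names are assumptions.
@dataclass
class HistoricEntry:
    received_at: float
    reason: str
    parameter_weights: dict
    confidence_level: float
    disposition: str          # "implemented", "discarded", or "saved"

report: list = []
report.append(HistoricEntry(time.time(),
                            "tip from citizen to 911 emergency call taker",
                            {"manually_observed_by_role": 2},
                            12.5, "saved"))
```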
  • Manual feedback of when an update should have been triggered but was not, or when an update was triggered but should not have been, may be fed into a machine learning program, along with the parameters, weights, and threshold levels set forth above, to create a trained model for updating appearance searches in the future, among other possibilities.
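  • A minimal sketch of that feedback loop follows, assuming the six confidence parameters as features and scikit-learn as the learner; neither assumption comes from the specification, and the training rows below are illustrative only.

```python
# Labeled examples of updates that should (1) or should not (0) have been
# triggered, described by the six parameter flags, train a simple model.
from sklearn.linear_model import LogisticRegression

X = [
    [0, 0, 0, 1, 1, 1],   # manual observation across role types
    [1, 1, 1, 0, 0, 0],   # strong data-capture and search-history support
    [0, 0, 0, 1, 0, 0],   # a single manual report
]
y = [1, 1, 0]             # 1 = update should have been applied

model = LogisticRegression().fit(X, y)
print(model.predict([[0, 1, 1, 1, 0, 0]]))   # predicted disposition
```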
  • Some embodiments may be comprised of one or more generic or specialized processors (or "processing devices") such as microprocessors, digital signal processors, customized processors, and field programmable gate arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein.
  • An embodiment can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (for example, comprising a processor) to perform a method as described and claimed herein.
  • Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory.

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Telephonic Communication Services (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

Apparatus and method for appearance search for incident response. One embodiment provides a method for appearance searching for incident response. The method includes receiving a search criteria at a command server and initiating an appearance search based on the search criteria. The appearance search is performed on a plurality of responding devices communicatively connected to the command server. The method further includes receiving, at the command server, an update to the search criteria and determining, using the command server, a confidence level of the update based on weights assigned to each of a plurality of parameters of the update. The method also includes creating, using the command server, an updated search criteria based on the search criteria and the update when the confidence level of the update exceeds a predetermined confidence threshold and pushing the updated search criteria for the appearance search to the plurality of responding devices.

Description

    BACKGROUND OF THE INVENTION
  • Appearance searches are performed with search engines at intelligence centers (for example, real time crime centers, campus security services, departments responsible for amber and/or silver alerts, and the like). As the name implies, the objective of such a search is to locate a person or object of interest across multiple databases or sources of information. In some instances, an appearance search generates a playlist including a specified vehicle or person identified across images and videos in a particular geographical area. Appearance searches are performed during post-incident analysis and are generally used for evidentiary purposes.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention, and explain various principles and advantages of those embodiments.
  • FIG. 1 is an appearance search system for incident response in accordance with some embodiments.
  • FIG. 2 is a block diagram of a command server of the system of FIG. 1 in accordance with some embodiments.
  • FIG. 3 is a block diagram of a responding device of the system of FIG. 1 in accordance with some embodiments.
  • FIG. 4 is a flowchart of a method for incident response in accordance with some embodiments.
  • FIG. 5 is a chart showing an example of confidence level calculation performed by the command server of FIG. 2 in accordance with some embodiments.
  • Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
  • The apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Currently, real-time appearance search is being used more actively during incident response. Even so, search criteria for an appearance search initiated at an intelligence center are used minimally during incident response by dispatchers and responders: analytics for the search criteria of the appearance search are pre-provisioned and dispatched prior to, or one time during, incident response. However, the search criteria for the appearance search may need to be updated during the incident response due to changing circumstances.
  • Accordingly, there is a need for real-time analysis, vetting, and distribution of updates to search criteria received during incident response to critical roles (for example, intelligence center analysts, computer aided dispatch (CAD) dispatchers, first responders, and the like).
  • One embodiment provides a command server for appearance searching for incident response. The command server includes a transceiver enabling communication between the command server and a plurality of responding devices and an electronic processor coupled to the transceiver. The electronic processor is configured to receive a search criteria and initiate an appearance search based on the search criteria. The appearance search is performed on the plurality of responding devices. The electronic processor is further configured to receive an update to the search criteria and determine a confidence level of the update based on weights assigned to a plurality of parameters of the update. The electronic processor is also configured to create an updated search criteria based on the search criteria and the update when the confidence level of the update exceeds a predetermined confidence threshold and push the updated search criteria for the appearance search to the plurality of responding devices.
  • Another embodiment provides a method for appearance searching for incident response. The method includes receiving a search criteria at a command server and initiating, using the command server, an appearance search based on the search criteria. The appearance search is performed on a plurality of responding devices communicatively connected to the command server. The method further includes receiving, at the command server, an update to the search criteria and determining, using the command server, a confidence level of the update based on weights assigned to each of a plurality of parameters of the update. The method also includes creating, using the command server, an updated search criteria based on the search criteria and the update when the confidence level of the update exceeds a predetermined confidence threshold and pushing the updated search criteria for the appearance search to the plurality of responding devices.
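  • Reduced to code, the claimed method is a short control loop. The following is a minimal sketch, assuming a key-value criteria representation and pre-vetted updates; every identifier is an assumption of this sketch, since the specification defines no programming interface.

```python
# Minimal, self-contained sketch of the claimed flow (blocks 410-460).
def push_to_devices(criteria: dict) -> None:
    """Stand-in for dispatching search criteria to the responding devices."""
    print("dispatching search criteria:", criteria)

def apply_update(criteria: dict, update: dict) -> dict:
    """Stand-in for creating the updated search criteria (block 450)."""
    return {**criteria, **update}

def run_appearance_search(criteria: dict, vetted_updates: list) -> dict:
    push_to_devices(criteria)                              # blocks 410-420
    for update, confidence, threshold in vetted_updates:   # blocks 430-440
        if confidence >= threshold:                        # block 450
            criteria = apply_update(criteria, update)
            push_to_devices(criteria)                      # block 460
    return criteria

final = run_appearance_search({"color": "red", "plate": "ABC-1234"},
                              [({"plate": "XYZ-9876"}, 8, 8)])
print(final)   # {'color': 'red', 'plate': 'XYZ-9876'}
```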
  • With reference to FIG. 1, a system 100 for appearance search for incident response includes a command server 110, data sources 120, and a plurality of responding devices 130A-E. The plurality of responding devices 130A-E communicate with the command server 110 over a communication network 140. The system 100 may include more or fewer components than those illustrated in FIG. 1 and may perform additional functions other than those described herein. The command server 110 is a computing device implemented in a cloud infrastructure or located at an intelligence center or other location. The intelligence center is, for example, a control point for coordinating an incident response (for example, a real time crime center (RTCC), a dispatch center, a campus security center, and the like). The plurality of responding devices 130A-E include, for example, body-worn devices of a first responder 130A (for example, a body-worn camera, a portable two-way radio, and the like), an unmanned aerial vehicle device 130B, devices in vehicles responding to an incident 130C (for example, a dashboard camera, a mobile two-way radio, and the like), smart telephones of responding officers or citizens 130D, and fixed video cameras 130E (for example, closed-circuit television cameras (CCTVs), surveillance cameras, traffic cameras, and the like). The plurality of responding devices 130A-E may be singularly referred to as a responding device 130.
  • FIG. 2 is a block diagram of one embodiment of the command server 110. In the example illustrated, the command server 110 includes an electronic processor 210, a memory 220, a transceiver 230, and an input/output interface 240. The electronic processor 210, the memory 220, the transceiver 230, and the input/output interface 240 communicate over one or more control and/or data buses (for example, a communication bus 250). FIG. 2 illustrates only one example embodiment of the command server 110. The command server 110 may include more or fewer components and may perform functions other than those explicitly described herein.
  • In some embodiments, the electronic processor 210 is implemented as a microprocessor with separate memory, such as the memory 220. In other embodiments, the electronic processor 210 may be implemented as a microcontroller (with memory 220 on the same chip). In other embodiments, the electronic processor 210 may be implemented using multiple processors. In addition, the electronic processor 210 may be implemented partially or entirely as, for example, a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), and the like, in which case the memory 220 may not be needed or may be modified accordingly. In the example illustrated, the memory 220 includes non-transitory, computer-readable memory that stores instructions that are received and executed by the electronic processor 210 to carry out the functionality of the command server 110 described herein. The memory 220 may include, for example, a program storage area and a data storage area. The program storage area and the data storage area may include combinations of different types of memory, such as read-only memory and random-access memory. In some embodiments, the command server 110 may include one electronic processor 210 or a plurality of electronic processors 210 in a cloud computer cluster arrangement, one or more of which may be executing none, all, or a portion of the applications of the command server 110 provided below, sequentially or in parallel across the one or more electronic processors 210. The one or more electronic processors 210 of the command server 110 may be geographically co-located or may be separated by inches, meters, kilometers, or miles, and interconnected via electronic and/or optical interconnects. One or more proxy servers or load balancing servers may control which of the one or more electronic processors 210 performs any part or all of the applications provided below.
  • The transceiver 230 enables wired and/or wireless communication between the command server 110 and the plurality of responding devices 130 over the communication network 140. In some embodiments, the transceiver 230 may comprise separate transmitting and receiving components. The input/output interface 240 may include one or more input mechanisms (for example, a touch pad, a keypad, and the like), one or more output mechanisms (for example, a display, a speaker, and the like), or a combination thereof, or a combined input and output mechanism such as a touch screen.
  • The memory 220 stores several applications that are executed by the electronic processor 210. In the example illustrated, the memory 220 includes a video monitoring application 260, a dispatch application 270, and a search criteria application 280. The video monitoring application 260 is executed to perform an appearance search to analyze image and/or video files for identifying objects or persons of interest. The objects or persons of interest are identified based on a search criteria provided to or determined by the command server 110. The video monitoring application 260 is also used to generate a synopsis or playlist based on the appearance search. The dispatch application 270 is, for example, a computer-aided dispatch application and is executed to dispatch all-points bulletins (APBs), be-on-the-look-outs (BOLOs), and/or appearance searches to the plurality of responding devices 130A-E. The search criteria application 280 is executed to determine an initial search criteria for an appearance search or to update the search criteria for a current appearance search as further described below with respect to method 400.
  • In the example illustrated in FIG. 2, a single device is illustrated as including all the components and the applications of the command server 110. However, it should be understood that one or more of the components and one or more of the applications may be combined or divided into separate software, firmware, and/or hardware. Regardless of how they are combined or divided, these components and applications may be executed on the same computing device or may be distributed among different computing devices connected by one or more networks or other suitable communication means. In one example, all the components and applications of the command server 110 are implemented in a cloud infrastructure accessible through several terminal devices, with the processing power located at a server location. In another example, the components and applications of the command server 110 may be divided between a separate intelligence center computing device and a dispatch device co-located at an intelligence center or dispatch center of a responding organization (e.g., a police department). In yet another example, the components and applications of the command server 110 may be divided between separate computing devices not co-located with each other but communicatively connected with each other over a suitable communication network.
  • FIG. 3 is a block diagram of one embodiment of the responding device 130. In the example illustrated, the responding device 130 includes a device electronic processor 310, a device memory 320, a device transceiver 330, and a device input/output interface 340. The device electronic processor 310, the device memory 320, the device transceiver 330, and the device input/output interface 340 communicate over one or more control and/or data buses (for example, a device communication bus 350). FIG. 3 illustrates only one example embodiment of the responding device 130. The responding device 130 may include more or fewer components and may perform functions other than those explicitly described herein.
  • The device electronic processor 310, the device memory 320, and the device input/output interface 340 are implemented the same as or similar to the electronic processor 210, the memory 220, and the input/output interface 240, respectively. The device transceiver 330 is implemented the same as or similar to the transceiver 230 and enables wired or wireless communication between the responding device 130 and the communication network 140.
  • The responding device 130 also includes an image capturing device 360 and a microphone 370. The image capturing device 360 and the microphone 370 are illustrated as separate components. However, the image capturing device 360 and the microphone 370 may be part of the device input/output interface 340. In the example of a first responder, the image capturing device 360 is a body-worn camera and the microphone 370 is part of a portable two-way radio device or a smart telephone of the first responder. The body-worn camera may be communicatively coupled to the portable two-way radio, the smart telephone, or another computing device worn by the first responder. The portable two-way radio, the smart telephone, or the other computing device includes the device electronic processor 310 and the device transceiver 330 to receive the images and/or videos captured by the body-worn camera and audio captured by the microphone 370 and to transfer the images, videos, and/or speech to the command server 110 through the communication network 140. In the example of an unmanned aerial vehicle, the image capturing device 360 and the microphone 370 may be integrated into or mounted to the housing of the unmanned aerial vehicle. The image capturing device 360 and the microphone 370 capture images, audio, and/or video at an incident scene as the unmanned aerial vehicle is navigated around the incident scene. The captured images, audio, and/or video are transmitted to the command server 110 through the communication network 140. In the example of a responding vehicle, the image capturing device 360 is a dashboard-mounted camera and the microphone 370 is part of a vehicle-mounted mobile two-way radio device. The dashboard-mounted camera may be communicatively coupled to the mobile two-way radio or another computing device of the responding vehicle. The mobile two-way radio or the other computing device includes the device electronic processor 310 and the device transceiver 330 to receive the images or videos captured by the dashboard-mounted camera and audio captured by the microphone 370 and to transfer the images, videos, and/or speech to the command server 110 through the communication network 140.
  • FIG. 4 illustrates a flowchart of an example method 400 for appearance search for incident response. In the example illustrated, the method 400 includes receiving search criteria at the command server 110 (at block 410). The command server 110 monitors the data sources 120 and the plurality of responding devices 130 for data regarding an ongoing incident. The data sources 120 may include an incident report received at a terminal of the command server 110 or audio and/or video files analyzed at the intelligence center. In one example, the data sources 120 may include a report including a tip generated and input at the intelligence center by an officer. For example, the search criteria may come from a citizen calling a 911 emergency call taker and the 911 emergency call taker entering the information received from the citizen into the data sources 120 as a tip sheet. In another example, the search criteria may come from a specific first responder (for example, a responding police officer) requesting a computer aided dispatch (CAD) dispatcher to initiate a search and the CAD dispatcher providing the information received from the first responder to the data sources 120. In some embodiments, an intelligence center analyst may also refer to a tip database that is part of the data sources 120 for initiating or updating the search criteria. Based on the several data sources, a search criteria for an appearance search may be generated and/or input to the command server 110. The appearance search generally includes a subject that is, for example, a person, a vehicle, an object, and the like (that is, an object of interest). The search criteria defines the characteristics of the subject of the appearance search such that the devices performing the appearance search can identify subjects having the characteristics in images or videos. That is, the appearance search includes a responding device 130 of the plurality of responding devices 130 determining whether an image and/or video captured by the responding device 130 includes an object of interest that is the subject of the appearance search based on the search criteria.
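  • One plausible data shape for such a search criteria is sketched below; the structure and field names are assumptions, while the example characteristics echo the vehicle description (make, model, year, color, license plate) given in the specification.

```python
from dataclasses import dataclass, field

# Illustrative search criteria record; the structure is an assumption.
@dataclass
class SearchCriteria:
    subject_type: str                          # "person", "vehicle", "object"
    characteristics: dict[str, str] = field(default_factory=dict)

criteria = SearchCriteria(
    subject_type="vehicle",
    characteristics={"make": "Ford", "model": "F-150", "year": "2018",
                     "color": "red", "plate": "ABC-1234"},
)
```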
  • In one example, an intelligence center analyst monitoring live video detects an incident occurring involving a vehicle. The vehicle moves out of the view of a camera being monitored by the analyst. The intelligence center analyst may not be sure whether the vehicle is still in the area. The analyst may then input a request for an appearance search with the command server 110. The analyst may provide a description or characteristics of the vehicle (for example, make, model, year, color, license plate number, and the like) as the search criteria. This search criteria is received by the command server 110 for the appearance search.
  • The method 400 also includes initiating, using the command server 110, an appearance search based on the search criteria (at block 420). The appearance search is performed on the plurality of responding devices 130 communicatively connected to the command server 110. The command server 110 issues an appearance search based on the search criteria received for an incident. The command server 110 dispatches the search criteria to the various responding devices 130 responding to the incident. The responding devices 130 receive the search criteria and perform the appearance search based on the search criteria. In one example, the image and video files are analyzed at the responding device 130 based on the search criteria to perform the appearance search. In another example, the image and video files are transmitted from the responding device 130 to another computing device (for example, the command server 110), which in turn performs the appearance search on those files. In yet another example, the appearance search may be partially conducted on the responding device 130 and partially on the command server 110 using the video monitoring application 260.
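  • The device-side check described above reduces to matching detected attributes against the criteria. The sketch below assumes a placeholder detector and a key-value criteria representation; neither is specified in the patent.

```python
# Does a captured frame contain an object whose detected attributes satisfy
# the search criteria? detect_objects() stands in for whatever analytics
# the device (or the command server, when offloaded) actually runs.
def detect_objects(frame) -> list[dict]:
    """Placeholder: in this sketch a frame is already a list of detections."""
    return frame

def frame_matches(frame, characteristics: dict) -> bool:
    return any(
        all(det.get(k) == v for k, v in characteristics.items())
        for det in detect_objects(frame)
    )

frame = [{"make": "Ford", "color": "red", "plate": "ABC-1234"}]
print(frame_matches(frame, {"color": "red", "plate": "ABC-1234"}))  # True
```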
  • Continuing the above example of a vehicle incident, the command server 110 initiates an appearance search for the missing vehicle based on the search criteria received for the missing vehicle. The search criteria is dispatched to the responding devices 130, which may include devices of responding vehicles and officers in the area or an unmanned aerial vehicle deployed for the incident. The responding devices 130 use the search criteria to identify the vehicle and transmit an indication to the command server 110 if the vehicle is identified by a responding device 130. The responding device 130 may also transmit the image or a portion of a captured video including the identified vehicle to the command server 110.
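  • A rough sketch of this dispatch-and-match flow appears below. The `ResponderDevice` class, the `dispatch` function, and the trivial attribute comparison are hypothetical stand-ins; an actual responding device 130 would run video analytics on captured frames rather than comparing attribute dictionaries.

```python
from typing import Dict, Iterable

class ResponderDevice:
    """Stand-in for a responding device 130 (patrol vehicle camera, drone, radio)."""

    def __init__(self, device_id: str):
        self.device_id = device_id
        self.criteria: Dict[str, str] = {}

    def receive_criteria(self, characteristics: Dict[str, str]) -> None:
        # A push replaces whatever criteria the device held before.
        self.criteria = dict(characteristics)

    def frame_matches(self, detected: Dict[str, str]) -> bool:
        # Placeholder matcher: a captured frame "matches" when every
        # characteristic in the criteria appears among the attributes
        # detected in that frame.
        return bool(self.criteria) and all(
            detected.get(key) == value for key, value in self.criteria.items()
        )

def dispatch(characteristics: Dict[str, str],
             devices: Iterable[ResponderDevice]) -> None:
    """Send the search criteria to every responding device."""
    for device in devices:
        device.receive_criteria(characteristics)

# Both the patrol unit and the deployed drone receive the vehicle description.
units = [ResponderDevice("unit-12"), ResponderDevice("drone-3")]
dispatch({"color": "red", "license_plate": "ABC1234"}, units)
assert units[0].frame_matches({"color": "red", "license_plate": "ABC1234"})
```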
  • The method 400 includes receiving, at the command server 110, an update to the search criteria (at block 430). When new information is discovered regarding the current incident, the search criteria may have to be updated to accurately perform the appearance search. The updated search criteria may be received in the same or a similar manner as the original search criteria. For example, information may be provided to the data sources 120 based on a tip call from a citizen to a 911 emergency call taker or from a responding officer to a CAD dispatcher. The new information may be discovered based on a historical appearance search or during the active appearance search. For example, an analyst looking at a video sample of the appearance search may notice a change to the search criteria (for example, a suspect changed clothes, a suspect changed license plates on a vehicle, and the like). In addition, the new information may also be discovered by an officer responding to the incident. The officer may provide the new information via audio through the responding officer's microphone 370 or via text through the officer's two-way radio or smart telephone. The new information may automatically trigger an update to the search criteria at the command server 110. For example, the new information is received as an update to the search criteria at the command server 110. In some embodiments, the command server 110 automatically initiates the update in response to analyzing data from a plurality of data sources. For example, the command server 110 analyzes new tips received from the data sources 120, analyzes other audio, video, and/or image sources to determine new information or changed circumstances, and automatically initiates the update based on the new information or changed circumstances.
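  • One way to picture how these heterogeneous inputs (a 911 tip, a radio report, an analyst's observation) converge on a single update request is the normalization sketch below; the `UpdateRequest` structure, its fields, and the source labels are assumptions introduced purely for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict

@dataclass
class UpdateRequest:
    """Hypothetical normalized form of new information about an incident."""
    changes: Dict[str, str]    # characteristic -> new value ("" marks a deletion)
    reason: str                # e.g., "911 tip", "responding officer via CAD"
    received_at: datetime

def normalize_new_information(changes: Dict[str, str], reason: str) -> UpdateRequest:
    # Every inbound report, whatever its origin, is reduced to the same
    # structure before the command server vets it (blocks 440 and 450).
    return UpdateRequest(changes=changes,
                         reason=reason,
                         received_at=datetime.now(timezone.utc))

# A suspect swapped license plates; the observation arrives via a CAD dispatcher.
update = normalize_new_information({"license_plate": "XYZ9876"},
                                   reason="responding officer via CAD")
```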
  • The method 400 includes determining, using the command server 110, a confidence level of the update based on weights assigned to each of a plurality of parameters of the update (at block 440). Although new information may be discovered regarding an incident, not all new information triggers an update to the search criteria. The command server 110 vets the update to the search criteria to determine whether pushing the update to the plurality of responding devices 130 is warranted. The update includes a plurality of parameters that are weighted by the intelligence center.
  • The plurality of parameters include, for example, a number of instances of the update match in history, a number of instances of the update match in active search, whether the update was derived via data capture, whether the update was manually observed by a role (for example, a responding officer), whether the update was observed across multiple of a single role type (for example, witnessed by multiple responding officers, captured on multiple dashboard mounted cameras, or the like), and whether the update was observed across multiple role types (for example, both witnessed by a responding officer and observed on a dashboard mounted camera). Examples of different role types include an intelligence center analyst, a computer aided dispatch (CAD) dispatcher, a 911 emergency call taker, emergency responders including emergency medical technicians (EMTs), police officers, and firefighters, and the like. Another example of an update being observed across multiple role types includes an intelligence center analyst witnessing an update using a dashboard camera of a responding vehicle and confirming the update to the command server 110 in addition to a responding police officer manually observing the update and confirming the update to the command server 110. Each of the plurality of parameters may be assigned a certain weight by the command server 110. These weights may be pre-stored. In one example, as shown in FIG. 5, the weights are integers. In other examples, the weights include, but are not limited to, percentages, decimals, and the like. The command server 110 determines a calculated weight number for each parameter based on whether the update satisfies the parameter or the amount to which the update satisfies the parameter. The command server 110 combines the calculated weight numbers, for example using summation, multiplication, division, and/or another mathematical combination process, to determine the confidence level of the update. The threshold confidence level varies based on the form factor (for example, integers, percentages, decimals, and the like) used for the weights. For example, the threshold confidence level may be a threshold integer, a threshold percentage, a threshold decimal, or the like. In some embodiments, the threshold confidence level may include multiple threshold confidence levels. For example, the threshold confidence level may include an 80% threshold and a 50% threshold. When the confidence level is above the 80% threshold, the update to the appearance search may be automatically applied and/or executed. When the confidence level is between 50% and 80%, the update may not be automatically applied and/or executed, and an intelligence center analyst may be prompted to update the appearance search. When the confidence level is below 50%, the update may be automatically discarded and/or saved/shelved for future consideration.
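  • The weighting scheme lends itself to a short numerical sketch. The Python below assumes integer weights combined by summation (as in the FIG. 5 example) and the 80%/50% two-threshold policy described above; the parameter keys, the specific weight values, and the boundary handling at exactly 50% and 80% are illustrative assumptions, not values prescribed by the disclosure.

```python
from typing import Dict

# Illustrative pre-stored weights for the parameters listed above;
# they sum to a maximum score of 16, matching the FIG. 5 example.
PARAMETER_WEIGHTS: Dict[str, int] = {
    "matches_in_history": 2,
    "matches_in_active_search": 2,
    "derived_via_data_capture": 4,
    "manually_observed_by_role": 2,
    "observed_across_single_role_type": 2,
    "observed_across_multiple_role_types": 4,
}
MAX_SCORE = sum(PARAMETER_WEIGHTS.values())  # 16 with the weights above

def confidence_level(satisfied: Dict[str, bool]) -> float:
    """Sum the calculated weight numbers and express them as a percentage."""
    score = sum(weight for name, weight in PARAMETER_WEIGHTS.items()
                if satisfied.get(name))
    return 100.0 * score / MAX_SCORE

def disposition(level: float) -> str:
    # Two-threshold policy: auto-apply, prompt an analyst, or discard/shelve.
    if level > 80.0:
        return "apply automatically"
    if level >= 50.0:
        return "prompt intelligence center analyst"
    return "discard or shelve for future consideration"
```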
  • FIG. 5 illustrates one example chart 500 for determining a confidence level of an update. In the example illustrated, the command server 110 determines the confidence level of an update requested by a responding officer based on an observation by multiple responding officers. The responding officer may provide the new information to the command server 110 (for example, by using an application on the two-way radio or smart telephone communicating with the command server 110), which may automatically trigger the update based on the new information. In some embodiments, the responding officer may not specifically request an update, but may provide a report of an observation. The report may turn into a request for an update or may trigger the update when the calculated weight number reaches the threshold confidence level. The chart 500 includes a first column showing the plurality of parameters, a second column showing predetermined parameter weights assigned to the plurality of parameters, a third column showing whether and/or the amount to which the update satisfies a parameter, and a fourth column showing the calculated weight number for the update for each of the plurality of parameters. Since the update was based on manual observation by a responding officer, the command server 110 assigns zero calculated weight numbers for the parameters: (i) number of instances of update match in history; (ii) number of instances of update match in active search; and (iii) whether the update was derived via data capture. The command server 110 assigns calculated weight numbers of 2, 2, and 4, respectively, based on the update being manually observed, observed by multiple responding officers, and observed by responding officers with different roles. The command server 110 adds the calculated weight numbers to determine that the confidence level is 8, or 50% of the maximum of 16.
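  • Using the sketch above (and its assumed weights), the FIG. 5 scenario reproduces the 8-of-16 result as follows:

```python
# Update based on manual observation by multiple officers in different roles;
# nothing in history, the active search, or data capture corroborates it yet.
fig5_update = {
    "matches_in_history": False,
    "matches_in_active_search": False,
    "derived_via_data_capture": False,
    "manually_observed_by_role": True,
    "observed_across_single_role_type": True,
    "observed_across_multiple_role_types": True,
}

level = confidence_level(fig5_update)  # (2 + 2 + 4) / 16 -> 50.0
print(level, disposition(level))       # 50.0 prompt intelligence center analyst
```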
  • The method 400 includes creating, using the command server 110, an updated search criteria based on the search criteria and the update when the confidence level of the update exceeds a predetermined confidence threshold (at block 450). The predetermined confidence threshold may be preset into the command server 110. For example, a different predetermined confidence threshold may be stored in correspondence with different types of incidents. In some embodiments, the predetermined confidence threshold may be programmatically calculated based on predetermined criteria (for example, incident type, incident location, and the like) stored in the command server 110. In one example, the predetermined confidence threshold may be lower for an amber alert and higher for a prisoner escape incident. When the command server 110 determines that the confidence level meets or exceeds the predetermined confidence threshold, the command server 110 generates the updated search criteria based on the original search criteria and the update. For example, when a suspect changes a license plate of a vehicle, the update may include removing the previous license plate number and adding the new license plate number. The updated search criteria may include the previous search criteria with the license plate number updated to the new license plate number. Accordingly, the update may include one or more of an amendment, a deletion, and an addition of a characteristic provided in the original or previously updated search criteria. When the command server 110 determines that the confidence level does not meet or exceed the predetermined confidence threshold, the command server 110 may discard the update and not create the updated search criteria, or may cause a prompt to be provided to a dispatcher or other person in the manner already set forth above. In some embodiments, the command server 110 saves the update for future implementation when the confidence level does not meet or exceed the predetermined confidence threshold. The saved or shelved update may be used as a parameter to weigh the confidence level of a future update as described above. For example, if a similar update to the search criteria is received in the future, the future update may result in a higher confidence level based on the saved or shelved update. The appearance search is conducted based on the original search criteria or a previously updated search criteria when the confidence level of the current update does not meet or exceed the predetermined confidence threshold.
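  • One plausible realization of amendment, deletion, and addition against the stored criteria is sketched below, under the same assumption as earlier that an empty value marks a deletion; the helper name `apply_update` is hypothetical. The original criteria are copied rather than modified in place, which also preserves them for the historic report described later.

```python
from typing import Dict

def apply_update(characteristics: Dict[str, str],
                 changes: Dict[str, str]) -> Dict[str, str]:
    """Return an updated copy of the search criteria."""
    updated = dict(characteristics)
    for key, value in changes.items():
        if value == "":
            updated.pop(key, None)   # deletion of a characteristic
        else:
            updated[key] = value     # amendment or addition
    return updated

# Suspect swapped plates: the license plate is amended, all else carries over.
old = {"color": "red", "license_plate": "ABC1234"}
new = apply_update(old, {"license_plate": "XYZ9876"})
assert new == {"color": "red", "license_plate": "XYZ9876"}
```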
  • The method 400 also includes pushing the updated search criteria for the appearance search to the plurality of responding devices 130 (at block 460). The command server 110 executes the dispatch application 270 to push the updated search criteria to the plurality of responding devices 130. The plurality of responding devices 130 now conduct the appearance search based on the updated search criteria and may disregard the original search criteria. In some embodiments, the command server 110 receives, via the transceiver 230, an acknowledgement from the plurality of responding devices 130 in response to the plurality of responding devices 130 receiving the updated search criteria. The method 400 repeats for any additional update received to the search criteria.
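  • The push-and-acknowledge exchange might be sketched as follows; the transport is abstracted away and the class and method names are assumptions (the disclosure only requires that the command server 110 receives an acknowledgement via the transceiver 230).

```python
from typing import Dict, Iterable, List

class ResponderStub:
    """Minimal stand-in for a responding device 130 acknowledging a push."""

    def __init__(self, device_id: str):
        self.device_id = device_id
        self.criteria: Dict[str, str] = {}

    def push(self, updated_criteria: Dict[str, str]) -> str:
        # The device disregards the prior criteria and acknowledges receipt.
        self.criteria = dict(updated_criteria)
        return f"ack:{self.device_id}"

def push_updated_criteria(updated: Dict[str, str],
                          devices: Iterable[ResponderStub]) -> List[str]:
    """Push the updated criteria to every device and collect acknowledgements."""
    return [device.push(updated) for device in devices]

acks = push_updated_criteria({"color": "red", "license_plate": "XYZ9876"},
                             [ResponderStub("unit-12"), ResponderStub("drone-3")])
assert acks == ["ack:unit-12", "ack:drone-3"]
```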
  • In some embodiments, the command server 110 generates a historic report for the appearance search. The command server 110 stores the initial search criteria and every update received for the search criteria. The updates may be stored with the time at which the update was received and the reason for receiving the update to the search criteria. For example, the command server 110 stores whether the update was received based on a tip from a citizen to a 911 emergency call taker or from a responding police officer informing a CAD dispatcher. The command server 110 may also store whether the update was implemented, discarded, or saved for future implementation. Additionally, the command server 110 may store the confidence level, including the calculated weight number for each parameter, for each update to the search criteria. The historic report generated by the command server 110 may include the timestamp for each update to the search criteria and the reason for receiving the update to the search criteria. The historic report may also include the weights for each parameter for each of the updates and the confidence level for each of the updates.
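  • Such a historic report is essentially an append-only log keyed by time. A minimal sketch, with illustrative field names, might be:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List

@dataclass
class UpdateRecord:
    """One considered update, logged whether or not it was applied."""
    timestamp: datetime
    reason: str                        # e.g., "911 tip", "officer via CAD dispatcher"
    changes: Dict[str, str]
    parameter_weights: Dict[str, int]  # calculated weight number per parameter
    confidence: float
    outcome: str                       # "implemented", "discarded", or "shelved"

@dataclass
class HistoricReport:
    initial_criteria: Dict[str, str]
    updates: List[UpdateRecord] = field(default_factory=list)

    def record(self, entry: UpdateRecord) -> None:
        # Appending preserves the full decision trail for the incident.
        self.updates.append(entry)
```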
  • Manual feedback of when an update should have been triggered but was not, or when an update was triggered but should not have been, may be fed into a machine learning program, along with the parameters, weights, and threshold levels set forth above, to create a trained model for updating appearance searches in the future, among other possibilities.
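  • As a hedged illustration of that feedback loop, the satisfied-parameter indicators could serve directly as a feature vector and the hindsight judgment (should the update have triggered?) as the label. Scikit-learn's logistic regression is used here purely as a stand-in for whatever machine learning program is actually employed, and the training rows are fabricated for the example.

```python
from sklearn.linear_model import LogisticRegression

# Each row: the six parameter indicators (0/1) for one past update.
# Each label: 1 if, in hindsight, the update should have triggered.
X = [
    [0, 0, 0, 1, 1, 1],  # FIG. 5 case: manual, multi-officer, multi-role
    [1, 1, 1, 0, 0, 0],  # corroborated by data capture only
    [0, 0, 0, 1, 0, 0],  # single unconfirmed manual observation
    [1, 0, 1, 1, 1, 1],  # broadly corroborated update
]
y = [1, 1, 0, 1]         # illustrative hindsight labels

model = LogisticRegression().fit(X, y)

# The learned probability could replace, or recalibrate, the hand-set
# weights and thresholds for future appearance-search updates.
print(model.predict_proba([[0, 0, 0, 1, 1, 1]])[0, 1])
```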
  • In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings.
  • The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims including any amendments made during the pendency of this application and all equivalents of those claims as issued.
  • Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “has,” “having,” “includes,” “including,” “contains,” “containing” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a,” “has . . . a,” “includes . . . a,” or “contains . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, contains the element. The terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein. The terms “substantially,” “essentially,” “approximately,” “about” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1% and in another embodiment within 0.5%. The term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
  • It will be appreciated that some embodiments may be comprised of one or more generic or specialized processors (or “processing devices”) such as microprocessors, digital signal processors, customized processors and field programmable gate arrays (FPGAs) and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used.
  • Moreover, an embodiment can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (for example, comprising a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
  • The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.

Claims (20)

We claim:
1. A command server for appearance searching for incident response, the command server comprising:
a transceiver enabling communication between the command server and a plurality of responding devices; and
an electronic processor coupled to the transceiver and configured to:
receive a search criteria,
initiate an appearance search based on the search criteria, the appearance search performed on the plurality of responding devices,
receive an update to the search criteria,
determine a confidence level of the update based on weights assigned to a plurality of parameters of the update,
create an updated search criteria based on the search criteria and the update when the confidence level of the update exceeds a predetermined confidence threshold, and
push the updated search criteria for the appearance search to the plurality of responding devices.
2. The command server of claim 1, wherein the appearance search includes a responding device of the plurality of responding devices determining whether a video captured by the responding device includes an object of interest that is a subject of the appearance search based on the search criteria.
3. The command server of claim 1, wherein the plurality of parameters include one or more selected from a group consisting of: a number of instances of the update match in history, a number of instances of the update match in active search, whether the update was derived via data capture, whether the update was manually observed by a role, whether the update was observed across multiple of a single role type, and whether the update was observed across multiple role types.
4. The command server of claim 1, wherein the update includes one or more of: an amendment, a deletion, and an addition of a characteristic provided in the search criteria.
5. The command server of claim 1, wherein the electronic processor is further configured to discard the update when the confidence level of the update does not exceed the predetermined confidence threshold.
6. The command server of claim 1, wherein the electronic processor is further configured to save the update for future implementation when the confidence level of the update does not exceed the predetermined confidence threshold.
7. The command server of claim 1, wherein the electronic processor is further configured to automatically initiate the update in response to analyzing data from a plurality of data sources.
8. The command server of claim 1, wherein the electronic processor is further configured to receive, via the transceiver, an acknowledgement from the plurality of responding devices in response to the plurality of responding devices receiving the updated search criteria.
9. The command server of claim 1, wherein the electronic processor is further configured to generate a historic report of the appearance search, the historic report including the search criteria and each update to the search criteria considered for the appearance search for a duration of the incident.
10. The command server of claim 9, wherein the historic report further includes a timestamp for each update to the search criteria and a reason for receiving the update to the search criteria.
11. A method for appearance searching for incident response, the method comprising:
receiving a search criteria at a command server;
initiating, using the command server, an appearance search based on the search criteria, the appearance search performed on a plurality of responding devices communicatively connected to the command server;
receiving, at the command server, an update to the search criteria;
determining, using the command server, a confidence level of the update based on weights assigned to each of a plurality of parameters of the update;
creating, using the command server, an updated search criteria based on the search criteria and the update when the confidence level of the update exceeds a predetermined confidence threshold; and
pushing the updated search criteria for the appearance search to the plurality of responding devices.
12. The method of claim 11, wherein the appearance search includes a responding device of the plurality of responding devices determining whether a video captured by the responding device includes an object of interest that is a subject of the appearance search based on the search criteria.
13. The method of claim 11, wherein the plurality of parameters include one or more selected from a group consisting of: a number of instances of the update match in history, a number of instances of the update match in active search, whether the update was derived via data capture, whether the update was manually observed by a role, whether the update was observed across multiple of a single role type, and whether the update was observed across multiple role types.
14. The method of claim 11, wherein the update includes one or more of: an amendment, a deletion, and an addition of a characteristic provided in the search criteria.
15. The method of claim 11, further comprising discarding the update when the confidence level of the update does not exceed the predetermined confidence threshold.
16. The method of claim 11, further comprising saving the update for future implementation when the confidence level of the update does not exceed the predetermined confidence threshold.
17. The method of claim 11, further comprising automatically initiating the update in response to analyzing data from a plurality of data sources.
18. The method of claim 11, further comprising receiving, at the command server, an acknowledgement from the plurality of responding devices in response to the plurality of responding devices receiving the updated search criteria.
19. The method of claim 11, further comprising generating a historic report of the appearance search, the historic report including the search criteria and each update to the search criteria considered for the appearance search for a duration of the incident.
20. The method of claim 19, wherein the historic report further includes a timestamp for each update to the search criteria and a reason for receiving the update to the search criteria.
US16/727,327 2019-12-26 2019-12-26 Appearance search for incident response Abandoned US20210200826A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US16/727,327 US20210200826A1 (en) 2019-12-26 2019-12-26 Appearance search for incident response
EP20834071.1A EP4081913A1 (en) 2019-12-26 2020-12-08 Appearance search for incident response
PCT/US2020/063817 WO2021133545A1 (en) 2019-12-26 2020-12-08 Appearance search for incident response
US18/465,543 US20240004942A1 (en) 2019-12-26 2023-09-12 Appearance search for incident response

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/727,327 US20210200826A1 (en) 2019-12-26 2019-12-26 Appearance search for incident response

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/465,543 Continuation US20240004942A1 (en) 2019-12-26 2023-09-12 Appearance search for incident response

Publications (1)

Publication Number Publication Date
US20210200826A1 true US20210200826A1 (en) 2021-07-01

Family

ID=74106179

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/727,327 Abandoned US20210200826A1 (en) 2019-12-26 2019-12-26 Appearance search for incident response
US18/465,543 Pending US20240004942A1 (en) 2019-12-26 2023-09-12 Appearance search for incident response

Family Applications After (1)

Application Number Title Priority Date Filing Date
US18/465,543 Pending US20240004942A1 (en) 2019-12-26 2023-09-12 Appearance search for incident response

Country Status (3)

Country Link
US (2) US20210200826A1 (en)
EP (1) EP4081913A1 (en)
WO (1) WO2021133545A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11184734B1 (en) * 2020-08-19 2021-11-23 T-Mobile Usa, Inc. Using geofencing areas to improve road safety use cases in a V2X communication environment
US20220036300A1 (en) * 2020-07-31 2022-02-03 Trackonomy Systems, Inc. System and methods of electronics sampling to optimize system performance, cost, and confidence levels

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8837906B2 (en) * 2012-12-14 2014-09-16 Motorola Solutions, Inc. Computer assisted dispatch incident report video search and tagging systems and methods
CN110235138B (en) * 2016-12-05 2023-09-05 摩托罗拉解决方案公司 System and method for appearance search
US10866950B2 (en) * 2017-12-06 2020-12-15 Motorola Solutions, Inc. Method and system for modifying a search request corresponding to a person, object, or entity (POE) of interest

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220036300A1 (en) * 2020-07-31 2022-02-03 Trackonomy Systems, Inc. System and methods of electronics sampling to optimize system performance, cost, and confidence levels
US12001993B2 (en) * 2020-07-31 2024-06-04 Trackonomy Systems, Inc. System and methods of electronics sampling to optimize system performance, cost, and confidence levels
US11184734B1 (en) * 2020-08-19 2021-11-23 T-Mobile Usa, Inc. Using geofencing areas to improve road safety use cases in a V2X communication environment
US11706586B2 (en) 2020-08-19 2023-07-18 T-Mobile Usa, Inc. Using geofencing areas to improve road safety use cases in a V2X communication environment

Also Published As

Publication number Publication date
EP4081913A1 (en) 2022-11-02
US20240004942A1 (en) 2024-01-04
WO2021133545A1 (en) 2021-07-01

Similar Documents

Publication Publication Date Title
US20240004942A1 (en) Appearance search for incident response
US8923799B2 (en) Method and system for an automated dispatch protocol
US10922776B2 (en) Platform for real-time views on consolidated data
US11620470B2 (en) Device, system and method for generating an alert and an automatic search for a candidate subject
US8368754B2 (en) Video pattern recognition for automating emergency service incident awareness and response
EP2805316B1 (en) Emergency messaging system and method of responding to an emergency
CA2972451C (en) Method and apparatus for prediction of a destination and movement of a person of interest
US10820029B2 (en) Alerting groups of user devices to similar video content of interest based on role
US11443613B2 (en) Real-time crime center solution with text-based tips and panic alerts
US20200020160A1 (en) Method, apparatus and system for mapping an incident type to data displayed at an incident scene
US9641965B1 (en) Method, system and computer program product for law enforcement
US10667094B2 (en) Device, system and method for determining a prioritized list of communication groups
US20210367991A1 (en) System and method for sending and rendering an image by a device based on receiver's context
US10685414B1 (en) Method and system for generating an automated police report
KR20160032464A (en) Social security network method and system
US20240211098A1 (en) Structured graphical user interface for public safety policy procedure management
US20230111833A1 (en) Modifying future workflow based on information received at current day
US20220414082A1 (en) Device, system, and method for providing an indication that media has not yet been uploaded to a data store
US20230089499A1 (en) Method And System For Linking Unsolicited Electronic Tips To Public-safety Data
CN112686487A (en) Warning situation auxiliary processing method and device, medium and electronic equipment

Legal Events

Date Code Title Description
AS Assignment

Owner name: MOTOROLA SOLUTIONS, INC., ILLINOIS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SCHULER, FRANCESCA;FREEMAN, LARRY D.;RAPALA, CHRISTINE K.;SIGNING DATES FROM 20200106 TO 20200113;REEL/FRAME:051659/0901

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION