WO2021262213A1 - System and method for using artificial intelligence to determine a probability of occurrence of a subsequent incident


Info

Publication number: WO2021262213A1
Authority: WO (WIPO, PCT)
Prior art keywords: person, identifier, incident, time, subsequent
Application number: PCT/US2020/053816
Other languages: French (fr)
Inventor: William Holloway PETREY JR.
Original Assignee: Petrey Jr William Holloway
Priority claimed from: US16/910,949 (US11270129B2)
Priority claimed from: US17/039,505 (US20210019645A1)
Application filed by: Petrey Jr William Holloway
Priority to: EP20941898.7A (EP4172886A1)
Priority to: IL299461A
Publication of: WO2021262213A1


Classifications

    • G06Q50/26 Government or public services (under G06Q50/10 Services; G06Q50/00 Systems or methods specially adapted for specific business sectors)
    • G06N3/08 Learning methods (under G06N3/02 Neural networks; G06N3/00 Computing arrangements based on biological models)
    • G06N7/01 Probabilistic graphical models, e.g. probabilistic networks (under G06N7/00 Computing arrangements based on specific mathematical models)
    • G06Q10/0635 Risk analysis of enterprise or organisation activities (under G06Q10/063 Operations research, analysis or management; G06Q10/06 Resources, workflows, human or project management)
    • G08B13/19613 Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion (under G08B13/196 Actuation using passive radiation detection systems with television cameras; G08B13/00 Burglar, theft or intruder alarms)
    • G08B31/00 Predictive alarm systems characterised by extrapolation or other computation using updated historic data
    • G06N20/10 Machine learning using kernel methods, e.g. support vector machines [SVM] (under G06N20/00 Machine learning)
    • G06N3/044 Recurrent networks, e.g. Hopfield networks (under G06N3/04 Architecture, e.g. interconnection topology)
    • G08B29/186 Fuzzy logic; neural networks (under G08B29/185 Signal analysis techniques for reducing or preventing false alarms; G08B29/00 Checking or monitoring of signalling or alarm systems)

Definitions

  • This disclosure relates generally to determining probabilities of occurrence of subsequent incidents. More specifically, this disclosure relates to a system and method for using artificial intelligence to determine a probability of occurrence of a subsequent incident and performing a preventative action.
  • various incidents may occur at certain locations and there may be no information that is usable to correlate a person to the incident at the location.
  • the people that participated in the incident may perform a subsequent incident at a later date, and it may be desirable to prevent such a subsequent incident from occurring.
  • the present disclosure provides a system and method for correlating wireless network information.
  • a system for monitoring vehicle traffic may include at least one camera positioned to capture a set of images within a license plate detection zone, at least some of the captured images representing license plates of a set of vehicles appearing within the camera’s field of view.
  • the system may also include at least one electronic device identification sensor configured to detect and store a set of electronic device identifiers of electronic devices located within one or more electronic device detection zones.
  • the system may also include one or more non-transitory computer-readable storage media having stored thereon computer-executable instructions that, when executed by one or more processors, cause a computing system to: detect, using the set of images, a license plate ID of a vehicle; compare the license plate ID of the vehicle to a database of trusted vehicle license plate IDs; identify the vehicle as a suspicious vehicle, the identification based at least in part on the comparison of the license plate ID of the vehicle to the database of trusted vehicle license plate IDs; and correlate the license plate ID of the vehicle with at least one of the set of stored electronic device identifiers.
  • a method for using artificial intelligence to determine a probability of occurrence of a subsequent incident is disclosed.
  • the method may include receiving, at a processor, an identifier associated with a person, where the identifier is received from a location where the person was present at a first time.
  • the method may also include receiving information pertaining to an incident that occurred at the location where the person was present at the first time.
  • the method may also include receiving, at a second time subsequent to the first time, the identifier associated with the person.
  • the method may also include determining, by the processor via a trained machine learning model using the identifier and the information, the probability of occurrence of the subsequent incident.
  • the method may also include performing, based on the probability of occurrence of the subsequent incident, a preventative action.
  • a tangible, non-transitory computer-readable medium stores instructions that, when executed, cause a processing device to perform any of the operations, steps, and/or functions of any of the methods disclosed herein.
  • a system includes a memory device storing instructions, and a processing device communicatively coupled to the memory device, where the processing device executes the instructions to perform any of the operations, steps, and/or functions of any of the methods disclosed herein.
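  • As a rough, hedged illustration of the claimed flow (not the patent's actual implementation), the Python sketch below assumes a pre-trained classifier exposing a scikit-learn-style predict_proba method, a simple feature scheme, and an arbitrary 0.7 alert threshold; the Sighting fields and the preventative action are placeholders.

```python
# Hedged sketch: receive an identifier seen at an earlier incident location,
# receive it again later, estimate the probability of a subsequent incident
# with a trained model, and act on it if the probability is high enough.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Sighting:
    identifier: str          # e.g., a device MAC address or a license plate ID
    location: str
    timestamp: datetime

def build_features(history: list[dict], now: datetime) -> list[float]:
    """Turn the identifier's incident history into a numeric feature vector."""
    prior_incidents = len(history)
    days_since_last = min(((now - h["timestamp"]).days for h in history), default=365)
    return [float(prior_incidents), float(days_since_last)]

def assess_subsequent_incident(sighting: Sighting, history: list[dict], model) -> float:
    """Return the model's estimated probability of a subsequent incident."""
    features = [build_features(history, sighting.timestamp)]
    return float(model.predict_proba(features)[0][1])

def maybe_take_preventative_action(probability: float, threshold: float = 0.7) -> None:
    if probability >= threshold:
        # Placeholder for a preventative action, e.g., alerting a user or law enforcement.
        print(f"Preventative action triggered (p={probability:.2f})")

# Usage (the trained model and incident history sources are hypothetical):
# p = assess_subsequent_incident(sighting, incident_history, trained_model)
# maybe_take_preventative_action(p)
```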
  • The term “couple” and its derivatives refer to any direct or indirect communication between two or more elements, whether or not those elements are in physical contact with one another.
  • The terms “transmit” and “communicate,” as well as derivatives thereof, encompass both direct and indirect communication.
  • The term “or” is inclusive, meaning and/or.
  • The term “controller” means any device, system or part thereof that controls at least one operation. Such a controller may be implemented in hardware or a combination of hardware and software and/or firmware. The functionality associated with any particular controller may be centralized or distributed, whether locally or remotely.
  • The phrase “at least one of,” when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one item in the list may be needed.
  • “at least one of: A, B, and C” includes any of the following combinations: A, B, C, A and B, A and C, B and C, and A and B and C.
  • various functions described below can be implemented or supported by one or more computer programs, each of which is formed from computer readable program code and embodied in a computer readable medium.
  • The terms “application” and “program” refer to one or more computer programs, software components, sets of instructions, procedures, functions, objects, classes, instances, related data, or a portion thereof adapted for implementation in a suitable computer readable program code.
  • computer readable program code includes any type of computer code, including source code, object code, and executable code.
  • computer readable medium includes any type of medium capable of being accessed by a computer, such as read only memory (ROM), random access memory (RAM), a hard disk drive, a compact disc (CD), a digital video disc (DVD), solid state drives (SSDs), flash, or any other type of memory.
  • a “non-transitory” computer readable medium excludes wired, wireless, optical, or other communication links that transport transitory electrical or other signals.
  • a non-transitory computer readable medium includes media where data can be permanently stored and media where data can be stored and later overwritten, such as a rewritable optical disc or an erasable memory device.
  • MAC address may refer to a MAC, international mobile subscriber identity (IMSI), mobile station international subscriber directory number (MSISDN), enhanced network selection (ENS), or any other form of unique identifying number.
  • FIGURE 1 illustrates a high-level component diagram of an illustrative system architecture, according to certain embodiments of this disclosure
  • FIGURE 2 illustrates details pertaining to various components of the illustrative system architecture of FIGURE 1, according to certain embodiments of this disclosure
  • FIGURE 3 illustrates an example method for monitoring vehicle traffic, according to certain embodiments of this disclosure
  • FIGURE 4 illustrates another example method for monitoring vehicle traffic, according to certain embodiments of this disclosure
  • FIGURE 5 illustrates example user interfaces presented on computing devices during monitoring of vehicle traffic, according to certain embodiments of this disclosure
  • FIGURE 6 illustrates another high-level component diagram of an illustrative system architecture including an artificial intelligence engine, according to certain embodiments of this disclosure
  • FIGURE 7 illustrates an example scenario where a preventative action is performed based on a probability of occurrence of a subsequent incident, according to certain embodiments of this disclosure
  • FIGURE 8 illustrates an example method for performing, based on a probability of occurrence of a subsequent incident, a preventative action, according to certain embodiments of this disclosure
  • FIGURE 9 illustrates another example method for performing, based on a probability of occurrence of a subsequent incident, a preventative action, according to certain embodiments of this disclosure
  • FIGURE 10 illustrates an example method of performing one or more preventative actions, according to certain embodiments of this disclosure.
  • FIGURE 11 illustrates an example computer system according to certain embodiments of this disclosure.
  • Improvement is desired in the field of public safety for certain areas (e.g., neighborhood, airport, business park, border checkpoint, city, etc.).
  • There are measures that may be conventionally used, such as gated communities, neighborhood crime watch groups, and so forth.
  • the conventional measures lack efficiency and accuracy in identifying suspicious vehicles / individuals and reporting of the suspicious vehicles / individuals, among other things.
  • the conventional measures may fail to report the suspicious vehicle / individual, altogether.
  • the causes of the inefficient and/or failed reporting may be at least in part attributable to people (e.g., neighbors in a neighborhood) not having access to verified vehicle and/or personal information of an individual.
  • the conventional measures lack the ability to quickly, accurately, and automatically identify the vehicle as a suspicious vehicle, correlate vehicle information (e.g., license plate identifier (ID)), electronic device information (e.g., electronic device identifier (ID)), face information, etc., and/or perform a preventative action based on the identification.
  • a neighbor may witness an unknown vehicle drive through the neighborhood several times within a given time period during a day.
  • The neighbor may not recognize the license plate ID or driver and may think about reporting the unknown vehicle to law enforcement. Instead, the neighbor may decide to proceed to do another activity. Subsequently, the driver may burglarize a house in the neighborhood. Even if the neighbor attempted to look up the license plate ID and was able to find out information about an owner of the vehicle, the neighbor may not be able to determine whether the driver of the vehicle is the actual owner, whether the owner or driver is on a crime watch list, and so forth.
  • the neighbor may not be privy to the electronic device identifier of the electronic device the suspicious individual is carrying or that is installed in the vehicle, which may be used to track the whereabouts of the individual / vehicle in a monitored area. Even if a neighbor obtains an electronic device identifier, there currently is no technique for determining personal information associated with the electronic device identifier. To reiterate, conventional techniques for public safety lack the ability to identify a suspicious vehicle / individual and/or to correlate vehicle information, facial information, and/or electronic device identifiers of electronic devices of the driver to make an informed decision quickly, accurately, and automatically.
  • the present disclosure relates to a system and method for correlating electronic device identifiers with vehicle information.
  • the system may include one or more license plate detection zones, one or more electronic device detection zones, and/or one or more facial detection zones.
  • the zones may be partially or wholly overlapping and there may be multiple zones established that span a desired area (e.g., a neighborhood, a city block, a public / private parking lot, any street, etc.).
  • the license plate detection zones, the electronic device detection zones, and/or the facial detection zones may include devices that are communicatively coupled to one or more computing systems via a network.
  • the license plate detection zones may include one or more cameras configured to capture images of at least license plates on vehicles that enter the license plate detection zone.
  • the electronic device detection zone may include one or more electronic device identification sensors, such as a WiFi signal detection device or a Bluetooth® signal detection device.
  • the electronic device identification sensors may be configured to detect and store WiFi Media Access Control (MAC) addresses, Bluetooth MAC addresses, and/or cellular MAC addresses (e.g., International Mobile Subscriber Identity (IMSI), Mobile Station International Subscriber Directory Number (MSISDN), and Electronic Serial Numbers (ESN)) of electronic devices that enter the electronic device detection zone based on the signals emitted by the electronic devices.
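  • As one hedged illustration of how such an electronic device identification sensor might passively collect WiFi MAC addresses (the patent does not prescribe any specific mechanism), the sketch below uses the scapy library to sniff 802.11 probe-request frames; the monitor-mode interface name is an assumption.

```python
# Hedged sketch: passively record WiFi MAC addresses from 802.11 probe requests.
# Assumes scapy is installed and "wlan0mon" is a wireless interface already in
# monitor mode; running a capture like this typically requires root privileges.
from datetime import datetime
from scapy.all import sniff, Dot11ProbeReq

seen_devices: dict[str, datetime] = {}

def handle_frame(pkt) -> None:
    if pkt.haslayer(Dot11ProbeReq) and pkt.addr2:
        mac = pkt.addr2.lower()              # transmitter MAC of the probing device
        seen_devices[mac] = datetime.now()   # store or refresh the last-seen time

# sniff(iface="wlan0mon", prn=handle_frame, store=False)  # capture indefinitely
```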
  • the facial detection zones may include one or more cameras configured to capture images or digital frames that are used to recognize a face.
  • a MAC address may be any combination of the IDs described herein (e.g., MAC, MSISDN, IMSI, ESN, etc.).
  • the computing system may analyze the images captured by the cameras and detect a license plate identifier (ID) of a vehicle.
  • the license plate ID may be compared with trusted license plate IDs that are stored in a database. When there is not a trusted license plate ID that matches the license plate ID, the computing system may identify the vehicle as a suspicious vehicle. Then, the computing system may correlate the license plate ID of the vehicle with at least one of the stored electronic device identifiers.
  • the license plate ID and the at least one of the stored electronic device identifiers may be correlated with a face of the individual.
  • personal information such as name, address, Bluetooth MAC address, WiFi MAC address, criminal record, whether the suspicious individual is on a crime watch list, etc. may be retrieved using the license plate ID or the at least one of the stored electronic device identifiers that is correlated with the license plate ID of the suspicious vehicle.
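  • A minimal sketch of the trusted-plate check, suspicious-vehicle identification, and correlation described above follows; the in-memory sets and dictionaries stand in for the trusted vehicle license plate IDs database 117 and the personal identification database 119, and all field names are illustrative assumptions.

```python
# Hedged sketch: flag an untrusted plate as suspicious, correlate it with a device
# ID observed in the same zone, and retrieve any stored personal information.
trusted_plate_ids = {"ABC123", "XYZ789"}           # stand-in for database 117

personal_identification_db = {                     # stand-in for database 119, keyed by device ID
    "00:11:22:33:ff:ee": {"name": "John Smith", "phone": "123-456-7890",
                          "on_crime_watch_list": True},
}

def check_and_correlate(plate_id: str, device_ids_in_zone: list[str]) -> dict | None:
    if plate_id in trusted_plate_ids:
        return None                                # trusted vehicle, nothing to do
    correlation = {"plate_id": plate_id, "suspicious": True,
                   "device_ids": device_ids_in_zone}
    for device_id in device_ids_in_zone:
        person = personal_identification_db.get(device_id.lower())
        if person:
            correlation["person"] = person         # attach retrieved personal information
            break
    return correlation

# Example: an untrusted plate seen together with a known device ID.
# print(check_and_correlate("JKL456", ["00:11:22:33:FF:EE"]))
```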
  • the system may include several computer applications that may be accessed by registered users of the system.
  • a client application may be accessed by a computing device of a user, such as a neighbor in a neighborhood implementing the system.
  • the client application may present a user interface including an alert when a suspicious vehicle and/or individual is detected.
  • the user interface may present several preventative actions for the user. For example, the user may contact the suspicious individual using the personal information (e.g., send a threatening text message), notify law enforcement, and so forth.
  • a client application may be accessed by a computing device of a law enforcer.
  • the client application may present a user interface including the notification that a suspicious vehicle and/or individual is detected in the particular zones.
  • In an example neighborhood, license plate detection zones and electronic device detection zones may be placed to cover both lanes at both entrances.
  • a facial detection zone may be placed at the entrances with the other zones.
  • Each vehicle may be correlated with each electronic device that enters the neighborhood. Further, the recognized face may be correlated with the electronic device and the vehicle information.
  • the homeowners inside the neighborhood may set up electronic device detection zones and/or a facial detection zone inside their property to detect electronic device IDs and/or faces and compare them with electronic device IDs and/or faces in a database that stores every correlation that has been made by the system to date (including the most recent correlations of electronic device IDs, faces, and/or vehicles entering the neighborhood).
  • the home owner may be notified via the client application on their computing device if an electronic device and/or face is detected on their property. Further, in some embodiments, the individual associated with the electronic device and/or face may be notified on the electronic device that the homeowner is aware of their presence. If a known criminal with a warrant is detected at either the zones at the entrance or at the zones at the homeowner’s property, the appropriate law enforcement agency may be notified of their whereabouts.
  • the disclosed techniques provide numerous benefits over conventional systems.
  • the system provides efficient, accurate, and automatic identification of suspicious vehicles and/or individuals.
  • the system enables correlating vehicle license plate IDs with electronic device identifiers to enable enhanced detection and/or preventative actions, such as directly communicating with the electronic device of the suspicious individual and/or notifying law enforcement using the client application in real-time or near real-time when the suspicious vehicle enters one or more zones.
  • a correlation may be obtained with a license plate ID to obtain personal information about the owner that enables contacting the owner directly and/or determining whether the owner is a criminal.
  • the client application provides pertinent information pertaining to both the suspicious vehicle and/or individual in a single user interface without the user having to perform any searches of the license plate ID or electronic device identifier.
  • the disclosed techniques reduce processing, memory, and/or network resources by reducing searches that the user may perform to find the information.
  • the disclosed techniques provide an enhanced user interface that presents the suspicious vehicle and/or individual information in a single location, which may improve a user’s experience using the computing device.
  • FIGURES 1 through 6, discussed below, and the various embodiments used to describe the principles of this disclosure are by way of illustration only and should not be construed in any way to limit the scope of the disclosure.
  • FIGURE 1 illustrates a high-level component diagram of an illustrative system architecture 100 according to certain embodiments of this disclosure.
  • the system architecture 100 may include a computing device 102 communicatively coupled to a cloud-based computing system 116, one or more cameras 120, one or more electronic device identification sensors 130, and/or one or more electronic devices 140 of a suspicious individual.
  • the cloud-based computing system 116 may include one or more servers 118.
  • Each of the computing device 102, the servers 118, the cameras 120, the electronic device identification sensors 130, and the electronic device 140 may include one or more processing devices, memory devices, and network interface devices.
  • the electronic device 140 may be referred to as a computing device herein.
  • the electronic device 140 may be a smartphone, a wearable device (e.g., smart watch), a laptop, or any suitable portable electronic device including one or more processing devices, memory devices, and network interface devices.
  • the network interface devices may enable communication via a wireless protocol for transmitting data over short distances, such as Bluetooth, ZigBee, etc. Additionally, the network interface devices may enable communicating data over long distances, and in one example, the computing device 102 may communicate with a network 112.
  • Network 112 may be a public network (e.g., connected to the Internet via wired (Ethernet) or wireless (WiFi)), a private network (e.g., a local area network (LAN) or wide area network (WAN)), or a combination thereof.
  • the computing device 102 may be any suitable computing device, such as a laptop, tablet, smartphone, or computer.
  • the computing device may be configured to execute a client application 104 that presents a user interface.
  • the client application 104 may be implemented in computer instructions stored on one or more memory devices and executed by one or more processing devices of the computing device 102.
  • the client application 104 may be a standalone application installed on the computing device 102 or may be an application that is executed by another application (e.g., a website in a web browser).
  • the computing device 102 may include a display that is capable of presenting the user interface of the client application 104.
  • the user interface may present various screens to a user depending on what type of user is logged into the client application 104. For example, a user, such as a neighbor or person interested in a particular license plate detection zone 122 and/or electronic device detection zone 132, may be presented with a user interface for logging into the system where the user enters credentials (username and password), a user interface that displays alerts of suspicious vehicles and/or individuals in the zones 122 and/or 132 where the user interface includes options for preventative actions, a user interface that presents logged events over time, and so forth.
  • the client application 104 may enable the user to directly contact (e.g., send text message, send email, call) the electronic device 140 of a suspicious individual 142 using personal information obtained about the individual 142.
  • Another user, such as a law enforcer, may be presented with a user interface for logging into the system where the user enters credentials (username and password), and a user interface that displays notifications when the user selects to notify law enforcement, where the notifications may include information related to the suspicious vehicle and/or individual 142.
  • the cameras 120 may be located in the license plate detection zones 122. Although just one camera 120 and one license plate detection zone 122 are depicted, it should be noted that any suitable number of cameras 120 may be located in any suitable number of license plate detection zones 122.
  • For example, multiple license plate detection zones 122 may be used to cover a desired area.
  • a license plate detection zone 122 may refer to an area of coverage that is within the cameras’ 120 field of view.
  • the cameras 120 may be any suitable camera and/or video camera capable of capturing a set of images 123 that at least represent license plates of a vehicle 126 that enters the license plate detection zone 122.
  • the set of images 123 may be transmitted by the camera 120 to the cloud-based computing system 116 and/or the computing device 102 via the network 112.
  • the electronic device identification sensors 130 may be located in the electronic device detection zones 132.
  • the license plate detection zone 122 and the electronic device detection zone 132-1 may partially or wholly overlap.
  • the combination of license plate detection zones 122 and the electronic device detection zones 132 may be set up at entrances / exits to certain areas, and/or any other suitable area in a monitored area, to correlate each vehicle's information with respective electronic device identifiers 133 of electronic devices 140 being carried in respective vehicles 126.
  • Each of the license plate detection zones 122 and electronic device detection zones 132 may have unique geographic identifiers so the data can be tracked by location. It should be noted that any suitable number of electronic device identification sensors 130 may be located in any suitable number of electronic device detection zones 132.
  • multiple electronic device detection zones 132 may be used to cover a desired area.
  • An electronic device detection zone 132 may refer to an area of coverage that is within the electronic device identification sensor 130 detection area.
  • an electronic device detection zone 132-2 and/or a facial detection zone 150 may be set up at a home of a homeowner, such that an electronic device 140 and/or a face of a suspicious individual 142 may be detected and stored when the suspicious individual 142 enters the zone 132-2.
  • the electronic device ID 133 and/or an image of the face may be transmitted to the cloud-based computing system 116 or the computing device 102 via the network 112.
  • the suspicious individual 142 may be contacted on their electronic device 140 with a message indicating the homeowner is aware of their presence and to leave the premises.
  • If a known criminal individual 142 with a warrant is detected at the combination of zones 122 and 132-1 at an entrance or at the zones 132-2 and 150 at the home, then the proper law enforcement agency may be contacted with the whereabouts of the individual 142.
  • the cameras 120 may be located in the facial detection zones 150. Although just one camera 120 and one facial detection zone 150 are depicted, it should be noted that any suitable number of cameras 120 may be located in any suitable number of facial detection zones 150. For example, multiple facial detection zones 150 may be used to cover a desired area. A facial detection zone 150 may refer to an area of coverage that is within the cameras’ 120 field of view.
  • the cameras 120 may be any suitable camera and/or video camera capable of capturing a set of images 123 that at least represent faces of an individual 142 that enters the facial detection zone 150.
  • the set of images 123 may be transmitted by the camera 120 to the cloud-based computing system 116 and/or the computing device 102 via the network 112.
  • the cloud-based computing system 116 and/or the computing device 102 may perform facial recognition by comparing a face detected in the image to a database of faces to find a match and/or perform biometric artificial intelligence that may uniquely identify an individual 142 by analyzing patterns based on the individual’s facial textures and shape.
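  • One hedged way to perform the face comparison described above is to compare embedding vectors; in the sketch below the embedding function is a placeholder for whatever face-recognition model the system uses, and the 0.6 distance threshold is an illustrative assumption.

```python
# Hedged sketch: match a detected face against stored faces by embedding distance.
import numpy as np

known_faces: dict[str, np.ndarray] = {}  # person name -> stored face embedding

def embed_face(image) -> np.ndarray:
    """Placeholder: run a face-embedding model on the image and return a vector."""
    raise NotImplementedError

def match_face(embedding: np.ndarray, threshold: float = 0.6) -> str | None:
    """Return the closest known identity if its embedding is within the threshold."""
    best_name, best_distance = None, float("inf")
    for name, known in known_faces.items():
        distance = float(np.linalg.norm(embedding - known))
        if distance < best_distance:
            best_name, best_distance = name, distance
    return best_name if best_distance <= threshold else None
```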
  • the electronic device identification sensors 130 may be configured to detect a set of electronic device IDs 133 (e.g., WiFi MAC addresses, Bluetooth MAC addresses, and/or cellular MAC addresses) of electronic devices 140 within the electronic device detection zone 132. As depicted, the electronic device 140 of a suspicious individual is within the vehicle 126 passing through the electronic device detection zone 132.
  • the electronic device identification sensors 130 may be any suitable WiFi signal detection device capable of detecting WiFi MAC addresses and/or Bluetooth signal detection device capable of detecting Bluetooth MAC addresses of electronic devices 140 that enter the electronic device detection zone 132.
  • the set of images 123 may be transmitted by the camera 120 to the cloud-based computing system 116 and/or the computing device 102 via the network 112.
  • the electronic device identification sensor 130 may store the set of electronic device IDs 133 locally in a memory.
  • the electronic device identification sensor 130 may also transmit the set of electronic device IDs 133 to the cloud-based computing system 116 and/or the computing device 102 via the network 112 for storage.
  • the cloud-based computing system 116 may include the one or more servers 118 that form a distributed computing architecture.
  • Each of the servers 118 may be any suitable computing system and may include one or more processing devices, memory devices, data storage, and/or network interface devices.
  • the servers 118 may be in communication with one another via any suitable communication protocol.
  • the servers 118 may each include at least one trusted vehicle license plate IDs database 117 and at least one personal identification database 119.
  • the databases 117 and 119 may be stored on the computing device 102.
  • the trusted vehicle license plate IDs database 117 may be populated by a processing device adding license plate IDs of vehicles that commonly enter the license plate detection zone 122. In some embodiments, the trusted vehicle license plate IDs database 117 may be populated at least in part by manual entry of license plate IDs associated with vehicles trusted to be within the license plate detection zone 122. These license plate IDs may be associated with vehicles owned by neighbors in a neighborhood, or family members of the neighbors, friends of the neighbors, visitors of the neighbors, contractors hired by the neighbors, any suitable person that is trusted, etc.
  • the personal identification database 119 may be populated by a processing device adding personal identification information associated with electronic device IDs 133 of electronic devices carried by people that commonly enter the electronic device detection zone 132 (e.g., a list of trusted electronic device IDs). In some embodiments, the personal identification database 119 may be populated at least in part by manual entry of personal identification information associated with electronic device IDs 133 associated with electronic devices 140 trusted to be within the electronic device detection zone 132 (e.g., a list of trusted electronic device IDs). These electronic device IDs 133 may be associated with electronic devices 140 owned by neighbors in a neighborhood, or family members of the neighbors, friends of the neighbors, visitors of the neighbors, contractors hired by the neighbors, etc. Further, in some embodiments, the personal identification database 119 may be populated by entering a list of known suspect individuals from the police department, people entering or exiting border checkpoints, etc.
  • the personal identification information for untrusted electronic device IDs may also be entered into the personal identification database 119.
  • the personal identification database 119 may also be populated by a processing device adding personal identification information associated with electronic device IDs 133 of electronic devices carried by people that commonly enter the facial detection zone 150 (e.g., face images of trusted individuals).
  • the personal identification information may include names, addresses, faces, email addresses, phone numbers, electronic device identifiers associated with electronic devices owned by the people (e.g., Bluetooth MAC addresses, WiFi MAC addresses), correlated license plate IDs with the electronic device identifiers, etc.
  • the correlations between the license plate IDs, the electronic device identifiers, and/or the faces may be performed by a processing device using the data obtained from the cameras 120 and the electronic device identification sensors 130. Some of this information may be obtained from public sources, phone books, the Internet, and/or companies that distribute electronic devices.
  • the personal identification information added to the personal identification database 119 may be associated with people selected based on their residing in or near a certain radius of a geographic region where the zones 122 and/or 132 are set up, based on whether they are on a crime watch list, or the like.
  • FIGURE 2 illustrates details pertaining to various components of the illustrative system architecture 100 of FIGURE 1, according to certain embodiments of this disclosure.
  • the camera 120 includes an image capturing component 200;
  • the electronic device identification sensor 130 includes an electronic device ID detecting and storing component 202;
  • the server 118 includes a license plate ID detecting component 204, a license plate ID comparing component 206, a suspicious vehicle identifying component 208, and a correlating component 210.
  • the components 204, 206, 208, and 210 may be included in the computing device 102 executing the client application 104.
  • Each of the components 200, 202, 204, 206, 208, and 210 may be implemented in computer instructions stored on one or more memory devices of their respective device and executed by one or more processors of their respective device.
  • the component 200 may be configured to capture a set of images 123 within a license plate detection zone 122. At least some of the captured images 123 may represent license plates of a set of vehicles 126 appearing within the field of view of the cameras 120.
  • the image capturing component 200 may configure one or more camera properties (e.g., zoom, focus, etc.) to obtain a clear image of the license plates.
  • the image capturing component 200 may implement various techniques to extract the license plate ID from the images 123, or the image capturing component 200 may transmit the set of images 123, without analyzing the images 123, to the server 118 via the network 112.
  • the component 202 may be configured to detect and store a set of electronic device IDs 133 of electronic devices located within one or more electronic device detection zones 132.
  • the electronic device ID detecting and storing component 202 may detect a WiFi signal, cellular signal, and/or a Bluetooth signal from the electronic device and be capable of obtaining the WiFi MAC address, cellular MAC address, and/or Bluetooth MAC address of the electronic device from the signal.
  • the electronic device IDs 133 may be stored locally in memory on the electronic device identification sensor 130, and/or transmitted to the server 118 and/or the computing device 102 via the network 112.
  • the component 204 may be configured to detect, using the set of images 123, a license plate ID of a vehicle 126.
  • the license plate ID detecting component 204 may perform optical character recognition (OCR), or any suitable identifier / text extraction technique, on the set of images 123 to detect the license plate IDs.
  • the component 206 may be configured to compare the license plate ID of the vehicle to a trusted vehicle license plate ID database 117.
  • the license plate ID comparing component 206 may compare the license plate ID with each trusted license plate ID in the trusted vehicle license plate ID database 117.
  • the component 208 may identify the vehicle 126 as a suspicious vehicle 126, the identification based at least in part on the comparison of the license plate ID of the vehicle 126 to the trusted vehicle license plate ID database 117. If there is not a trusted license plate ID that matches the license plate ID of the vehicle 126, then the suspicious vehicle identifying component 208 may identify the vehicle as a suspicious vehicle.
  • the component 210 may be configured to correlate the license plate ID of the vehicle 126 with at least one of the set of stored electronic device IDs 133. Correlating the license plate ID of the vehicle 126 with at least one of the set of stored electronic device IDs 133 may include comparing one or more time stamps of the set of captured images 123 with one or more time stamps of the set of stored electronic device IDs 133.
  • correlating the license plate ID of the vehicle 126 with at least one of the set of stored electronic device IDs 133 may include analyzing at least one of: (i) at least one strength of signal associated with at least one of the set of stored electronic device IDs 133, and (ii) at least one visually estimated distance of at least one vehicle 126 associated with at least one of the set of stored images 123.
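  • A minimal sketch of that time-stamp and signal-strength correlation follows; the 10-second pairing window, the record fields, and the use of RSSI as a tie-breaker are illustrative assumptions rather than values taken from the patent.

```python
# Hedged sketch: pair a license plate detection with device IDs detected around
# the same moment, preferring stronger signals (more likely inside the vehicle).
from datetime import datetime, timedelta

def correlate_by_timestamp(plate_event: dict, device_events: list[dict],
                           window: timedelta = timedelta(seconds=10)) -> list[str]:
    candidates = [e for e in device_events
                  if abs(e["timestamp"] - plate_event["timestamp"]) <= window]
    candidates.sort(key=lambda e: e.get("rssi", -100), reverse=True)  # strongest first
    return [e["device_id"] for e in candidates]

# Hypothetical usage:
# plate = {"plate_id": "ABC123", "timestamp": datetime(2020, 9, 30, 12, 0, 5)}
# devices = [{"device_id": "00:11:22:33:ff:ee",
#             "timestamp": datetime(2020, 9, 30, 12, 0, 7), "rssi": -52}]
# print(correlate_by_timestamp(plate, devices))
```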
  • FIGURE 3 illustrates an example method 300 for monitoring vehicle traffic, according to certain embodiments of this disclosure.
  • the method 300 may be performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), firmware, software, or a combination thereof.
  • the method 300 and/or each of its individual functions, subroutines, or operations may be performed by one or more processors of one or more of the devices in FIGURE 1 (e.g., computing device 102, cloud-based computing system 116 including servers 118, cameras 120, electronic device identification sensors 130) implementing the method 300.
  • a computing system may refer to the computing device 102 or the cloud-based computing system 116.
  • the method 300 may be implemented as computer instructions that, when executed by a processing device, execute the operations.
  • the method 300 may be performed by a single processing thread.
  • the method 300 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the method 300.
  • a set of images 123 may be captured, using at least one camera 120, within a license plate detection zone 122. At least some of the set of images 123 may represent license plates of a set of vehicles 126 appearing within the camera’s field of view.
  • One or more camera properties (e.g., zoom, focus, etc.) may be configured to enable the at least one camera 120 to obtain clear images 123 of the license plates.
  • a set of electronic device identifiers 133 of electronic devices 140 located within one or more electronic device detection zones 132 may be detected and stored using an electronic device identification sensor 130.
  • the electronic device identification sensor 130 may include at least one of a WiFi signal detection device, cellular signal detection device, or a Bluetooth signal detection device.
  • the set of electronic device identifiers 133 may include at least one of a Bluetooth MAC address, cellular MAC address, or a WiFi MAC address.
  • at least one of the set of stored electronic device identifiers 133 may be compared with a list of trusted device identifiers.
  • a license plate ID of a vehicle 126 may be detected using the set of images 123.
  • the images 123 may be filtered, rendered, and/or processed in any suitable manner such that the license plate IDs may be clearly detected using the set of images 123.
  • Optical character recognition (OCR) may be used to detect the license plate IDs in the set of images 123.
  • the OCR may electronically convert each image in the set of images 123 of the license plate IDs into computer-encoded license plate IDs that may be stored and/or used for comparison.
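  • As one hedged example of the OCR step, the sketch below uses OpenCV for basic preprocessing and pytesseract as the OCR backend; the thresholding choices and the tesseract configuration string are illustrative assumptions, and other OCR engines would work equally well.

```python
# Hedged sketch: extract a license plate ID from a cropped plate image with OCR.
import cv2
import pytesseract

def read_plate_id(image_path: str) -> str:
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)                  # simplify to grayscale
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # high-contrast characters
    text = pytesseract.image_to_string(
        binary,
        config="--psm 7 -c tessedit_char_whitelist=ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789",
    )
    return "".join(ch for ch in text if ch.isalnum()).upper()       # computer-encoded plate ID

# plate_id = read_plate_id("captured_plate.jpg")  # e.g., "ABC123"
```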
  • a face of the individual 142 may be detected by a camera 120 in the facial detection zone 150.
  • An image 123 may be captured by the camera 120 and facial recognition may be performed on the image to detect the face of the individual.
  • the detected face and/or the image 123 may be transmitted to the cloud-based computing system 116 and/or the computing device 102.
  • the license plate ID of the vehicle 126 may be compared to a database of trusted vehicle license plate IDs.
  • the database 117 of trusted vehicle license plate IDs may be populated at least in part by adding license plate IDs of vehicles 126 that commonly enter the license plate detection zone 122 to the database 117 of trusted vehicle license plate IDs.
  • the database 117 of trusted vehicle license plate IDs may be populated at least in part by manual entry of license plate IDs associated with vehicles 126 trusted to be within the license plate detection zone 122.
  • the trusted vehicles may belong to the neighbors, family members of the neighbors, friends of the neighbors, law enforcement, and so forth.
  • the vehicle may be identified as a suspicious vehicle 126.
  • the identification may be based at least in part on the comparison of the license plate ID of the vehicle to the database 117 of trusted vehicle license plate IDs. For example, if the license plate ID is not matched with a trusted license plate ID stored in the database 117 of trusted vehicle license plate IDs, then the vehicle associated with the license plate ID may be identified as a suspicious vehicle 126.
  • the license plate ID of the vehicle 126 may be correlated with at least one of the set of stored electronic device identifiers 133.
  • the face of the individual 142 may also be correlated with the license plate ID and the at least one of the set of stored electronic device identifiers 133.
  • at least one personal identification database 119 may be accessed.
  • correlating the license plate ID of the vehicle 126 with at least one of the set of stored electronic device identifiers 133 may include comparing one or more time stamps of the set of captured images 123 with one or more time stamps of the set of stored electronic device identifiers 133.
  • correlating the license plate ID of the vehicle 126 with the at least one of the set of stored electronic device identifiers 133 may include analyzing at least one of (i) at least one strength of signal associated with at least one of the set of stored electronic device identifiers 133, and (ii) at least one visually estimated distance of at least one vehicle associated with at least one of the set of stored images 123.
  • Personal identification information of at least one suspicious individual may be retrieved from the at least one personal identification database 119 by correlating information of the personal identification database 119 with the license plate ID of the vehicle 126 or at least one of the set of electronic device identifiers 133 correlated with the license plate ID of the vehicle 126.
  • the personal identification information may also be obtained using a face detected by the camera 120 to obtain the electronic device ID 133 and/or the license plate ID correlated with the face.
  • the personal identification information may include one or more of a name, a phone number, an email address, a residential address, a Bluetooth MAC address, a cellular MAC address, a WiFi MAC address, whether the suspicious individual is on a crime watch list, a criminal record of the suspicious individual, and so forth.
  • a user interface may be displayed on one or more computing devices 102 of one or more neighbors when the one or more computing devices are executing the client application 104, and the user interface may present a notification or alert.
  • the computing device 102 may present a push notification on the display screen and the user may provide user input (e.g., swipe the push notification) to expand the notification on the user interface to a larger portion of the display screen.
  • the alert or notification may indicate that there is a suspicious vehicle 126 identified within the zones 122 and/or 132 and may provide information pertaining to the vehicle 126 (e.g., make, model, color, license plate ID, etc.) and personal identification information of the suspicious individual (e.g., name, phone number, email address, Bluetooth MAC address, cellular MAC address, WiFi MAC address, whether the individual is on a crime watch list, whether the individual has a criminal record, etc.).
  • the user interface may present one or more options to perform preventative actions.
  • the preventative actions may include contacting an electronic device 140 of the suspicious individual using the personal identification information.
  • a user may use a computing device 102 to transmit a communication (e.g., at least one text message, phone call, email, or some combination thereof) to the suspicious individual using the retrieved personal information.
  • the preventative actions may also include notifying law enforcement of the suspicious vehicle and/or individual. This preventative action may be available if it is determined that the suspicious individual is on a crime watch list.
  • a suspicious vehicle profile may be created.
  • the suspicious vehicle profile may include the license plate ID of the suspicious vehicle and/or at least one correlated electronic device identifier (e.g., Bluetooth MAC address, WiFi MAC address).
  • the user may select the notify law enforcement option on the user interface and the computing device 102 of the user may transmit the suspicious vehicle profile to another computing device 102 of a law enforcement entity that may be logged into the client application 104 using a law enforcement account.
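  • A minimal sketch of such a suspicious vehicle profile, as it might be assembled and serialized before transmission to a law-enforcement account, follows; the field names, the JSON format, and the send_to_law_enforcement call are hypothetical.

```python
# Hedged sketch: build and serialize a suspicious vehicle profile for notification.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class SuspiciousVehicleProfile:
    license_plate_id: str
    electronic_device_ids: list[str] = field(default_factory=list)  # e.g., Bluetooth/WiFi MACs
    detected_zone: str = ""
    on_crime_watch_list: bool = False

def to_notification_payload(profile: SuspiciousVehicleProfile) -> str:
    """Serialize the profile for transmission to a law-enforcement client application."""
    return json.dumps(asdict(profile))

# profile = SuspiciousVehicleProfile("ABC123", ["00:11:22:33:FF:EE"], "Entrance A", True)
# send_to_law_enforcement(to_notification_payload(profile))  # hypothetical transport call
```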
  • the preventative action may include activating an alarm upon detection of the suspicious vehicle 126.
  • the alarm may be located in the neighborhood, for example, on a light pole, a tree, a pole, a sign, a mailbox, a fence, or the like.
  • the alarm may be included in the computing device 102 of a user (e.g., a neighbor) using the client application.
  • the alarm may include auditory (e.g., a message about the suspect, a sound, etc.), visual (e.g., flash certain colors of lights), and/or haptic (e.g., vibrations) elements.
  • The severity of the alarm may change the pattern of auditory, visual, and/or haptic elements based on the kinds of crimes the suspicious individual has committed, whether the suspicious vehicle 126 is stolen, whether the suspicious vehicle 126 matches a description of a vehicle involved in an Amber alert, and so forth.
  • FIGURE 4 illustrates another example method 400 for monitoring vehicle traffic, according to certain embodiments of this disclosure.
  • Method 400 includes operations performed by one or more processing devices of one or more devices in FIGURE 1 (e.g., computing device 102, cloud-based computing system 116 including servers 118, cameras 120, electronic device identification sensors 130) implementing the method 400.
  • one or more operations of the method 400 are implemented in computer instructions that, when executed by a processing device, execute the operations of the steps.
  • the method 400 may be performed in the same or a similar manner as described above in regards to method 300.
  • the method 400 may begin with a setup phase where various steps 402, 404, 406, and/or 408 are performed to register data that may be used to determine whether a vehicle and/or individual is suspicious.
  • law evidence may be registered.
  • the law evidence may be obtained from a system of a law enforcement agency.
  • Application programming interface (API) operations may be executed to obtain the law evidence.
  • the law evidence may indicate whether a person is on a crime watch list 410, whether the person has a warrant, whether the person has a criminal record, and/or the WiFi / Bluetooth MAC data (address) / cellular data of electronic devices involved in incidents, as well as the owner information 412 of the electronic devices.
  • the crime watch list 410 information may be used to store a crime watch list 414 in a database (e.g., personal identification database 119).
  • license plate registration (LPR) data may be collected using the one or more cameras 120 in the license plate detection zones 122 as LPR raw data 416.
  • the LPR raw data 416 may be used to obtain vehicle owner information (e.g., name, address, phone number, email address) and vehicle information (e.g., license plate ID, make, model, color, year, etc.).
  • the LPR raw data 416 may include at least the license plate ID, which may be used to search the Department of Motor Vehicles (DMV) to obtain the vehicle owner information and/or vehicle information.
  • the LPR raw data 416 may be collected from manual entry.
  • WiFi MAC addresses may be collected from various sources as WiFi MAC raw data 418.
  • the WiFi MAC raw data 418 may be collected from the electronic device identification sensors 130 in the electronic device detection zones 132.
  • trusted WiFi MAC addresses may be manually obtained from certain people owning electronic devices in an area covered by the electronic device detection zones 132 and stored in a database (e.g., personal identification database 119).
  • Similarly, cellular raw data (e.g., cellular MAC addresses) may be collected from the electronic device identification sensors 130 in the electronic device detection zones 132.
  • Bluetooth MAC addresses may be collected from various sources as raw data 420.
  • the Bluetooth MAC raw data 420 may be collected from the electronic device identification sensors 130 in the electronic device detection zones 132.
  • trusted Bluetooth MAC addresses may be manually obtained from certain people owning electronic devices in an area covered by the electronic device detection zones 132 and stored in a database (e.g., personal identification database 119).
  • the Bluetooth MAC addresses may be collected from the electronic device identification sensors 130 at the electronic device detection zones 132.
  • face images may be collected by the one or more cameras 120 in the facial detection zones 150. Facial recognition may be performed to detect and recognize faces in the face images.
  • The LPR raw data 416, the WiFi MAC raw data 418, the Bluetooth MAC raw data 420, the cellular raw data, and/or the face raw data 451 may be correlated or paired to generate matched data 424. That is, the data from license plate ID detection, LPR systems, personal electronic device detection, and/or facial information may be combined to generate matched data 424 and stored in the database 117 and/or 119.
  • the license plate IDs are compared to the database 117 of trusted vehicle license plate IDs to determine whether the detected license plate ID is in the trusted vehicle license plate ID database 117. If not, the vehicle 126 may be identified as a suspicious vehicle and the license plate ID of the vehicle may be correlated with at least one of the set of stored electronic device IDs 133. This may result in creation of a database of detected electronic device identifiers 133 correlated with license plate IDs and facial information of individuals. Any unpaired data may be discarded after unsuccessful pairing.
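  • The pairing step might look like the hedged sketch below, which groups plate, device, and face detections that occur close together in time into one matched record and drops anything that never pairs; the 15-second grouping window and record shapes are illustrative assumptions.

```python
# Hedged sketch: merge plate, device, and face detections into matched records.
from datetime import timedelta

def pair_detections(plate_events: list[dict], device_events: list[dict],
                    face_events: list[dict],
                    window: timedelta = timedelta(seconds=15)) -> list[dict]:
    matched = []
    for plate in plate_events:
        def near(e: dict) -> bool:
            return abs(e["timestamp"] - plate["timestamp"]) <= window
        devices = [e["device_id"] for e in device_events if near(e)]
        faces = [e["face_id"] for e in face_events if near(e)]
        if devices or faces:   # only keep records that actually paired with something
            matched.append({"plate_id": plate["plate_id"], "device_ids": devices,
                            "face_ids": faces, "timestamp": plate["timestamp"]})
    return matched             # unpaired detections are simply not carried forward
```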
  • owner data of the electronic devices and/or vehicle may be added to the matched data 424.
  • the owner data may include an owner ID, and/or name, address, and the like.
  • owner’s phone number and email may be added to the matched data.
  • WiFi / Bluetooth MAC / cellular data and owner data 412 from the law evidence may be included with the matched data 424 and the personal information of the owner to generate matched data with owner information 430.
  • the owner ID may be associated with combined personal information (e.g., name, address, phone number, email, etc.), vehicle information (e.g., license plate ID, make, model, color, year, vehicle owner information, etc.), and electronic device IDs 133 (e.g., WiFi MAC address, Bluetooth MAC address).
  • the matched data with owner information 430 may be further processed (e.g., formatted, edited, etc.) to generate matchable data. This may conclude the setup phase.
  • the method 400 may include a monitoring phase. During this phase, the method 400 may include monitoring steps 442, 444, and 445.
  • WiFi MAC address monitoring may include one or more electronic device identification sensors 130 detecting and storing a set of WiFi MAC addresses as WiFi MAC raw data 448.
  • cellular signal monitoring may include one or more electronic device identification sensors 130 detecting and storing a set of cellular MAC addresses as cellular raw data.
  • Bluetooth MAC address monitoring may include one or more electronic device identification sensors 130 detecting and storing a set of Bluetooth MAC addresses as Bluetooth MAC raw data 450.
  • face monitoring may include the one or more cameras 120 capturing face images and recognizing faces in the face images as face raw data 451.
  • the WiFi MAC raw data 448, Bluetooth MAC raw data 450, and/or face raw data 451 may be compared to matchable data 432 at decision block 452.
  • the electronic device IDs 133 and/or faces detected by the electronic device identification sensors 130 and/or the cameras 120 may be compared to the matchable data 432.
  • the matchable data 432 may include personal identification information that is retrieved from at least the personal identification database 117. That is, the detected electronic device IDs 133 and/or faces may be compared to the database 117 and/or 119 to find any correlation of the detected electronic device IDs 133 and/or faces with license plate IDs.
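  • One way the comparison at decision block 452 could be realized is sketched below; the dictionary-based lookup and field names are assumptions, not the disclosed implementation.

```python
def find_matches(detected_ids, detected_faces, matchable_data):
    """Compare detected electronic device IDs and face identifiers against the
    matchable data (keyed by identifier) and return any correlated records."""
    matches = []
    for identifier in list(detected_ids) + list(detected_faces):
        record = matchable_data.get(identifier)
        if record is not None:
            matches.append({"identifier": identifier, **record})
    return matches

# Illustrative usage:
matchable = {"00:11:22:33:FF:EE": {"license_plate_id": "ABC123", "owner": "John Smith"}}
print(find_matches(["00:11:22:33:FF:EE"], [], matchable))
```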
  • a suspicious vehicle 126 / individual 142 may be detected.
  • the detected match event may be logged.
  • the user interface of the client application 104 executing on the computing device 102 may present an alert of the suspicious vehicle 126 / individual 142.
  • the detected notification event may be logged.
  • the electronic device 140 of the suspicious individual 142 may be notified that his presence is known (e.g., taunted).
  • the taunting event may be logged.
  • the crime watch list 414 may be used to determine if the identified individual 142 is on the crime watch list 414 using the individual’s personal information. If the individual 142 is on the watch list 414, then at block 462, the appropriate law enforcement agency may be notified. At block 456, the law enforcement agency notification event may be logged.
  • FIGURE 5 illustrates example user interfaces presented on computing devices during monitoring of vehicle traffic, according to certain embodiments of this disclosure.
  • a user interface 500 may present vehicle information and electronic device information in a single user interface.
  • a notification may be presented on the user interface 500 of the client application 104 executing on the computing device 102 of a user (e.g., homeowner, neighbor, interested citizen).
  • the notification includes an alert displaying vehicle information and electronic device information.
  • the vehicle information includes the “Make: Jeep”, “Model: Wrangler”, “License Plate ID: ABC123”.
  • the electronic device information includes “Electronic Device ID: 00:11:22:33:FF:EE”, “Belongs to: John Smith”, “Phone Number: 123-456-7890”. Further, the user interface 500 presents that the owner has a warrant out for his arrest.
  • the notification event may be logged in the database 117 / 119 or any suitable database of the system 100.
  • the user interface 500 includes various preventative action options represented by user interface elements 502 and 504.
  • user interface element 502 may be associated with contacting the detected suspicious individual 142 directly.
  • the user may be able to send a text message to the electronic device 140 of the suspicious individual 142.
  • the text message may read “Please leave the area immediately, or I will contact law enforcement.”
  • any suitable message may be sent.
  • the message / taunting event may be logged in the database 117 / 119 or any suitable database of the system 100.
  • the user interface element 504 may be displayed that provides the option to notify law enforcement. Upon selection of the user interface element 504, a notification may be transmitted to a computing device 102 of a law enforcement agency.
  • the notification may include vehicle information (e.g., “License Plate ID: ABC 123”), electronic device information (e.g., “Electronic Device ID: 00:11:22:33:FF:EE”), as well as location of the detection (e.g., “Geographic Location: latitude 47.6° North and longitude 122.33° West”), and personal information (“Name: John Smith”, “Phone Number: 123-456-7890”, a face of the individual 142).
  • the law enforcement agency notification event may be logged.
  • the data tables may include: Client and ID Tables (logID, loginAttempts, clientUser, lawUser, billing), Data Site Info (monitoredSites, dataSites, dataGroups), Raw Collection Data (rawWiFiDataFound, rawBTDataFound, rawLPRDataFound, pairedData), Monitor Data Raw & Matched (monWiFiDataDetected, monBTDataDetected, monWiFiDataMatched, monBTDataMatched), Subject Data (subjectMatch, subjectInfo, subjectLastSeen, criminalWatchList), Notification Logs (subNotifyLog, subNotifyReplyLog, clientNotifyLog).
  • Table 1 logID is used for login ID/passwords, authentication and password resets
  • Table 4 lawUser includes information for law enforcement personnel wanting to be notified of suspicious vehicles 126 / individuals 142.
  • monitoredSites includes information for WiFi / Bluetooth monitoring for detection, among other things.
  • Table 7 dataSites includes information for WiFi / Bluetooth / License Plate Registration detection sites. These sites may supply data to databases, among other things.
  • Table 8 dataGroups may group data sites and monitored sites into groupings such as Homeowner Associations, neighborhoods, etc.
  • Table 9 rawWiFiDataFound includes raw data dump for WiFi from detection sites used to look for matches.
  • rawBTDataFound includes raw data dump for Bluetooth from detection sites used to look for matches.
  • rawLPRDataFound may include raw LPR data from detection sites used to look for matches.
  • pairedData includes matched data that may be the correlation between vehicle information (e.g., license plate IDs) and electronic device IDs 133.
  • Table 13 monWiFiDataDetected logs of any MAC address data detected before matching for WiFi.
  • Table 14 monBTDataDetected logs of any MAC address data detected before matching for Bluetooth.
  • Table 15 monWiFiDataMatched logs of any matches monitored sites find on the database for WiFi.
  • Table 16 monBTDataMatched logs of any matches monitored sites find on the database for Bluetooth.
  • Table 17 subjectMatch includes the number of times a subject was detected in monitored sites and data sites.
  • Table 18 subjectInfo includes information obtained for the owner of a licensed vehicle.
  • Table 19 subjectLastSeen includes locations where subject was seen with a timestamp.
  • Table 20 criminalWatchList includes a criminal watch list that is compared to subjects / individuals 142 to determine if they are a criminal and who to notify if found.
  • Table 21 subNotifyLog includes notifications sent to the subject to discourage crime.
  • Table 22 subNotifyReplyLog includes any replies from the subject after notification.
  • clientNotifyLog includes log of notification attempts to the client (e.g., computing device 102 of a user).
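  • As a rough illustration of how two of the tables above could be declared, a sketch using SQLite from Python follows; the column names and types are assumptions inferred from the table descriptions, not the actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # illustrative in-memory database
conn.executescript("""
-- Assumed columns; the disclosure names the tables but not their schemas.
CREATE TABLE pairedData (
    pair_id        INTEGER PRIMARY KEY,
    license_plate  TEXT NOT NULL,   -- license plate ID of the vehicle
    device_mac     TEXT NOT NULL,   -- correlated electronic device ID 133
    zone_id        TEXT,            -- detection zone where the pairing occurred
    paired_at      TEXT             -- ISO-8601 timestamp
);
CREATE TABLE subjectLastSeen (
    subject_id     INTEGER,         -- reference to subjectInfo
    location       TEXT,            -- where the subject was seen
    seen_at        TEXT             -- timestamp of the sighting
);
""")
conn.execute("INSERT INTO pairedData VALUES (1, 'ABC123', '00:11:22:33:FF:EE', '132-2', '2021-01-01T21:00:00')")
print(conn.execute("SELECT * FROM pairedData").fetchall())
```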
  • FIGURE 6 illustrates another high-level component diagram of an illustrative system architecture 600 including an artificial intelligence engine 170, according to certain embodiments of this disclosure.
  • the system architecture 600 of FIGURE 6 is substantially similar to the system architecture 100 of FIGURE 1.
  • the components that are depicted in FIGURE 6 may be described similarly to their like components depicted in FIGURE 1.
  • the additional components in FIGURE 6 include the artificial intelligence engine 170 and the one or more machine learning models 172. Although shown separately from the one or more servers 118, the artificial intelligence engine 170 may be hosted and/or executed by the one or more servers 118.
  • the artificial intelligence engine 170 uses one or more machine learning models 172 to perform at least one of the embodiments disclosed herein.
  • the cloud-based computing system 116 may include a training engine 174 capable of generating the one or more machine learning models 172.
  • the machine learning models 172 may be trained to determine a probability of occurrence of a subsequent event based on one or more identifiers of one or more people, information pertaining to an incident that occurred at a location where the one or more people were present at a particular time, additional information (e.g., criminal record, mugshot, electronic medical record, etc.) pertaining to the one or more people, or some combination thereof. Further, the one or more machine learning models 172 may be trained to determine a preventative action to select and perform based on a severity of a subsequent incident that is determined to occur and/or a probability of occurrence of the subsequent incident.
  • the machine learning model may be trained using training data that indicates certain preventative actions have higher success rates of reducing a probability that the subsequent event occurs.
  • the one or more machine learning models 172 may be generated by the training engine 174 and may be implemented in computer instructions executable by one or more processing devices of the artificial intelligence engine 170, the training engine 174, and/or the servers 118. To generate the one or more machine learning models 172, the training engine 174 may train the one or more machine learning models 172.
  • the training engine 174 may be a rackmount server, a router computer, a personal computer, a portable digital assistant, a smartphone, a laptop computer, a tablet computer, a netbook, a desktop computer, an Internet of Things (IoT) device, any other desired computing device, or any combination of the above.
  • the training engine 174 may be cloud-based, be a real time software platform, include privacy software or protocols, and/or include security software or protocols.
  • the training engine 174 may train the one or more machine learning models 172.
  • the training engine 174 may use a base data set of a set of identifiers (e.g., electronic device IDs associated with the people, license plate numbers of the vehicles registered to the people, images of the people, etc.) of people that were present at a location where an incident occurred at a particular time, information (e.g., type of incident, location of incident, time of incident, criminal or civil, damage to property, harm to people, etc.) pertaining to the incident, and additional information pertaining to the people (e.g., one or more criminal records of the people, one or more mugshots of the people, addresses of the people, electronic medical records of the people, fingerprints of the people, images of the people, ages of the people, names of the people, email addresses associated with the people, phone numbers associated with the people, indications of the people being on a watch list, or some combination thereof) present at the incident at the location at the particular time.
  • the machine learning models 172 may be trained to receive, as input, subsequent identifiers associated with a person, information about incidents, and/or additional information pertaining to the person, and output a probability of occurrence of a subsequent incident. For example, if an identifier of a person is detected as being present at a location where a riot occurs (e.g., incident) with a certain threshold of other identifiers of other people also present at the location where the riot occurs, there may be a correlation between those identifiers and incidents occurring. Accordingly, if the identifier of the person is detected the next night at another location where the other identifiers of the other people are also detected, the probability of a subsequent incident occurring may be high. In other words, the people may be working together to initiate and/or instigate riots. Thus, using the trained machine learning models 172, subsequent incidents may be prevented.
  • the one or more machine learning models 172 may refer to model artifacts created by the training engine 174 using training data that includes training inputs and corresponding target outputs.
  • the training engine 174 may find patterns in the training data wherein such patterns map the training input to the target output and generate the machine learning models 172 that capture these patterns.
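  • A hedged sketch of how the training engine 174 might fit such a model over correlated records is shown below, using scikit-learn's logistic regression as a stand-in; the feature encoding, toy data, and library choice are assumptions, not the disclosed training procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative feature vectors per observation:
# [count of previously co-detected identifiers present, prior criminal record flag,
#  severity of the earlier incident]
X_train = np.array([
    [12, 1, 3],
    [ 1, 0, 1],
    [ 8, 1, 2],
    [ 0, 0, 0],
])
# Target output: whether a subsequent incident actually occurred (toy labels).
y_train = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(X_train, y_train)

# Probability of occurrence of a subsequent incident for a new observation.
probability = model.predict_proba(np.array([[10, 1, 3]]))[0, 1]
print(f"probability of subsequent incident: {probability:.2f}")
```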
  • the training engine 174 may reside on server 118.
  • the artificial intelligence engine 170, the database 117 / 119, and/or the training engine 174 may reside on the computing device 102.
  • the one or more machine learning models 172 may comprise, e.g., a single level of linear or non-linear operations (e.g., a support vector machine [SVM]) or the machine learning models 172 may be a deep network, i.e., a machine learning model comprising multiple levels of non-linear operations.
  • deep networks are neural networks, including generative adversarial networks, convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks (e.g., each neuron may transmit its output signal to the input of the remaining neurons, as well as to itself).
  • the machine learning model may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.
  • Recurrent neural networks include the functionality, in the context of a hidden layer, to process information sequences and store information about previous computations. As such, recurrent neural networks may have or exhibit a “memory.” Recurrent neural networks may include connections between nodes that form a directed graph along a temporal sequence. Keeping and analyzing information about previous states enables recurrent neural networks to process sequences of inputs to recognize patterns (e.g., sequences of detected identifiers and correlations with certain types of incidents). Recurrent neural networks may be similar to Markov chains. For example, Markov chains may refer to stochastic models describing sequences of possible events in which the probability of any given event depends only on the state information contained in the previous event. Thus, Markov chains also use an internal memory to store at least the state of the previous event. These models may be useful in determining causal inference, such as whether an event at a current node changes as a result of the state of a previous node changing.
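  • As an illustration of the recurrent architecture described above, a minimal PyTorch sketch follows; the layer sizes, feature encoding, and use of a GRU are assumptions made for illustration rather than the disclosed model.

```python
import torch
import torch.nn as nn

class IncidentRNN(nn.Module):
    """Toy recurrent model over a sequence of per-timestep feature vectors
    (e.g., nightly counts of matched identifiers), outputting a probability."""
    def __init__(self, input_size: int = 3, hidden_size: int = 16):
        super().__init__()
        self.rnn = nn.GRU(input_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        _, hidden = self.rnn(x)                       # hidden: (1, batch, hidden_size)
        return torch.sigmoid(self.head(hidden[-1]))   # probability per sequence

model = IncidentRNN()
sequence = torch.randn(1, 5, 3)   # batch of 1, 5 timesteps, 3 features per timestep
print(model(sequence))            # e.g., a value near 0.5 for random input
```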
  • FIGURE 7 illustrates an example scenario 700 where a preventative action is performed based on a probability of occurrence of a subsequent incident, according to certain embodiments of this disclosure.
  • the presence of one or more people at a location where an incident, such as a riot, occurs may be an indicator that the one or more people are likely to riot in the future.
  • the probability of a riot occurring again may increase if the same one or more people are determined to be within a certain proximity to each other at a later date. For example, some people work together to incite and/or start riots.
  • These types of people may be associated with additional information (e.g., criminal records) that may further increase the probability that a subsequent incident may occur when the one or more people are gathered.
  • some purposes of the present disclosure are to track certain people using identifiers associated with the people, determine probabilities of occurrences based on the tracked identifiers, information pertaining to incidents, and/or additional information pertaining to the people, and perform preventative actions.
  • the preventative action may include distally controlling other electronic devices to activate, deactivate, actuate, extend, retract, present notifications, etc.
  • Some technical benefits of the present disclosure may include accurate location tracking of people, vehicles, and the like. Further, based on the accurate location tracking, some embodiments may enable determining a probability of occurrence of a subsequent incident using a trained machine learning model, and performing a preventative action based on the probability of occurrence.
  • the preventative action may be performed by the cloud-based computing system 116 using various APIs and/or services of electronic devices, and thus, the present disclosure enables interoperability between electronic devices (e.g., smartphones) of people, a cloud-based computing system 116, and various other electronic devices (e.g., alarm systems, appliances, lights, locks, speakers, computing devices, irrigation systems, etc.).
  • the subsequent incident may be thwarted and/or mitigated (e.g., the amount of personal and property damage lessened relative to what it may have been if the preventative action was not performed).
  • an incident occurred at a location at a first time (e.g., 9 PM January 1st).
  • the location where the incident occurred may be in electronic device detection zone 132-2 in which one or more electronic device identification sensors 130 are installed.
  • the one or more electronic device identification sensors 130 may detect the electronic device 140 of the person 142 in the electronic device detection zone 132-2 at the location where, and the first time when, the incident occurred.
  • An identifier may be transmitted to the cloud-based computing system 116 via the network 112.
  • the identifier may be an electronic device ID 133 (e.g., WiFi MAC address, Bluetooth MAC address, etc.).
  • the one or more electronic device identification sensors 130 may transmit the identifiers (e.g., electronic device IDs 133) associated with each person in the electronic device detection zone 132-2 where the incident occurred.
  • the cloud-based computing system 116 may receive information pertaining to the incident that occurred at the location where the person 142 was present at the first time.
  • the information pertaining to the incident may include a description of the incident that occurred, a type of incident (e.g., criminal, civil, riot, vandalism, looting, drug trafficking, human trafficking, loitering, etc.), a timestamp of the incident, a duration of the incident, a location of the incident, and the like.
  • the identifiers of the people and the information of the incident may be correlated and stored in database 119 and/or 117 for use by the trained machine learning models 172 to continuously update their determinations of probabilities of occurrences of subsequent incidents.
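  • A simple sketch of this correlation step, assuming an in-memory store keyed by identifier (the record fields are illustrative only):

```python
from collections import defaultdict

# Correlation store: identifier -> list of incident records (illustrative structure).
correlated = defaultdict(list)

def correlate(identifiers, incident):
    """Associate every identifier detected at the incident with the incident record,
    so later detections of the same identifier can retrieve the incident history."""
    for identifier in identifiers:
        correlated[identifier].append(incident)

correlate(
    ["00:11:22:33:FF:EE", "AA:BB:CC:DD:EE:FF"],
    {"type": "riot", "location": "zone 132-2", "timestamp": "2021-01-01T21:00:00"},
)
print(correlated["00:11:22:33:FF:EE"])
```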
  • a computing device 702 may provide the information pertaining to the incident to the cloud-based computing system 116 and/or additional information pertaining to the people present at the location of the incident at time T1.
  • the computing device 702 may be associated with any suitable source that provides information pertaining to incidents that occur.
  • the computing device 702 may be associated with a law enforcement agency that provides police reports generated as a result of the incident, mugshots of people detected as being present at the location during the incident, criminal records of people detected as being present at the location during the incident, and so forth.
  • the criminal record may indicate that a person is a three-time convicted felon for armed robbery, and a machine learning model 172 may be trained to output a high probability of occurrence of a subsequent incident at a subsequent time when the person is detected if the person was recently detected as being present at a location where an armed robbery occurred.
  • the computing device 702 may be associated with a healthcare facility that uses an electronic medical record (EMR) system.
  • EMR system may transmit information (e.g., medical records) of people to the cloud-based computing system 116 for use when determining the probability of occurrence of subsequent incidents.
  • the medical record may provide an indication if the person has been diagnosed as having a mental health related condition, which may be a relevant factor when determining whether a subsequent incident may occur.
  • the computing device 702 may be associated with a news broadcasting entity, a social network entity, a social media entity, or the like.
  • the machine learning model 172 may receive the one or more identifiers (e.g., electronic device IDs 133) associated with the people present at the incident at the first time, the information pertaining to the incident, and/or the additional information pertaining to the people.
  • the cloud-based computing system 116 may correlate the identifiers, information pertaining to the incident, and/or the additional information pertaining to the people and store the correlated data in a database.
  • the machine learning models may be trained to determine probabilities of occurrences of subsequent incidents using the correlated data (e.g., the identifiers associated with the people, information pertaining to the incident, additional information pertaining to the people, etc.).
  • the identifier of the electronic device 140 may be detected in the electronic device detection zone 132-2 again. In some embodiments, there may be a threshold number of the same identifiers detected in the electronic device detection zone 132-2 as were detected the night before when the incident occurred at time T1.
  • the identifier may be transmitted to the cloud-based computing system 116, along with information pertaining to the location (e.g., GPS coordinates, etc.).
  • the cloud-based computing system 116 may use the identifier to obtain any additional data about the people (e.g., license plate numbers of vehicles registered to the people, criminal records of the people, mugshots of the people, medical records of the people, etc.).
  • the AI engine 170 may input the identifiers of the people, the information pertaining to the location where the electronic devices 140 are located at time T2, and/or the additional information pertaining to the people into the machine learning model 172.
  • the trained machine learning model may receive the input and output a probability of occurrence of a subsequent incident 703.
  • the probability of occurrence of a subsequent incident 703 may be a value, a percentage, a number, or the like.
  • the probability of occurrence may be used by the AI engine 170 to determine whether or not to perform a preventative action (e.g., the probability of occurrence has to satisfy a threshold) and which preventative action to perform.
  • One or more machine learning models 172 may be trained to input the probability of occurrence of the subsequent incident 703, the information pertaining to the location, the identifiers associated with the people, and/or the additional information pertaining to the people, and to output a preventative action that is likely to squash, thwart, mitigate, and/or prevent the subsequent incident from occurring.
  • the higher the probability of occurrence of the subsequent incident 703, the more severe of a preventative action may be selected and performed.
  • the machine learning model 172 may determine the preventative action to perform and perform the preventative action 706.
  • the preventative action 706 may include causing an alarm system 704 to activate in the electronic device detection zone 132-2 to attempt to scare the people 142 out of the zone.
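  • A hedged sketch of this threshold-gated dispatch is given below; the 0.7 threshold and the action name are assumptions, since the disclosure does not fix particular values.

```python
PROBABILITY_THRESHOLD = 0.7   # assumed value; not specified in the disclosure

def maybe_perform_preventative_action(probability, perform_action):
    """Act only when the model's output probability satisfies the threshold."""
    if probability < PROBABILITY_THRESHOLD:
        return None
    action = "activate_alarm_system"   # e.g., alarm system 704 in FIGURE 7
    perform_action(action)
    return action

# Illustrative usage: 0.85 satisfies the threshold, 0.40 does not.
print(maybe_perform_preventative_action(0.85, perform_action=print))
print(maybe_perform_preventative_action(0.40, perform_action=print))
```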
  • FIGURE 8 illustrates an example method 800 for performing, based on a probability of occurrence of a subsequent incident, a preventative action, according to certain embodiments of this disclosure.
  • the method 800 may be performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), firmware, software, or a combination thereof.
  • the method 800 and/or each of their individual functions, subroutines, or operations may be performed by one or more processors of one or more of the devices in FIGURE 1 (e.g., computing device 102, cloud-based computing system 116 including servers 118, cameras 120, electronic device identification sensors 130) implementing the method 800.
  • a computing system may refer to the computing device 102 or the cloud-based computing system 116.
  • the method 800 may be implemented as computer instructions that, when executed by a processing device, execute the operations.
  • the method 800 may be performed by a single processing thread.
  • the method 800 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the method 800.
  • an identifier associated with a person may be received at a processor.
  • the identifier may be received from a location where the person was present at a first time.
  • one or more location services applications executing on one or more electronic devices 140 may transmit the location(s) (e.g., using GPS) of the one or more electronic devices 140 of one or more people.
  • the identifier may be an electronic device ID 133 (e.g., media access control (MAC) address (e.g., WiFi and/or Bluetooth)), an image of the person (e.g., obtained via camera 120), a license plate number of a vehicle registered to the person, or some combination thereof.
  • electronic device identification sensors 130 may be configured to detect one or more electronic device IDs 133 (e.g., WiFi MAC addresses, Bluetooth MAC addresses, and/or cellular MAC addresses) of electronic device 140 within the electronic device detection zone 132.
  • one or more cameras 120 may be used in one or more license plate detection zones to capture images of license plates of vehicles of people and the images may be processed to determine the license plate numbers of the vehicles registered to the people.
  • One or more images of people may be obtained in one or more facial detection zones 150 and the one or more images may be processed (e.g., facial recognition techniques) to determine identities of the people.
  • the identifier may be associated with the person prior to or concurrently with block 802 occurring.
  • the identifier of the person may be a MAC address of the electronic device 140 of the person 142 and the MAC address may have been associated with the person when the person purchased the electronic device 140.
  • the identifier associated with the person, the license plate number of the vehicle registered to the person, and/or an image of the person may be correlated together when such information is received at the cloud-based computing system 116.
  • information pertaining to an incident that occurred at the location where the person was present at the first time may be received.
  • the information may be received via any suitable source, such as an API associated with a law enforcement agency, fire department, healthcare facility, etc.
  • the source may be a news broadcasting channel, website, and/or application; a media channel, website, and/or application; a social media website and/or application; or the like.
  • the incident may include rioting, arson, looting, robbery, violence, assault, battery, drug trafficking, human trafficking, kidnapping, any type of criminal activity, or the like.
  • the incident may include any suitable incident that may be defined and programmed to be tracked by the cloud-based computing system 116.
  • the incident may be non-criminal activity, such as peaceful protests, concert events, sporting events, movie theater events, gatherings, a presence of the person in a particular zone, etc.
  • the disclosed techniques may be used to track the location of the person using which zone the person has entered (e.g., license plate detection zone 122, electronic device detection zone 132, facial detection zone 150, manual input zone 160, etc.).
  • the incident may include the person being present at the location, which may trigger a preventative action to occur. For example, if a person is on a ban list and not allowed on a certain property, their detected presence at the location of the certain property may cause the person to be kicked off the property by security.
  • a preventative action may be performed (e.g., notify the electronic device of the second person to honor the restraining order and move farther away from the first person, notify the electronic device of the person that the second person is within the prohibited distance defined by the restraining order, notify law enforcement agency, etc.).
  • the identifier of the person may be received once from a particular location, and the detection of the person at the particular location may cause a preventative action to be performed.
  • any of the preventative actions described herein may be performed when the identifier of the person is detected at a certain location.
  • the identifier associated with the person may be received at a second time subsequent to the first time.
  • the identifier associated with the person may be received at the second time from the location (e.g., same or similar location the identifier was received at the first time) or may be received at the second time from another location different than the location the identifier was received at the first time.
  • the identifier may be received at the first time (e.g., 9:00 PM on January 1st) at a first location (e.g., movie theater), where a riot erupted (incident occurred) and the identifier may be received at a second time (e.g., 9:00 PM on January 2nd) at a second location (e.g., grocery store).
  • some embodiments may predict a probability of occurrence of a subsequent riot erupting (subsequent event occurring) at the second location based on the identifier of the person and information pertaining to the incident that occurred at the first location at the first time.
  • the probability of occurrence of a subsequent incident may be determined by the processor via a trained machine learning model using the identifier associated with the person and the information pertaining to the incident that occurred at the location where the person was present at the first time.
  • the subsequent incident may include a criminal offense, a civil offense, a triggered event (e.g., the presence of a person at a particular location where the person is banned from being, is restrained from being, or both; the presence of a wanted person at any detected location; etc.).
  • the triggered event may be a predefined triggered event.
  • the cloud-based computing system 116 may provide a user interface that enables defining, programming, scripting, specifying, modifying, deleting, uploading, etc. of such triggered events.
  • the machine learning model may be trained to determine the probability of occurrence of the subsequent incident based on a set of training data including at least one of (i) other identifiers of other people that were present at the location where the incident occurred at the first time, (ii) one or more criminal records of the person, the other people, or some combination thereof, and/or (iii) a pattern recognized using the identifier, the information, the other identifiers of other people that were present at the location where the incident occurred at the first time, the one or more criminal records, or some combination thereof.
  • a preventative action may be performed based on the probability of occurrence of the subsequent incident.
  • Example preventative actions are described in more detail below with reference to FIGURE 10.
  • the preventative action that is performed may be selected based on a severity of the subsequent incident, the probability of occurrence of the subsequent incident, or both. For example, if the subsequent incident, such as a riot, is determined to have a severity above a certain threshold, the preventative action may be more drastic than if the severity is less than the certain threshold.
  • the preventative action for the riot may include notifying emergency services, such as a law enforcement agency, of the probability of occurrence of the subsequent incident at a particular location (e.g., geographical coordinates), which may result in the emergency services dispatching personnel to the particular location at the second time.
  • the preventative action may include causing an electronic device to activate, such as by turning on a live feed of a camera at a location where the subsequent incident is likely to occur. If the subsequent incident is determined to have a severity below a certain threshold, such as a peaceful protest, a less drastic preventative action may occur such as causing an alarm system of a department store to arm just in case the peaceful protest turns into looting.
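  • A minimal sketch of severity- and probability-based selection is shown below; the tiers, thresholds, and action names are illustrative assumptions rather than disclosed values.

```python
def select_preventative_action(probability, severity):
    """Escalate the response as predicted severity and probability increase."""
    if severity >= 3 and probability >= 0.8:
        return "notify_emergency_services"   # e.g., dispatch law enforcement
    if severity >= 2 and probability >= 0.6:
        return "activate_camera_live_feed"
    return "arm_alarm_system"                # lower-severity fallback

print(select_preventative_action(0.9, 3))    # notify_emergency_services
print(select_preventative_action(0.65, 2))   # activate_camera_live_feed
print(select_preventative_action(0.3, 1))    # arm_alarm_system
```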
  • FIGURE 9 illustrates another example method 900 for performing, based on a probability of occurrence of a subsequent incident, a preventative action, according to certain embodiments of this disclosure.
  • the method 900 may be performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), firmware, software, or a combination thereof.
  • the method 900 and/or each of their individual functions, subroutines, or operations may be performed by one or more processors of one or more of the devices in FIGURE 1 (e.g., computing device 102, cloud-based computing system 116 including servers 118, cameras 120, electronic device identification sensors 130) implementing the method 900.
  • a computing system may refer to the computing device 102 or the cloud-based computing system 116.
  • the method 900 may be implemented as computer instructions that, when executed by a processing device, execute the operations. In certain implementations, the method 900 may be performed by a single processing thread. Alternatively, the method 900 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the method 900. The method 900 may include blocks of operations that are performed prior to, concurrently with, or subsequently to the blocks of operations of the method 800 of FIGURE 8.
  • additional information pertaining to the person may be received from a third-party source at a third time subsequent to the first and second times.
  • the third-party source may include an API of a law enforcement agency, an electronic medical record system, a public data source, a website, a distribution system, an email system, or any suitable source.
  • the additional information may include a criminal record of the person, a mugshot of the person, a fingerprint of the person, an image of the person, an electronic medical record of the person, an address of the person, an age of the person, a name of the person, an email address of the person, a phone number of the person, an indication of the person being on a watch list, or some combination thereof.
  • the additional information may be correlated with at least the identifier of the person.
  • the criminal record of the person may be correlated with the electronic device ID associated with the person, the license plate number of the vehicle associated with the person, an image of the person, or the like. Accordingly, when the identifier is received at subsequent times for the person, the additional information may be retrieved and used to make predictions (probabilities of occurrences of subsequent incidents) and/or decisions that may cause performance of preventative actions.
  • the identifier associated with the person may be received at a fourth time.
  • the fourth time may be subsequent to the first, second, and/or third time.
  • the identifier may be received from the electronic device 140 of the person 142 via the network 112.
  • the probability of occurrence of a subsequent incident may be determined via the trained machine learning model using the identifier, the information pertaining to the incident that occurred at the location where the person was present at the first time, and/or the additional information.
  • a preventative action may be performed based on the probability of occurrence of the subsequent incident.
  • Various examples of preventative actions are discussed with reference to FIGURE 10.
  • FIGURE 10 illustrates an example method 1000 of performing one or more various preventative actions, according to certain embodiments of this disclosure.
  • the method 1000 may be performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), firmware, software, or a combination thereof.
  • the method 1000 and/or each of their individual functions, subroutines, or operations may be performed by one or more processors of one or more of the devices in FIGURE 1 (e.g., computing device 102, cloud-based computing system 116 including servers 118, cameras 120, electronic device identification sensors 130) implementing the method 1000.
  • a computing system may refer to the computing device 102 or the cloud-based computing system 116.
  • the method 1000 may be implemented as computer instructions that, when executed by a processing device, execute the operations.
  • the method 1000 may be performed by a single processing thread. Alternatively, the method 1000 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the method 1000.
  • It should be noted that one or more of the preventative actions 1002, 1004, 1006, 1008, 1010, 1012, and/or 1014 may be performed in any combination, simultaneously or in a sequence in a time series. Further, the preventative actions are for explanatory purposes and additional preventative actions may be performed.
  • At block 1002, a first preventative action may include transmitting a notification to a computing device of an emergency responder.
  • the notification may be presented on a user interface of the computing device of the emergency responder and may include graphical elements to take certain actions, such as initiate a phone call with another person’s computing device via a cellular carrier’s radio tower, send a text message to a computing device of another person, or the like.
  • the computing device of the emergency responder may include a network interface device configured to connect to the network 112.
  • the network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi.
  • the cloud-based computing system 116 may communicate with the computing device of the emergency responder via the network 112.
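  • A minimal sketch of how such a notification might be transmitted over the network is shown below; the endpoint URL, payload fields, and JSON-over-HTTP format are purely hypothetical and not part of the disclosure.

```python
import json
import urllib.request

def notify_emergency_responder(endpoint_url, payload):
    """POST a JSON notification to a responder computing device endpoint."""
    request = urllib.request.Request(
        endpoint_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Hypothetical usage (URL and fields are invented for illustration):
# notify_emergency_responder(
#     "https://responder.example/api/notifications",
#     {"license_plate_id": "ABC123",
#      "location": "latitude 47.6 N, longitude 122.33 W",
#      "probability": 0.85},
# )
```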
  • a second preventative action may include causing an alarm system to activate.
  • Activating the alarm system may include transmitting a control signal from the cloud-based computing system 116 to the alarm system to cause the alarm to arm itself, to disarm itself, to activate by emitting audio and/or light, to transmit signals to other systems (e.g., emergency responder systems of law enforcement, fire departments, healthcare facilities, etc.), or some combination thereof.
  • the alarm system may include a network interface device configured to connect to the network 112.
  • the network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi.
  • the cloud-based computing system 116 may communicate with the alarm system via the network 112.
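  • A sketch of sending such a control signal is given below, assuming a simple JSON-over-TCP command format; the host, port, and command fields are hypothetical.

```python
import json
import socket

def send_control_signal(host, port, command):
    """Send a small JSON control command (e.g., {"action": "arm"}) to a
    networked alarm system; the wire format here is an assumption."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(json.dumps(command).encode("utf-8") + b"\n")

# Hypothetical usage:
# send_control_signal("alarm.local", 9000, {"action": "arm"})
# send_control_signal("alarm.local", 9000, {"action": "sound_siren", "duration_s": 30})
```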
  • a third preventative action may include causing an electronic device to activate.
  • the electronic device may be any suitable electronic device, such as an appliance, an electronic lock (e.g., lock or unlock a door and/or window and/or hatch and/or latch), a smart thermostat, a garage door, an actuating arm, a door, a window, a shutter, a gate, and the like.
  • Activating may also refer to actuating, such as extending and/or retracting, opening and/or closing, locking and/or unlocking, turning on and/or turning off, etc.
  • the electronic device may include a network interface device configured to connect to the network 112.
  • the network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi.
  • the cloud-based computing system 116 may communicate with the electronic device via the network 112.
  • a fourth preventative action may include causing a light to activate.
  • the light may turn on or off, may change colors that are emitted, may change a brightness, may emit a strobe representing a pattern that encodes various messages, or the like.
  • the light may be a smart light that includes a network interface device configured to connect to the network 112.
  • the network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi.
  • the cloud-based computing system 116 may communicate with the light via the network 112.
  • a fifth preventative action may include causing an event to be triggered.
  • the event may include responding to a situation where a person is detected (e.g., via their identifier being received) in a zone where they are banned or restrained from being. Further, the event may include responding to a situation where a person wanted by a law enforcement agency is detected in a zone.
  • the event may include transmitting notifications to certain computing devices according to a certain response protocol based on the type of event, or performing any combination of the disclosed preventative actions herein.
  • a sixth preventative action may include causing a speaker to emit a recorded message or audio. Instructions may be transmitted to the speaker and the instructions may include the message and/or audio to be emitted. For example, the message may state “Please step away from the area.”
  • the speaker may include a network interface device configured to connect to the network 112. The network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi.
  • the cloud-based computing system 116 may communicate with the speaker via the network 112.
  • a seventh preventative action may include causing an irrigation system to activate.
  • Activating the irrigation system may include transmitting one or more control signals to the irrigation system to turn on and/or turn off one or more zones of the irrigation system. For example, if an incident is a fire or arson at a location, the irrigation system of that location may be activated to attempt to put out the fire with the water from the irrigation system.
  • the irrigation system may include a network interface device configured to connect to the network 112.
  • the network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi.
  • the cloud-based computing system 116 may communicate with the irrigation system via the network 112.
  • FIGURE 11 illustrates example computer system 1100 which can perform any one or more of the methods described herein, in accordance with one or more aspects of the present disclosure.
  • computer system 1100 may correspond to the computing device 102, server 118 of the cloud-based computing system 116, artificial intelligence engine 170 of the cloud-based computing system 116, training engine 174 of the cloud-based computing system 116, the cameras 120, and/or the electronic device identification sensors 130 of FIGURE 1.
  • the computer system 1100 may be capable of executing the client application 104 of FIGURE 1.
  • the computer system may be connected (e.g., networked) to other computer systems in a LAN, an intranet, an extranet, or the Internet.
  • the computer system may operate in the capacity of a server in a client-server network environment.
  • the computer system may be a personal computer (PC), a tablet computer, a wearable (e.g., wristband), a set-top box (STB), a personal Digital Assistant (PDA), a mobile phone, a camera, a video camera, an electronic device identification sensor, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device.
  • the computer system 1100 includes a processing device 1102, a main memory 1104 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 1106 (e.g., solid state drive (SSD), flash memory, static random access memory (SRAM)), and a data storage device 1108, which communicate with each other via a bus 1110.
  • Processing device 1102 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 1102 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets.
  • the processing device 1102 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like.
  • the processing device 1102 is configured to execute instructions for performing any of the operations and steps discussed herein.
  • the computer system 1100 may further include a network interface device 1112.
  • the computer system 1100 also may include a video display 1114 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), one or more input devices 1116 (e.g., a keyboard and/or a mouse), and one or more speakers 1118 (e.g., a speaker).
  • the video display 1114 and the input device(s) 1116 may be combined into a single component or device (e.g., an LCD touch screen).
  • the data storage device 1108 may include a computer-readable medium 1120 on which the instructions 1122 (e.g., implementing the control system, user portal, clinical portal, and/or any functions performed by any device and/or component depicted in the FIGURES and described herein) embodying any one or more of the methodologies or functions described herein are stored.
  • the instructions 1122 may also reside, completely or at least partially, within the main memory 1104 and/or within the processing device 1102 during execution thereof by the computer system 1100. As such, the main memory 1104 and the processing device 1102 also constitute computer-readable media.
  • the instructions 1122 may further be transmitted or received over a network via the network interface device 1112.
  • although the computer-readable storage medium 1120 is shown in the illustrative examples to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
  • the term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
  • a method for using artificial intelligence to determine a probability of occurrence of a subsequent incident comprising:
  • a system comprising:
  • a processing device communicatively coupled to the memory, the processing device executes the instructions to:
  • [0186] receive information pertaining to an incident that occurred at the location where the person was present at the first time;
  • [0187] receive, at a second time subsequent to the first time, the identifier associated with the person;
  • [0196] receive, from a third-party source at a third time subsequent to the first and second times, additional information pertaining to the person, wherein the additional information comprises a criminal record of the person, a mugshot of the person, a fingerprint of the person, an electronic medical record of the person, an address of the person, an age of the person, a name of the person, or some combination thereof; and
  • [0200] determine, via the trained machine learning model using the identifier and the additional information, the probability of occurrence of the subsequent incident.
  • [0201] 14. The system of clause 10, wherein the identifier associated with the person is received at the second time from the location or at the second time at another location different than the location.
  • a tangible, non-transitory machine-readable medium storing instructions that, when executed, cause a processing device to:
  • [0208] receive an identifier associated with a person, wherein the identifier is received from a location where the person was present at a first time;
  • [0209] receive information pertaining to an incident that occurred at the location where the person was present at the first time;
  • [0210] receive, at a second time subsequent to the first time, the identifier associated with the person;
  • [0212] perform, based on the probability of occurrence of the subsequent incident, a preventative action.
  • [0219] receive, from a third-party source at a third time subsequent to the first and second times, additional information pertaining to the person, wherein the additional information comprises a criminal record of the person, a mugshot of the person, a fingerprint of the person, an electronic medical record of the person, an address of the person, an age of the person, a name of the person, or some combination thereof; and
  • [0222] receive, at a fourth time, the identifier associated with the person; and
  • [0223] determine, via the trained machine learning model using the identifier and the additional information, the probability of occurrence of the subsequent incident.

Abstract

A method is disclosed for using artificial intelligence to determine a probability of occurrence of a subsequent incident. The method includes receiving, at a processor, an identifier associated with a person, wherein the identifier is received from a location where the person was present at a first time. The method also includes receiving information pertaining to an incident that occurred at the location where the person was present at the first time. The method also includes receiving, at a second time subsequent to the first time, the identifier associated with the person. The method also includes determining, by the processor via a trained machine learning model using the identifier and the information, the probability of occurrence of the subsequent incident. The method also includes performing, based on the probability of occurrence of the subsequent incident, a preventative action.

Description

SYSTEM AND METHOD FOR USING ARTIFICIAL INTELLIGENCE TO DETERMINE A PROBABILITY OF OCCURRENCE OF A SUBSEQUENT INCIDENT
CROSS-REFERENCES TO RELATED APPLICATIONS [0001] This application claims priority to both U.S. Application No. 17/039,505 filed September 30, 2020, and titled “System and Method for Using Artificial Intelligence to Determine a Probability of Occurrence of a Subsequent Incident and Performing a Preventative Action” and U.S. Application No. 16/910,949 filed June 24, 2020 titled “System and Method for Correlating Electronic Device Identifiers and Vehicle Information”. Both applications are hereby incorporated by reference as if reproduced in full below.
TECHNICAL FIELD
[0002] This disclosure relates generally to determining probabilities of occurrence of subsequent incidents. More specifically, this disclosure relates to a system and method for using artificial intelligence to determine a probability of occurrence of a subsequent incident and performing a preventative action.
BACKGROUND
[0003] Many public and private areas, including airports, business parks, companies, border checkpoints, neighborhoods, etc. employ measures to enhance the safety of the people and property on the area premises. For example, some neighborhoods are gated and visitors to the communities may be forced to check-in with a guard at a security gate prior to being allowed into the neighborhood. Some neighborhoods employ a crime watch group that includes a group of concerned citizens who work together with law enforcement to help keep their neighborhood safe. Such a program may rely on volunteers to patrol the neighborhood to help law enforcement discover and/or thwart suspicious and/or criminal activity. However, these and other conventional measures lack the ability to correlate certain information that provides for enhanced identification, tracking, and notification of and/or to suspicious vehicles / individuals. In addition, various incidents (e.g., riots, protests, looting, robbery, arson, etc.) may occur at certain locations and there may be no information that is usable to correlate a person to the incident at the location. The people that participated in the incident may perform a subsequent incident at a later date, and it may be desirable to prevent such a subsequent incident from occurring.
SUMMARY
[0004] In general, the present disclosure provides a system and method for correlating wireless network information.
[0005] In one aspect, a system for monitoring vehicle traffic may include at least one camera positioned to capture a set of images within a license plate detection zone, at least some of the captured images representing license plates of a set of vehicles appearing within the camera’s field of view. The system may also include at least one electronic device identification sensor configured to detect and store a set of electronic device identifiers of electronic devices located within one or more electronic device detection zones. The system may also include one or more non-transitory computer-readable storage media having stored thereon computer-executable instructions that, when executed by one or more processors, cause a computing system to: detect, using the set of images, a license plate ID of a vehicle; compare the license plate ID of the vehicle to a database of trusted vehicle license plate IDs; identify the vehicle as a suspicious vehicle, the identification based at least in part on the comparison of the license plate ID of the vehicle to the database of trusted vehicle license plate IDs; and correlate the license plate ID of the vehicle with at least one of the set of stored electronic device identifiers.
[0006] In another aspect, a method for using artificial intelligence to determine a probability of occurrence of a subsequent incident is disclosed. The method may include receiving, at a processor, an identifier associated with a person, where the identifier is received from a location where the person was present at a first time. The method may also include receiving information pertaining to an incident that occurred at the location where the person was present at the first time. The method may also include receiving, at a second time subsequent to the first time, the identifier associated with the person. The method may also include determining, by the processor via a trained machine learning model using the identifier and the information, the probability of occurrence of the subsequent incident. The method may also include performing, based on the probability of occurrence of the subsequent incident, a preventative action.
[0007] In some embodiments, a tangible, non-transitory computer-readable medium stores instructions that, when executed, cause a processing device to perform any of the operations, steps, and/or functions of any of the methods disclosed herein.
[0008] In some embodiments, a system includes a memory device storing instructions, and a processing device communicatively coupled to the memory device, where the processing device executes the instructions to perform any of the operations, steps, and/or functions of any of the methods disclosed herein.
[0009] Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
[0010] Before undertaking the DETAILED DESCRIPTION below, it may be advantageous to set forth definitions of certain words and phrases used throughout this patent document. The term “couple” and its derivatives refer to any direct or indirect communication between two or more elements, whether or not those elements are in physical contact with one another. The terms “transmit,” “receive,” and “communicate,” as well as derivatives thereof, encompass both direct and indirect communication. The terms “include” and “comprise,” as well as derivatives thereof, mean inclusion without limitation. The term “or” is inclusive, meaning and/or. The phrase “associated with,” as well as derivatives thereof, means to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, have a relationship to or with, or the like. The term “controller” means any device, system or part thereof that controls at least one operation. Such a controller may be implemented in hardware or a combination of hardware and software and/or firmware. The functionality associated with any particular controller may be centralized or distributed, whether locally or remotely. The phrase “at least one of,” when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one item in the list may be needed. For example, “at least one of: A, B, and C” includes any of the following combinations: A, B, C, A and B, A and C, B and C, and A and B and C.
[0011] Moreover, various functions described below can be implemented or supported by one or more computer programs, each of which is formed from computer readable program code and embodied in a computer readable medium. The terms “application” and “program” refer to one or more computer programs, software components, sets of instructions, procedures, functions, objects, classes, instances, related data, or a portion thereof adapted for implementation in a suitable computer readable program code. The phrase “computer readable program code” includes any type of computer code, including source code, object code, and executable code. The phrase “computer readable medium” includes any type of medium capable of being accessed by a computer, such as read only memory (ROM), random access memory (RAM), a hard disk drive, a compact disc (CD), a digital video disc (DVD), solid state drives (SSDs), flash, or any other type of memory. A “non-transitory” computer readable medium excludes wired, wireless, optical, or other communication links that transport transitory electrical or other signals. A non-transitory computer readable medium includes media where data can be permanently stored and media where data can be stored and later overwritten, such as a rewritable optical disc or an erasable memory device.
[0012] Definitions for other certain words and phrases are provided throughout this patent document. Those of ordinary skill in the art should understand that in many if not most instances, such definitions apply to prior as well as future uses of such defined words and phrases.
[0013] It should be noted that the term “cellular media access control (MAC) address” may refer to a MAC, international mobile subscriber identity (IMSI), mobile station international subscriber directory number (MSISDN), enhanced network selection (ENS), or any other form of unique identifying number.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] For a more complete understanding of this disclosure and its advantages, reference is now made to the following description, taken in conjunction with the accompanying drawings, in which:
[0015] FIGURE 1 illustrates a high-level component diagram of an illustrative system architecture, according to certain embodiments of this disclosure;
[0016] FIGURE 2 illustrates details pertaining to various components of the illustrative system architecture of FIGURE 1, according to certain embodiments of this disclosure;
[0017] FIGURE 3 illustrates an example method for monitoring vehicle traffic, according to certain embodiments of this disclosure;
[0018] FIGURE 4 illustrates another example method for monitoring vehicle traffic, according to certain embodiments of this disclosure;

[0019] FIGURE 5 illustrates example user interfaces presented on computing devices during monitoring of vehicle traffic, according to certain embodiments of this disclosure;
[0020] FIGURE 6 illustrates another high-level component diagram of an illustrative system architecture including an artificial intelligence engine, according to certain embodiments of this disclosure;
[0021] FIGURE 7 illustrates an example scenario where a preventative action is performed based on a probability of occurrence of a subsequent incident, according to certain embodiments of this disclosure;
[0022] FIGURE 8 illustrates an example method for performing, based on a probability of occurrence of a subsequent incident, a preventative action, according to certain embodiments of this disclosure;
[0023] FIGURE 9 illustrates another example method for performing, based on a probability of occurrence of a subsequent incident, a preventative action, according to certain embodiments of this disclosure;
[0024] FIGURE 10 illustrates an example method of performing one or more preventative actions, according to certain embodiments of this disclosure; and
[0025] FIGURE 11 illustrates an example computer system according to certain embodiments of this disclosure.
DETAILED DESCRIPTION
[0026] Improvement is desired in the field of public safety for certain areas (e.g., neighborhood, airport, business park, border checkpoint, city, etc.). As discussed above, there are various measures that may be conventionally used, such as gated communities, neighborhood crime watch groups, and so forth. However, the conventional measures lack efficiency and accuracy in identifying suspicious vehicles / individuals and reporting of the suspicious vehicles / individuals, among other things. In some instances, the conventional measures may fail to report the suspicious vehicle / individual, altogether. The causes of the inefficient and/or failed reporting may be at least in part attributable to people (e.g., neighbors in a neighborhood) not having access to verified vehicle and/or personal information of an individual. Further, the conventional measures lack the ability to quickly, accurately, and automatically identify the vehicle as a suspicious vehicle, correlate vehicle information (e.g., license plate identifier (ID)), electronic device information (e.g., electronic device identifier (ID)), face information, etc., and/or perform a preventative action based on the identification.
[0027] Take the following example for illustrative purposes. A neighbor may witness an unknown vehicle drive through the neighborhood several times within a given time period during a day. The neighbor may not recognize the license plate ID or driver and may think about reporting the unknown vehicle to law enforcement. Instead, the neighbor may decide to proceed to do another activity. Subsequently, the driver of the unknown vehicle may burglarize a house in the neighborhood. Even if the neighbor attempted to look up the license plate ID, and was able to find out information about an owner of the vehicle, the neighbor may not be able to determine whether the driver of the vehicle is the actual owner, the neighbor may not be able to determine whether the owner or driver is on a crime watch list, and so forth. Further, the neighbor may not be privy to the electronic device identifier of the electronic device the suspicious individual is carrying or that is installed in the vehicle, which may be used to track the whereabouts of the individual / vehicle in a monitored area. Even if a neighbor obtains an electronic device identifier, there currently is no technique for determining personal information associated with the electronic device identifier. To reiterate, conventional techniques for public safety lack the ability to identify a suspicious vehicle / individual and/or to correlate vehicle information, facial information, and/or electronic device identifiers of electronic devices of the driver to make an informed decision quickly, accurately, and automatically.
[0028] Aspects of the present disclosure relate to embodiments that overcome the shortcomings described above. The present disclosure relates to a system and method for correlating electronic device identifiers with vehicle information. The system may include one or more license plate detection zones, one or more electronic device detection zones, and/or one or more facial detection zones. The zones may be partially or wholly overlapping and there may be multiple zones established that span a desired area (e.g., a neighborhood, a city block, a public / private parking lot, any street, etc.). The license plate detection zones, the electronic device detection zones, and/or the facial detection zones may include devices that are communicatively coupled to one or more computing systems via a network. The license plate detection zones may include one or more cameras configured to capture images of at least license plates on vehicles that enter the license plate detection zone. The electronic device detection zone may include one or more electronic device identification sensors, such as a WiFi signal detection device or a Bluetooth® signal detection device. The electronic device identification sensors may be configured to detect and store WiFi Media Access Control (MAC) addresses, Bluetooth MAC addresses, and/or cellular MAC addresses (e.g., International Mobile Subscriber Identity (IMSI), Mobile Station International Subscriber Directory Number (MSISDN), and Electronic Serial Numbers (ESN)) of electronic devices that enter the electronic device detection zone based on the signals emitted by the electronic devices. The facial detection zones may include one or more cameras configured to capture images or digital frames that are used to recognize a face. Any suitable MAC address may be detected, and to that end, a MAC address may be any combination of the IDs described herein (e.g., MAC, MSISDN, IMSI, ESN, etc.).

[0029] The computing system may analyze the images captured by the cameras and detect a license plate identifier (ID) of a vehicle. The license plate ID may be compared with trusted license plate IDs that are stored in a database. When there is not a trusted license plate ID that matches the license plate ID, the computing system may identify the vehicle as a suspicious vehicle. Then, the computing system may correlate the license plate ID of the vehicle with at least one of the stored electronic device identifiers. In some embodiments, the license plate ID and the at least one of the stored electronic device identifiers may be correlated with a face of the individual. In some embodiments, personal information, such as name, address, Bluetooth MAC address, WiFi MAC address, criminal record, whether the suspicious individual is on a crime watch list, etc. may be retrieved using the license plate ID or the at least one of the stored electronic device identifiers that is correlated with the license plate ID of the suspicious vehicle.
[0030] The system may include several computer applications that may be accessed by registered users of the system. For example, a client application may be accessed by a computing device of a user, such as a neighbor in a neighborhood implementing the system. The client application may present a user interface including an alert when a suspicious vehicle and/or individual is detected. The user interface may present several preventative actions for the user. For example, the user may contact the suspicious individual using the personal information (e.g., send a threatening text message), notify law enforcement, and so forth. Similarly, a client application may be accessed by a computing device of a law enforcer. The client application may present a user interface including the notification that a suspicious vehicle and/or individual is detected in the particular zones.
[0031] Take the following example of a setup of the system for illustration purposes. In a neighborhood that may only be accessed by two entrances, license plate detection zones and electronic device detection zones may be placed to cover both lanes at both entrances. In some instances, a facial detection zone may be placed at the entrances with the other zones. Each vehicle may be correlated with each electronic device that enters the neighborhood. Further, the recognized face may be correlated with the electronic device and the vehicle information. The houses inside the neighborhood may set up electronic device detection zones and/or a facial detection zone inside their property to detect electronic device IDs and/or faces and compare them with electronic device IDs and/or faces in a database that stores every correlation that has been made by the system to date (including the most recent correlations of electronic device IDs, faces, and/or vehicles entering the neighborhood). The homeowner may be notified via the client application on their computing device if an electronic device and/or face is detected on their property. Further, in some embodiments, the individual associated with the electronic device and/or face may be notified on the electronic device that the homeowner is aware of their presence. If a known criminal with a warrant is detected at either the zones at the entrance or at the zones at the homeowner's property, the appropriate law enforcement agency may be notified of their whereabouts.
[0032] The disclosed techniques provide numerous benefits over conventional systems. For example, the system provides efficient, accurate, and automatic identification of suspicious vehicles and/or individuals. Further, the system enables correlating vehicle license plate IDs with electronic device identifiers to enable enhanced detection and/or preventative actions, such as directly communicating with the electronic device of the suspicious individual and/or notifying law enforcement using the client application in real-time or near real-time when the suspicious vehicle enters one or more zones. For example, once the electronic device identifier is detected, a correlation may be obtained with a license plate ID to obtain personal information about the owner that enables contacting the owner directly and/or determining whether the owner is a criminal. The client application provides pertinent information pertaining to both the suspicious vehicle and/or individual in a single user interface without the user having to perform any searches of the license plate ID or electronic device identifier. As such, in some embodiments, the disclosed techniques reduce processing, memory, and/or network resources by reducing searches that the user may perform to find the information. Also, the disclosed techniques provide an enhanced user interface that presents the suspicious vehicle and/or individual information in a single location, which may improve a user's experience using the computing device.
[0033] FIGURES 1 through 6, discussed below, and the various embodiments used to describe the principles of this disclosure are by way of illustration only and should not be construed in any way to limit the scope of the disclosure.
[0034] FIGURE 1 illustrates a high-level component diagram of an illustrative system architecture 100 according to certain embodiments of this disclosure. In some embodiments, the system architecture 100 may include a computing device 102 communicatively coupled to a cloud-based computing system 116, one or more cameras 120, one or more electronic device identification sensors 130, and/or one or more electronic devices 140 of a suspicious individual. The cloud-based computing system 116 may include one or more servers 118. Each of the computing device 102, the servers 118, the cameras 120, the electronic device identification sensors 130, and the electronic device 140 may include one or more processing devices, memory devices, and network interface devices. In some embodiments, the electronic device 140 may be referred to as a computing device herein. The electronic device 140 may be a smartphone, a wearable device (e.g., smart watch), a laptop, or any suitable portable electronic device including one or more processing devices, memory devices, and network interface devices.
[0035] The network interface devices may enable communication via a wireless protocol for transmitting data over short distances, such as Bluetooth, ZigBee, etc. Additionally, the network interface devices may enable communicating data over long distances, and in one example, the computing device 102 may communicate with a network 112. Network 112 may be a public network (e.g., connected to the Internet via wired (Ethernet) or wireless (WiFi)), a private network (e.g., a local area network (LAN) or wide area network (WAN)), or a combination thereof.
[0036] The computing device 102 may be any suitable computing device, such as a laptop, tablet, smartphone, or computer. The computing device may be configured to execute a client application 104 that presents a user interface. The client application 104 may be implemented in computer instructions stored on one or more memory devices and executed by one or more processing devices of the computing device 102. The client application 104 may be a standalone application installed on the computing device 102 or may be an application that is executed by another application (e.g., a website in a web browser).
[0037] The computing device 102 may include a display that is capable of presenting the user interface of the client application 104. The user interface may present various screens to a user depending on what type of user is logged into the client application 104. For example, a user, such as a neighbor or person interested in a particular license plate detection zone 122 and/or electronic device detection zone 132, may be presented with a user interface for logging into the system where the user enters credentials (username and password), a user interface that displays alerts of suspicious vehicles and/or individuals in the zones 122 and/or 132 where the user interface includes options for preventative actions, a user interface that presents logged events over time, and so forth. For example, the client application 104 may enable the user to directly contact (e.g., send text message, send email, call) the electronic device 140 of a suspicious individual 142 using personal information obtained about the individual 142. Another user, such as a law enforcer, may be presented with a user interface for logging into the system where the user enters credentials (username and password), a user interface that displays notifications when the user selects to notify law enforcement where the notifications may include information related to the suspicious vehicle and/or individual 142.

[0038] In some embodiments, the cameras 120 may be located in the license plate detection zones 122. Although just one camera 120 and one license plate detection zone 122 are depicted, it should be noted that any suitable number of cameras 120 may be located in any suitable number of license plate detection zones 122. For example, multiple license plate detection zones 122 may be used to cover a desired area. A license plate detection zone 122 may refer to an area of coverage that is within the cameras' 120 field of view. The cameras 120 may be any suitable camera and/or video camera capable of capturing a set of images 123 that at least represent license plates of a vehicle 126 that enters the license plate detection zone 122. The set of images 123 may be transmitted by the camera 120 to the cloud-based computing system 116 and/or the computing device 102 via the network 112.
[0039] In some embodiments, the electronic device identification sensors 130 may be located in the electronic device detection zones 132. In some embodiments, the license plate detection zone 122 and the electronic device detection zone 132-1 may partially or wholly overlap. The combination of license plate detection zones 122 and the electronic device detection zones 132 may be set up at entrances / exits to certain areas, and/or any other suitable area in a monitored area, to correlate each vehicle's information with respective electronic device identifiers 133 of electronic devices 140 being carried in respective vehicles 126. Each of the license plate detection zones 122 and electronic device detection zones 132 may have unique geographic identifiers so the data can be tracked by location. It should be noted that any suitable number of electronic device identification sensors 130 may be located in any suitable number of electronic device detection zones 132. For example, multiple electronic device detection zones 132 may be used to cover a desired area. An electronic device detection zone 132 may refer to an area of coverage that is within the electronic device identification sensor 130 detection area.

[0040] In one example, an electronic device detection zone 132-2 and/or a facial detection zone 150 may be set up at a home of a homeowner, such that an electronic device 140 and/or a face of a suspicious individual 142 may be detected and stored when the suspicious individual 142 enters the zone 132-2. The electronic device ID 133 and/or an image of the face may be transmitted to the cloud-based computing system 116 or the computing device 102 via the network 112. In some instances, the suspicious individual 142 may be contacted on their electronic device 140 with a message indicating the homeowner is aware of their presence and to leave the premises. In some instances, if a known criminal individual 142 with a warrant is detected at the combination of zones 122 and 132-1 at an entrance or at the zone 132-2 and 150 at the home, then the proper law enforcement agency may be contacted with the whereabouts of the individual 142.
[0041] In some embodiments, the cameras 120 may be located in the facial detection zones 150. Although just one camera 120 and one facial detection zone 150 are depicted, it should be noted that any suitable number of cameras 120 may be located in any suitable number of facial detection zones 150. For example, multiple facial detection zones 150 may be used to cover a desired area. A facial detection zone 150 may refer to an area of coverage that is within the cameras' 120 field of view. The cameras 120 may be any suitable camera and/or video camera capable of capturing a set of images 123 that at least represent faces of an individual 142 that enters the facial detection zone 150. The set of images 123 may be transmitted by the camera 120 to the cloud-based computing system 116 and/or the computing device 102 via the network 112. In some embodiments, the cloud-based computing system 116 and/or the computing device 102 may perform facial recognition by comparing a face detected in the image to a database of faces to find a match and/or perform biometric artificial intelligence that may uniquely identify an individual 142 by analyzing patterns based on the individual's facial textures and shape.

[0042] The electronic device identification sensors 130 may be configured to detect a set of electronic device IDs 133 (e.g., WiFi MAC addresses, Bluetooth MAC addresses, and/or cellular MAC addresses) of electronic devices 140 within the electronic device detection zone 132. As depicted, the electronic device 140 of a suspicious individual is within the vehicle 126 passing through the electronic device detection zone 132. That is, the electronic device identification sensors 130 may be any suitable WiFi signal detection device capable of detecting WiFi MAC addresses and/or Bluetooth signal detection device capable of detecting Bluetooth MAC addresses of electronic devices 140 that enter the electronic device detection zone 132. The electronic device identification sensor 130 may store the set of electronic device IDs 133 locally in a memory. The electronic device identification sensor 130 may also transmit the set of electronic device IDs 133 to the cloud-based computing system 116 and/or the computing device 102 via the network 112 for storage.
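As one possible illustration of the face-matching step in the facial detection zone 150, the sketch below uses the open-source face_recognition package; this library, the file names, and the tolerance value are assumptions for the example, not the disclosed implementation.

```python
# Illustrative face-matching sketch, assuming the face_recognition package;
# file names and the 0.6 tolerance are hypothetical examples.
import face_recognition

def match_face(candidate_image_path, trusted_encodings, tolerance=0.6):
    """Return True if the face in the captured image matches any trusted face."""
    image = face_recognition.load_image_file(candidate_image_path)
    encodings = face_recognition.face_encodings(image)
    if not encodings:
        return False  # no face detected in the captured frame
    matches = face_recognition.compare_faces(trusted_encodings, encodings[0], tolerance)
    return any(matches)

# Example usage: build a trusted gallery from residents' photos (each photo is
# assumed to contain exactly one face).
trusted = [
    face_recognition.face_encodings(face_recognition.load_image_file(path))[0]
    for path in ["resident_1.jpg", "resident_2.jpg"]  # hypothetical file names
]
print(match_face("zone150_frame.jpg", trusted))
```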
[0043] As noted above, the cloud-based computing system 116 may include the one or more servers 118 that form a distributed computing architecture. Each of the servers 118 may be any suitable computing system and may include one or more processing devices, memory devices, data storage, and/or network interface devices. The servers 118 may be in communication with one another via any suitable communication protocol. The servers 118 may each include at least one trusted vehicle license plate IDs database 117 and at least one personal identification database 119. In some embodiments, the databases 117 and 119 may be stored on the computing device 102.
[0044] The trusted vehicle license plate IDs database 117 may be populated by a processing device adding license plate IDs of vehicles that commonly enter the license plate detection zone 122. In some embodiments, the trusted vehicle license plate IDs database 117 may be populated at least in part by manual entry of license plate IDs associated with vehicles trusted to be within the license plate detection zone 122. These license plate IDs may be associated with vehicles owned by neighbors in a neighborhood, or family members of the neighbors, friends of the neighbors, visitors of the neighbors, contractors hired by the neighbors, any suitable person that is trusted, etc.
[0045] The personal identification database 119 may be populated by a processing device adding personal identification information associated with electronic device IDs 133 of electronic devices carried by people that commonly enter the electronic device detection zone 132 (e.g., a list of trusted electronic device IDs). In some embodiments, the personal identification database 119 may be populated at least in part by manual entry of personal identification information associated with electronic device IDs 133 associated with electronic devices 140 trusted to be within the electronic device detection zone 132 (e.g., a list of trusted electronic device IDs). These electronic device IDs 133 may be associated with electronic devices 140 owned by neighbors in a neighborhood, or family members of the neighbors, friends of the neighbors, visitors of the neighbors, contractors hired by the neighbors, etc. Further, in some embodiments, the personal identification database 119 may be populated by entering a list of known suspect individuals from the police department, people entering or exiting border checkpoints, etc.
[0046] The personal identification information for untrusted electronic device IDs may also be entered into the personal identification database 119. The personal identification database 119 may also be populated by a processing device adding personal identification information associated with electronic device IDs 133 of electronic devices carried by people that commonly enter the facial detection zone 150 (e.g., face images of trusted individuals). The personal identification information may include names, addresses, faces, email addresses, phone numbers, electronic device identifiers associated with electronic devices owned by the people (e.g., Bluetooth MAC addresses, WiFi MAC addresses), correlated license plate IDs with the electronic device identifiers, etc. The correlations between the license plate IDs, the electronic device identifiers, and/or the faces may be performed by a processing device using the data obtained from the cameras 120 and the electronic device identification sensors 130. Some of this information may be obtained from public sources, phone books, the Internet, and/or companies that distribute electronic devices. In some embodiments, the personal identification information added to the personal identification database 119 may be associated with people selected based on their residing in or near a certain radius of a geographic region where the zones 122 and/or 132 are set up, based on whether they are on a crime watch list, or the like.
[0047] FIGURE 2 illustrates details pertaining to various components of the illustrative system architecture 100 of FIGURE 1, according to certain embodiments of this disclosure. For example, the camera 120 includes an image capturing component 200; the electronic device identification sensor 130 includes an electronic device ID detecting and storing component 202; the server 118 includes a license plate ID detecting component 204, a license plate ID comparing component 206, a suspicious vehicle identifying component 208, and a correlating component 210. In some embodiments, the components 204, 206, 208, and 210 may be included in the computing device 102 executing the client application 104. Each of the components 200, 202, 204, 206, 208, and 210 may be implemented in computer instructions stored on one or more memory devices of their respective device and executed by one or more processors of their respective device.
[0048] With regards to the image capturing component 200, the component 200 may be configured to capture a set of images 123 within a license plate detection zone 122. At least some of the captured images 123 may represent license plates of a set of vehicles 126 appearing within the field of view of the cameras 120. The image capturing component 200 may configure one or more camera properties (e.g., zoom, focus, etc.) to obtain a clear image of the license plates. The image capturing component 200 may implement various techniques to extract the license plate ID from the images 123, or the image capturing component 200 may transmit the set of images 123, without analyzing the images 123, to the server 118 via the network 112.
[0049] With regards to the electronic device ID detecting and storing component 202, the component 202 may be configured to detect and store a set of electronic device IDs 133 of electronic devices located within one or more electronic device detection zones 132. The electronic device ID detecting and storing component 202 may detect a WiFi signal, cellular signal, and/or a Bluetooth signal from the electronic device and be capable of obtaining the WiFi MAC address, cellular MAC address, and/or Bluetooth MAC address of the electronic device from the signal. The electronic device IDs 133 may be stored locally in memory on the electronic device identification sensor 130, and/or transmitted to the server 118 and/or the computing device 102 via the network 112.
[0050] With regards to the license plate ID detecting component 204, the component 204 may be configured to detect, using the set of images 123, a license plate ID of a vehicle 126. The license plate ID detecting component 204 may perform optical character recognition (OCR), or any suitable identifier / text extraction technique, on the set of images 123 to detect the license plate IDs.
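As a purely illustrative example of the license plate ID detecting component 204, the sketch below runs off-the-shelf OCR with pytesseract over a cropped plate image; in practice a dedicated ANPR/ALPR model would likely be used, and the file name here is hypothetical.

```python
# Illustrative OCR sketch for license plate ID detection, assuming pytesseract;
# the input file name is a placeholder.
from PIL import Image
import pytesseract

def detect_plate_id(image_path):
    """Run OCR on a cropped license plate image and normalize the result."""
    raw_text = pytesseract.image_to_string(Image.open(image_path))
    # Keep only characters that can legitimately appear on a plate.
    return "".join(ch for ch in raw_text.upper() if ch.isalnum())

print(detect_plate_id("plate_crop.jpg"))  # e.g., "ABC123"
```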
[0051] With regards to the license plate ID comparing component 206, the component 206 may be configured to compare the license plate ID of the vehicle to a trusted vehicle license plate ID database 117. The license plate ID comparing component 206 may compare the license plate ID with each trusted license plate ID in the trusted vehicle license plate ID database 117.
[0052] With regards to the suspicious vehicle identifying component 208, the component 208 may identify the vehicle 126 as a suspicious vehicle 126, the identification based at least in part on the comparison of the license plate ID of the vehicle 126 to the trusted vehicle license plate ID database 117. If there is not a trusted license plate ID that matches the license plate ID of the vehicle 126, then the suspicious vehicle identifying component 208 may identify the vehicle as a suspicious vehicle.
[0053] With regards to the correlating component 210, the component 210 may be configured to correlate the license plate ID of the vehicle 126 with at least one of the set of stored electronic device IDs 133. Correlating the license plate ID of the vehicle 126 with at least one of the set of stored electronic device IDs 133 may include comparing one or more time stamps of the set of captured images 123 with one or more time stamps of the set of stored electronic device IDs 133. Also, correlating the license plate ID of the vehicle 126 with at least one of the set of stored electronic device IDs 133 may include analyzing at least one of: (i) at least one strength of signal associated with at least one of the set of stored electronic device IDs 133, and (ii) at least one visually estimated distance of at least one vehicle 126 associated with at least one of the set of stored images 123.
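The sketch below illustrates one way the correlating component 210 might pair plate reads with device identifiers using time stamps and signal strength, as described above; the record fields, the 5-second window, and the RSSI cutoff are illustrative assumptions.

```python
# Illustrative correlation sketch: pair plate reads with device IDs seen at
# nearly the same time with a sufficiently strong signal. Field names, the
# time window, and the RSSI threshold are hypothetical.
def correlate(plate_reads, device_sightings, max_skew_seconds=5.0, min_rssi=-70):
    """Pair a license plate read with device identifiers detected nearby in time."""
    pairs = []
    for plate in plate_reads:
        for device in device_sightings:
            close_in_time = abs(plate["timestamp"] - device["timestamp"]) <= max_skew_seconds
            strong_signal = device.get("rssi", -100) >= min_rssi  # device likely inside the zone
            if close_in_time and strong_signal:
                pairs.append((plate["plate_id"], device["device_id"]))
    return pairs

plates = [{"plate_id": "ABC123", "timestamp": 100.0}]
devices = [{"device_id": "00:11:22:33:FF:EE", "timestamp": 102.5, "rssi": -55}]
print(correlate(plates, devices))  # [("ABC123", "00:11:22:33:FF:EE")]
```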
[0055] FIGURE 3 illustrates an example method 300 for monitoring vehicle traffic, according to certain embodiments of this disclosure. The method 300 may be performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), firmware, software, or a combination thereof. The method 300 and/or each of its individual functions, subroutines, or operations may be performed by one or more processors of one or more of the devices in FIGURE 1 (e.g., computing device 102, cloud-based computing system 116 including servers 118, cameras 120, electronic device identification sensors 130) implementing the method 300. For example, a computing system may refer to the computing device 102 or the cloud-based computing system 116. The method 300 may be implemented as computer instructions that, when executed by a processing device, execute the operations. In certain implementations, the method 300 may be performed by a single processing thread. Alternatively, the method 300 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the method 300.
[0055] At block 302, a set of images 123 may be captured, using at least one camera 120, within a license plate detection zone 122. At least some of the set of images 123 may represent license plates of a set of vehicles 126 appearing within the camera’s field of view. One or more camera properties (e.g., zoomed in, focused, etc.) may be configured to enable the at least one camera 120 to obtain clear images 123 of the license plates.
[0056] At block 304, a set of electronic device identifiers 133 of electronic devices 140 located within one or more electronic device detection zones 132 may be detected and stored using an electronic device identification sensor 130. In some embodiments, the electronic device identification sensor 130 may include at least one of a WiFi signal detection device, cellular signal detection device, or a Bluetooth signal detection device. In some embodiments, the set of electronic device identifiers 133 may include at least one of a Bluetooth MAC address, cellular MAC address, or a WiFi MAC address. In some embodiments, at least one of the set of stored electronic device identifiers 133 may be compared with a list of trusted device identifiers.
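The sketch below illustrates one way the detection at block 304 could be performed for WiFi, assuming the scapy library and a wireless interface already placed in monitor mode ("wlan0mon" is a hypothetical interface name); Bluetooth and cellular identifiers would require different radios and libraries.

```python
# Illustrative WiFi MAC detection sketch, assuming scapy and a monitor-mode
# interface named "wlan0mon"; both are assumptions for this example.
import time
from scapy.all import sniff, Dot11, Dot11ProbeReq

detected = {}

def handle_frame(frame):
    if frame.haslayer(Dot11ProbeReq):
        mac = frame[Dot11].addr2          # source MAC of the probing device
        if mac:
            detected[mac] = time.time()   # store the device ID with a timestamp

sniff(iface="wlan0mon", prn=handle_frame, store=False, timeout=60)
print(detected)
```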
[0057] At block 306, a license plate ID of a vehicle 126 may be detected using the set of images 123. The images 123 may be filtered, rendered, and/or processed in any suitable manner such that the license plate IDs may be clearly detected using the set of images 123. In some embodiments, optical character recognition (OCR) may be used to detect the license plate IDs in the set of images 123. The OCR may electronically convert each image in the set of images 123 of the license plate IDs into computer-encoded license plate IDs that may be stored and/or used for comparison.
[0058] In some embodiments, a face of the individual 142 may be detected by a camera 120 in the facial detection zone 150. An image 123 may be captured by the camera 120 and facial recognition may be performed on the image to detect the face of the individual. The detected face and/or the image 123 may be transmitted to the cloud-based computing system 116 and/or the computing device 102.
[0059] At block 308, the license plate ID of the vehicle 126 may be compared to a database of trusted vehicle license plate IDs. In some embodiments, the database 117 of trusted vehicle license plate IDs may be populated at least in part by adding license plate IDs of vehicles 126 that commonly enter the license plate detection zone 122 to the database 117 of trusted vehicle license plate IDs. In some embodiments, the database 117 of trusted vehicle license plate IDs may be populated at least in part by manual entry of license plate IDs associated with vehicles 126 trusted to be within the license plate detection zone 122. For example, the trusted vehicles may belong to the neighbors, family members of the neighbors, friends of the neighbors, law enforcement, and so forth.
[0060] At block 310, the vehicle may be identified as a suspicious vehicle 126. The identification may be based at least in part on the comparison of the license plate ID of the vehicle to the database 117 of trusted vehicle license plate IDs. For example, if the license plate ID is not matched with a trusted license plate ID stored in the database 117 of trusted vehicle license plate IDs, then the vehicle associated with the license plate ID may be identified as a suspicious vehicle 126.
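For illustration of blocks 308 and 310, the sketch below compares a detected plate ID against a trusted-plate table and flags the vehicle as suspicious when no match is found; the SQLite schema, table name, and sample data stand in for database 117 and are hypothetical.

```python
# Illustrative trusted-plate lookup, assuming a SQLite stand-in for database 117.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trusted_plates (plate_id TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO trusted_plates VALUES (?)", [("XYZ789",), ("LMN456",)])

def is_suspicious(plate_id):
    row = conn.execute(
        "SELECT 1 FROM trusted_plates WHERE plate_id = ?", (plate_id,)
    ).fetchone()
    return row is None  # no trusted match: treat the vehicle as suspicious

print(is_suspicious("ABC123"))  # True, since "ABC123" is not in the trusted list
```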
[0061] At block 312, the license plate ID of the vehicle 126 may be correlated with at least one of the set of stored electronic device identifiers 133. In some embodiments, the face of the individual 142 may also be correlated with the license plate ID and the at least one of the set of stored electronic device identifiers 133. In some embodiments, at least one personal identification database 119 may be accessed. In some embodiments, correlating the license plate ID of the vehicle 126 with at least one of the set of stored electronic device identifiers 133 may include comparing one or more time stamps of the set of captured images 123 with one or more time stamps of the set of stored electronic device identifiers 133. In some embodiments, correlating the license plate ID of the vehicle 126 with the at least one of the set of stored electronic device identifiers 133 may include analyzing at least one of (i) at least one strength of signal associated with at least one of the set of stored electronic device identifiers 133, and (ii) at least one visually estimated distance of at least one vehicle associated with at least one of the set of stored images 123.
[0062] Personal identification information of at least one suspicious individual may be retrieved from the at least one personal identification database 119 by correlating information of the personal identification database 119 with the license plate ID of the vehicle 126 or at least one of the set of electronic device identifiers 133 correlated with the license plate ID of the vehicle 126. The personal identification information may also be obtained using a face detected by the camera 120 to obtain the electronic device ID 133 and/or the license plate ID correlated with the face. The personal identification information may include one or more of a name, a phone number, an email address, a residential address, a Bluetooth MAC address, a cellular MAC address, a WiFi MAC address, whether the suspicious individual is on a crime watch list, a criminal record of the suspicious individual, and so forth.
[0063] In some embodiments, a user interface may be displayed on one or more computing devices 102 of one or more neighbors when the one or more computing devices are executing the client application 104, and the user interface may present a notification or alert. In some embodiments, the computing device 102 may present a push notification on the display screen and the user may provide user input (e.g., swipe the push notification) to expand the notification on the user interface to a larger portion of the display screen. The alert or notification may indicate that there is a suspicious vehicle 126 identified within the zones 122 and/or 132 and may provide information pertaining to the vehicle 126 (e.g., make, model, color, license plate ID, etc.) and personal identification information of the suspicious individual (e.g., name, phone number, email address, Bluetooth MAC address, cellular MAC address, WiFi MAC address, whether the individual is on a crime watch list, whether the individual has a criminal record, etc.).
[0064] Further, the user interface may present one or more options to perform preventative actions. The preventative actions may include contacting an electronic device 140 of the suspicious individual using the personal identification information. For example, a user may use a computing device 102 to transmit a communication (e.g., at least one text message, phone call, email, or some combination thereof) to the suspicious individual using the retrieved personal information.
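As one possible illustration of the "contact the suspicious individual" preventative action, the sketch below sends an SMS through the Twilio REST client; Twilio is only an example gateway, and the credentials, phone numbers, and message text are placeholders.

```python
# Illustrative SMS preventative-action sketch, assuming the Twilio REST client;
# any messaging service could be substituted, and all values are placeholders.
from twilio.rest import Client

def send_warning(account_sid, auth_token, from_number, to_number):
    client = Client(account_sid, auth_token)
    message = client.messages.create(
        body="Please leave the area immediately, or law enforcement will be contacted.",
        from_=from_number,
        to=to_number,
    )
    return message.sid  # identifier that can be stored in the notification log
```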
[0065] In addition, the preventative actions may also include notifying law enforcement of the suspicious vehicle and/or individual. This preventative action may be available if it is determined that the suspicious individual is on a crime watch list. A suspicious vehicle profile may be created. The suspicious vehicle profile may include the license plate ID of the suspicious vehicle and/or the at least one correlated electronic device identifiers (e.g., Bluetooth MAC address, WiFi MAC address). The user may select the notify law enforcement option on the user interface and the computing device 102 of the user may transmit the suspicious vehicle profile to another computing device 102 of a law enforcement entity that may be logged into the client application 104 using a law enforcement account.
[0066] In some embodiments, the preventative action may include activating an alarm upon detection of the suspicious vehicle 126. The alarm may be located in the neighborhood, for example, on a light pole, a tree, a pole, a sign, a mailbox, a fence, or the like. The alarm may be included in the computing device 102 of a user (e.g., a neighbor) using the client application. The alarm may include auditory (e.g., a message about the suspect, a sound, etc.), visual (e.g., flash certain colors of lights), and/or haptic (e.g., vibrations) elements. In some embodiments, the severity of the alarm may change the pattern of auditory, visual, and/or haptic elements based on what kind of crimes the suspicious individual has committed, whether the suspicious vehicle 126 is stolen, whether the suspicious vehicle 126 matches a description of a vehicle involved in an Amber alert, and so forth.
[0067] FIGURE 4 illustrates another example method 400 for monitoring vehicle traffic, according to certain embodiments of this disclosure. Method 400 includes operations performed by one or more processing devices of one or more devices in FIGURE 1 (e.g., computing device 102, cloud-based computing system 116 including servers 118, cameras 120, electronic device identification sensors 130) implementing the method 400. In some embodiments, one or more operations of the method 400 are implemented in computer instructions that, when executed by a processing device, execute the operations of the steps. The method 400 may be performed in the same or a similar manner as described above in regards to method 300.
[0068] The method 400 may begin with a setup phase where various steps 402, 404, 406, and/or 408 are performed to register data that may be used to determine whether a vehicle and/or individual is suspicious. For example, at block 402, law evidence may be registered. The law evidence may be obtained from a system of a law enforcement agency. For example, an application programming interface (API) of the law enforcement system may be exposed and API operations may be executed to obtain the law evidence. The law evidence may indicate whether a person is on a crime watch list 410, whether the person has a warrant, whether the person has a criminal record, and/or the WiFi / Bluetooth MAC data (address) / cellular data of electronic devices involved in incidents, as well as the owner information 412 of the electronic devices. The crime watch list 410 information may be used to store the crime watch list 414 in a database (e.g., personal identification database 119).

[0069] At block 404, license plate registration (LPR) data may be collected using the one or more cameras 120 in the license plate detection zones 122 as LPR raw data 416. The LPR raw data 416 may be used to obtain vehicle owner information (e.g., name, address, phone number, email address) and vehicle information (e.g., license plate ID, make, model, color, year, etc.). For example, the LPR raw data 416 may include at least the license plate ID, which may be used to search the Department of Motor Vehicles (DMV) to obtain the vehicle owner information and/or vehicle information. In some instances, the LPR raw data 416 may be collected from manual entry. At block 406, WiFi MAC addresses may be collected from various sources as WiFi MAC raw data 418. For example, the WiFi MAC raw data 418 may be collected from the electronic device identification sensors 130 in the electronic device detection zones 132. In some instances, trusted WiFi MAC addresses may be manually obtained from certain people owning electronic devices in an area covered by the electronic device detection zones 132 and stored in a database (e.g., personal identification database 119). In some embodiments, cellular raw data (e.g., cellular MAC addresses) may be collected from electronic device identification sensors 130. At block 408, Bluetooth MAC addresses may be collected from various sources as raw data 420. For example, the Bluetooth MAC raw data 420 may be collected from the electronic device identification sensors 130 in the electronic device detection zones 132. In some instances, trusted Bluetooth MAC addresses may be manually obtained from certain people owning electronic devices in an area covered by the electronic device detection zones 132 and stored in a database (e.g., personal identification database 119). At block 409, face images may be collected by the one or more cameras 120 in the facial detection zones 150. Facial recognition may be performed to detect and recognize faces in the face images.

[0070] At block 422, the LPR raw data 416, the WiFi MAC raw data 418, the Bluetooth MAC raw data 420, the cellular raw data, and/or the face raw data 451 may be correlated or paired to generate matched data 424. That is, the data from license plate ID detection, LPR systems, personal electronic device detection, and/or facial information may be combined to generate matched data 424 and stored in the database 117 and/or 119. In some embodiments, the license plate IDs are compared to the database 117 of trusted vehicle license plate IDs to determine whether the detected license plate ID is in the trusted vehicle license plate ID database 117. If not, the vehicle 126 may be identified as a suspicious vehicle and the license plate ID of the vehicle may be correlated with at least one of the set of stored electronic device IDs 133. This may result in creation of a database of detected electronic device identifiers 133 correlated with license plate IDs and facial information of individuals. Any unpaired data may be discarded after unsuccessful pairing.
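The sketch below illustrates one way the pairing at block 422 might be performed, assuming the LPR raw data 416 and WiFi MAC raw data 418 are held in pandas DataFrames; the column names and the 5-second tolerance are assumptions, and dropping unmatched rows mirrors the discarding of unpaired data.

```python
# Illustrative pairing sketch for block 422, assuming pandas DataFrames;
# column names and the 5-second tolerance are hypothetical.
import pandas as pd

lpr = pd.DataFrame({
    "timestamp": pd.to_datetime(["2020-06-01 10:00:01", "2020-06-01 10:05:30"]),
    "plate_id": ["ABC123", "XYZ789"],
}).sort_values("timestamp")

wifi = pd.DataFrame({
    "timestamp": pd.to_datetime(["2020-06-01 10:00:03", "2020-06-01 10:20:00"]),
    "wifi_mac": ["00:11:22:33:FF:EE", "AA:BB:CC:DD:EE:FF"],
}).sort_values("timestamp")

matched = pd.merge_asof(
    lpr, wifi, on="timestamp", direction="nearest",
    tolerance=pd.Timedelta(seconds=5),
)
matched = matched.dropna(subset=["wifi_mac"])  # discard unpaired detections
print(matched)
```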
[0071] At block 426, owner data of the electronic devices and/or vehicle may be added to the matched data 424. The owner data may include an owner ID, and/or name, address, and the like. Further, at block 428, the owner's phone number and email may be added to the matched data. In addition, WiFi / Bluetooth MAC / cellular data and owner data 412 from the law evidence may be included with the matched data 424 and the personal information of the owner to generate matched data with owner information 430. Accordingly, the owner ID may be associated with combined personal information (e.g., name, address, phone number, email, etc.), vehicle information (e.g., license plate ID, make, model, color, year, vehicle owner information, etc.), and electronic device IDs 133 (e.g., WiFi MAC address, Bluetooth MAC address). At block 432, the matched data with owner information 430 may be further processed (e.g., formatted, edited, etc.) to generate matchable data. This may conclude the setup phase.

[0072] Next, the method 400 may include a monitoring phase. During this phase, the method 400 may include monitoring steps 442, 444, and 445. At block 442, WiFi MAC address monitoring may include one or more electronic device identification sensors 130 detecting and storing a set of WiFi MAC addresses as WiFi MAC raw data 448. In some embodiments, cellular signal monitoring may include one or more electronic device identification sensors 130 detecting and storing a set of cellular MAC addresses as cellular raw data. At block 444, Bluetooth MAC address monitoring may include one or more electronic device identification sensors 130 detecting and storing a set of Bluetooth MAC addresses as Bluetooth MAC raw data 450. At block 451, face monitoring may include the one or more cameras 120 capturing face images and recognizing faces in the face images as face raw data 451. The WiFi MAC raw data 448, Bluetooth MAC raw data 450, and/or face raw data 451 may be compared to matchable data 432 at decision block 452.
[0073] At block 452, the electronic device IDs 133 and/or faces detected by the electronic device identification sensors 130 and/or the cameras 120 may be compared to the matchable data 432. The matchable data 432 may include personal identification information that is retrieved from at least the personal identification database 119. That is, the detected electronic device IDs 133 and/or faces may be compared to the database 117 and/or 119 to find any correlation of the detected electronic device IDs 133 and/or faces with license plate IDs.
[0074] If there is a matching electronic device ID to the detected electronic device ID and/or a matching face to the detected face, and there is a correlation with a license plate ID in the database 117 and/or 119, then a suspicious vehicle 126 / individual 142 may be detected. At block 456, the detected match event may be logged. At block 456, the user interface of the client application 104 executing on the computing device 102 may present an alert of the suspicious vehicle 126 / individual 142. At block 456, the detected notification event may be logged. At block 458, the electronic device 140 of the suspicious individual 142 may be notified that his presence is known (e.g., taunted). At block 456, the taunting event may be logged.
[0075] At decision block 460, the crime watch list 414 may be used to determine if the identified individual 142 is on the crime watch list 414 using the individual’s personal information. If the individual 142 is on the watch list 414, then at block 462, the appropriate law enforcement agency may be notified. At block 456, the law enforcement agency notification event may be logged.
[0076] FIGURE 5 illustrates example user interfaces presented on computing devices during monitoring of vehicle traffic, according to certain embodiments of this disclosure. It should be noted that a user interface 500 may present vehicle information and electronic device information in a single user interface. When a suspicious vehicle 126 / individual 142 is detected based on the vehicle license plate ID and/or the electronic device IDs 133, a notification may be presented on the user interface 500 of the client application 104 executing on the computing device 102 of a user (e.g., homeowner, neighbor, interested citizen). As depicted, the notification includes an alert displaying vehicle information and electronic device information. The vehicle information includes the "Make: Jeep", "Model: Wrangler", "License Plate ID: ABC123". The electronic device information includes "Electronic Device ID: 00:11:22:33:FF:EE", "Belongs to: John Smith", "Phone Number: 123-456-7890". Further, the user interface 500 presents that the owner has a warrant out for his arrest. The notification event may be logged in the database 117 / 119 or any suitable database of the system 100.
[0077] The user interface 500 includes various preventative action options represented by user interface elements 502 and 504. For example, user interface element 502 may be associated with contacting the detected suspicious individual 142 directly. Upon selection of the user interface element 502, the user may be able to send a text message to the electronic device 140 of the suspicious individual 142. For example, the text message may read "Please leave the area immediately, or I will contact law enforcement." However, any suitable message may be sent. The message / taunting event may be logged in the database 117 / 119 or any suitable database of the system 100.
[0078] Since the suspicious individual 142 has a warrant out for his arrest and/or is on a crime watch list, the user interface element 504 may be displayed that provides the option to notify law enforcement. Upon selection of the user interface element 504, a notification may be transmitted to a computing device 102 of a law enforcement agency. The notification may include vehicle information (e.g., “License Plate ID: ABC 123”), electronic device information (e.g., “Electronic Device ID: 00:11:22:33:FF:EE”), as well as location of the detection (e.g., “Geographic Location: latitude 47.6° North and longitude 122.33° West”), and personal information (“Name: John Smith”, “Phone Number: 123-456-7890”, a face of the individual 142). The law enforcement agency event may be logged in the database 117 / 119 or any suitable database of the system 100.
[0079] Below are example data tables that may be used to implement the system and method for monitoring vehicle traffic disclosed herein. The data tables may include: Client and ID Tables (logID, loginAttempts, clientUser, lawUser, billing), Data Site Info (monitoredSites, dataSites, dataGroups), Raw Collection Data (rawWiFiDataFound, rawBTDataFound, rawLPRDataFound, pairedData), Monitor Data Raw & Matched (monWiFiDataDetected, monBTDataDetected, monWiFiDataMatched, monBTDataMatched), Subject Data (subjectMatch, subjectInfo, subjectLastSeen, criminalWatchList), Notification Logs (subNotifyLog, subNotifyReplyLog, clientNotifyLog).
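Because the column definitions of the listed tables appear only as images in the source, the sketch below expresses two of the tables as SQLite DDL for illustration; every column name and type is an assumption, not the disclosed schema.

```python
# Illustrative schema sketch for two of the listed tables; all columns are
# assumed for the example and do not reproduce the original table images.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE rawWiFiDataFound (
    id          INTEGER PRIMARY KEY,
    dataSiteId  INTEGER,            -- which data site (see dataSites) made the detection
    wifiMac     TEXT,               -- detected WiFi MAC address
    rssi        INTEGER,            -- signal strength at detection time
    detectedAt  TEXT                -- timestamp of the detection
);
CREATE TABLE pairedData (
    id          INTEGER PRIMARY KEY,
    plateId     TEXT,               -- license plate ID from LPR data
    wifiMac     TEXT,               -- correlated WiFi MAC address
    btMac       TEXT,               -- correlated Bluetooth MAC address
    pairedAt    TEXT                -- when the correlation was made
);
""")
```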
Table 1: logID
[0080] Table 1: logID is used for login IDs/passwords, authentication, and password resets.
Table 2: loginAttempts
[0081] Table 2: loginAttempts logs the number of times logins were attempted, for both successes and failures.
Table 3: clientUser
[0082]
Table 4: lawUser
[0083] Table 4: lawUser includes information for law enforcement personnel wanting to be notified of suspicious vehicles 126 / individuals 142.
[0084]
Table 6: monitoredSites
[0085] Table 6: monitoredSites includes information for WiFi / Bluetooth monitoring for detection, among other things.
Table 7: dataSites
[0086] Table 7: dataSites includes information for WiFi / Bluetooth / License Plate Registration detection sites. These sites may supply data to databases, among other things.
Table 8: dataGroups
[0087] Table 8: dataGroups may group data sites and monitored sites into groupings such as homeowner associations, neighborhoods, etc.
Table 9: rawWiFiDataFound
[0088] Table 9: rawWiFiDataFound includes raw data dump for WiFi from detection sites used to look for matches.
Table 10: rawBTDataFound
[0089] Table 10: rawBTDataFound includes raw data dump for Bluetooth from detection sites used to look for matches.
[0090] Table 11: rawLPRDataFound may include raw LPR data from detection sites used to look for matches.
Table 12: pairedData
[0091] Table 12: pairedData includes matched data that may be the correlation between vehicle information (e.g., license plate IDs) and electronic device IDs 133.
Table 13: monWiFiDataDetected
[0092] Table 13: monWiFiDataDetected logs any MAC address data detected before matching for WiFi.
Table 14: monBTDataDetected
[0093] Table 14: monBTDataDetected logs any MAC address data detected before matching for Bluetooth.
Table 15: monWiFiDataMatched
[0094] Table 15: monWiFiDataMatched logs any matches that monitored sites find in the database for WiFi.
Table 16: monBTDataMatched
[0095] Table 16: monBTDataMatched logs any matches that monitored sites find in the database for Bluetooth.
Table 17: subjectMatch
[0096] Table 17: subjectMatch includes the number of times a subject was detected at monitored sites and data sites.
Table 18: subjectInfo
[0097] Table 18: subjectInfo includes information obtained for the registered owner of the vehicle.
Table 19: subjectLastSeen
[0098] Table 19: subjectLastSeen includes locations where the subject was seen, with timestamps.
Table 20: criminalWatchList
[0099] Table 20: criminalWatchList includes a criminal watch list that is compared to subjects / individuals 142 to determine if they are a criminal and who to notify if found.
Table 21: subNotifyLog
[0100] Table 21: subNotifyLog includes notifications sent to the subject to discourage crime.
Table 22: subNotifyReplyLog
[0101] Table 22: subNotifyReplyLog includes any replies from the subject after notification.
Table 23: clientNotifyLog
[0102] Table 23: clientNotifyLog includes log of notification attempts to the client (e.g., computing device 102 of a user).
[0103] FIGURE 6 illustrates another high-level component diagram of an illustrative system architecture 600 including an artificial intelligence engine 170, according to certain embodiments of this disclosure. The system architecture 600 of FIGURE 6 is substantially similar to the system architecture 100 of FIGURE 1. The components that are depicted in FIGURE 6 may be described similarly to their like components depicted in FIGURE 1. The additional components in FIGURE 6 include the artificial intelligence engine 170 and the one or more machine learning models 172. Although shown separately from the one or more servers 118, the artificial intelligence engine 170 may be hosted and/or executed by the one or more servers 118. The artificial intelligence engine 170 uses one or more machine learning models 172 to perform at least one of the embodiments disclosed herein.
[0104] In some embodiments, the cloud-based computing system 116 may include a training engine 174 capable of generating the one or more machine learning models 172. The machine learning models 172 may be trained to determine a probability of occurrence of a subsequent event based on one or more identifiers of one or more people, information pertaining to an incident that occurred at a location where the one or more people were present at a particular time, additional information (e.g., criminal record, mugshot, electronic medical record, etc.) pertaining to the one or more people, or some combination thereof. Further, the one or more machine learning models 172 may be trained to determine a preventative action to select and perform based on a severity of a subsequent incident that is determined to occur and/or a probability of occurrence of the subsequent incident. For example, the machine learning model may be trained using training data that indicates certain preventative actions have higher success rates of reducing a probability that the subsequent event occurs. The one or more machine learning models 172 may be generated by the training engine 174 and may be implemented in computer instructions executable by one or more processing devices of the artificial intelligence engine 170, the training engine 174, and/or the servers 118. To generate the one or more machine learning models 172, the training engine 174 may train the one or more machine learning models 172.
[0105] The training engine 174 may be a rackmount server, a router computer, a personal computer, a portable digital assistant, a smartphone, a laptop computer, a tablet computer, a netbook, a desktop computer, an Internet of Things (IoT) device, any other desired computing device, or any combination of the above. The training engine 174 may be cloud-based, be a real time software platform, include privacy software or protocols, and/or include security software or protocols.
[0106] To generate the one or more machine learning models 172, the training engine 174 may train the one or more machine learning models 172. The training engine 174 may use a base data set of a set of identifiers (e.g., electronic device IDs associated with the people, license plate numbers of the vehicles registered to the people, images of the people, etc.) of people that were present at a location an incident occurred at a particular time, information (e.g., type of incident, location of incident, time of incident, criminal or civil, damage to property, harm to people, etc.) pertaining to the incident, additional information pertaining to the people (e.g., one or more criminal records of the people, one or more mugshots of the people, addresses of the people, electronic medical records of the people, fingerprints of the people, images of the people, ages of the people, names of the people, email address associated with the people, phone numbers associated with the people, indications of the people being on a watch list, or some combination thereof) present at the incident at the location at the particular time, and/or a pattern recognized using the one or more identifiers, the information pertaining to the incident, or some combination thereof. The machine learning models 172 may be trained to receive, as input, subsequent identifiers associated with a person, information about incidents, and/or additional information pertaining to the person, and output a probability of occurrence of a subsequent incident. For example, if an identifier of a person is detected as being present at a location where a riot occurs (e.g., incident) with a certain threshold of other identifiers of other people also present at the location where the riot occurs, there may be a correlation between those identifiers and incidents occurring. Accordingly, if the identifier of the person is detected the next night at another location where the other identifiers of the other people are also detected, the probability of a subsequent incident occurring may be high. In other words, the people may be working together to initiate and/or instigate riots. Thus, using the trained machine learning models 172, subsequent incidents may be prevented.
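The disclosure above does not mandate a specific model family or feature encoding for this training step. Purely as an illustrative sketch, the training and probability-output steps might be approximated as follows, where the feature names, the toy training records, and the use of scikit-learn's LogisticRegression are assumptions for illustration rather than the claimed implementation.

```python
# Illustrative sketch only: feature encoding, field names, and the choice of
# scikit-learn's LogisticRegression are assumptions, not the claimed implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression

def encode_features(record):
    """Turn one observation into a numeric feature vector (hypothetical encoding)."""
    return np.array([
        record["num_known_identifiers_present"],   # co-located identifiers seen before
        record["prior_incidents_for_identifier"],  # how often this identifier was near incidents
        record["incident_severity"],               # e.g., 0 = none, 1 = minor, 2 = major
        record["has_criminal_record"],             # 0 or 1, from additional information
        record["hour_of_day"],                     # time context of the detection
    ], dtype=float)

# Historical observations labeled with whether a subsequent incident occurred.
training_records = [
    {"num_known_identifiers_present": 12, "prior_incidents_for_identifier": 3,
     "incident_severity": 2, "has_criminal_record": 1, "hour_of_day": 21},
    {"num_known_identifiers_present": 1, "prior_incidents_for_identifier": 0,
     "incident_severity": 0, "has_criminal_record": 0, "hour_of_day": 14},
]
labels = [1, 0]  # 1 = a subsequent incident occurred, 0 = it did not

X = np.stack([encode_features(r) for r in training_records])
model = LogisticRegression().fit(X, labels)

# Later, the probability of occurrence for a new detection:
new_detection = {"num_known_identifiers_present": 9, "prior_incidents_for_identifier": 2,
                 "incident_severity": 2, "has_criminal_record": 1, "hour_of_day": 21}
probability = model.predict_proba(encode_features(new_detection).reshape(1, -1))[0, 1]
```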
[0107] The one or more machine learning models 172 may refer to model artifacts created by the training engine 174 using training data that includes training inputs and corresponding target outputs. The training engine 174 may find patterns in the training data wherein such patterns map the training input to the target output and generate the machine learning models 172 that capture these patterns. Although depicted separately from the server 118, in some embodiments, the training engine 174 may reside on server 118. Further, in some embodiments, the artificial intelligence engine 170, the database 117 / 119, and/or the training engine 174 may reside on the computing device 102.
[0108] The one or more machine learning models 172 may comprise, e.g., a single level of linear or non-linear operations (e.g., a support vector machine [SVM]) or the machine learning models 172 may be a deep network, i.e., a machine learning model comprising multiple levels of non-linear operations. Examples of deep networks are neural networks, including generative adversarial networks, convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks (e.g., each neuron may transmit its output signal to the input of the remaining neurons, as well as to itself). For example, the machine learning model may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.
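As a minimal, non-authoritative illustration of the dot-product computations through hidden layers mentioned above, a forward pass of a small fully connected network could be sketched as follows; the layer sizes, random weights, and activation choices are arbitrary placeholders.

```python
# Minimal sketch of a fully connected forward pass; layer sizes and weights are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Input: a feature vector for one detection (5 hypothetical features).
x = rng.normal(size=5)

# Two hidden layers and an output neuron; each layer is a dot product plus bias.
W1, b1 = rng.normal(size=(8, 5)), np.zeros(8)
W2, b2 = rng.normal(size=(4, 8)), np.zeros(4)
W3, b3 = rng.normal(size=(1, 4)), np.zeros(1)

h1 = relu(W1 @ x + b1)                  # first hidden layer
h2 = relu(W2 @ h1 + b2)                 # second hidden layer
probability = sigmoid(W3 @ h2 + b3)[0]  # output interpreted as a probability
```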
[0109] Recurrent neural networks include the functionality, in the context of a hidden layer, to process information sequences and store information about previous computations. As such, recurrent neural networks may have or exhibit a “memory.” Recurrent neural networks may include connections between nodes that form a directed graph along a temporal sequence. Keeping and analyzing information about previous states enables recurrent neural networks to process sequences of inputs to recognize patterns (e.g., sequences of detected identifiers and their correlations with certain types of incidents). Recurrent neural networks may be similar to Markov chains. For example, Markov chains may refer to stochastic models describing sequences of possible events in which the probability of any given event depends only on the state information contained in the previous event. Thus, Markov chains also use an internal memory to store at least the state of the previous event. These models may be useful in determining causal inference, such as whether an event at a current node changes as a result of the state of a previous node changing.
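To make the Markov-chain analogy concrete, a toy first-order transition model over event states might look like the following; the states and transition probabilities are invented purely for illustration.

```python
# Toy Markov chain: probability of the next event depends only on the current state.
# The states and transition probabilities here are invented for illustration.
transitions = {
    "no_incident": {"no_incident": 0.90, "gathering": 0.08, "riot": 0.02},
    "gathering":   {"no_incident": 0.50, "gathering": 0.35, "riot": 0.15},
    "riot":        {"no_incident": 0.30, "gathering": 0.40, "riot": 0.30},
}

def next_state_probability(current_state, next_state):
    """P(next event | current event) under the first-order Markov assumption."""
    return transitions[current_state][next_state]

# Example: probability a riot follows an observed gathering.
p = next_state_probability("gathering", "riot")  # 0.15
```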
[0110] FIGURE 7 illustrates an example scenario 700 where a preventative action is performed based on a probability of occurrence of a subsequent incident, according to certain embodiments of this disclosure. As may be appreciated, the presence of one or more people at a location where an incident occurs, such as a riot, may be an indicator that the one or more people are likely to riot in the future. The probability of a riot occurring again may increase if the same one or more people are determined to be within a certain proximity to each other at a later date. For example, some people work together to incite and/or start riots. These types of people may be associated with additional information (e.g., criminal records) that may further increase the probability that a subsequent incident may occur when the one or more people are gathered. Thus, some purposes of the present disclosure are to track certain people using identifiers associated with the people, determine probabilities of occurrences based on the tracked identifiers, information pertaining to incidents, and/or additional information pertaining to the people, and perform preventative actions. The preventative action may include distally controlling other electronic devices to activate, deactivate, actuate, extend, retract, present notifications, etc.
[0111] Some technical benefits of the present disclosure may include accurate location tracking of people, vehicles, and the like. Further, based on the accurate location tracking, some embodiments may enable determining a probability of occurrence of a subsequent incident using a trained machine learning model, and performing a preventative action based on the probability of occurrence. The preventative action may be performed by the cloud-based computing system 116 using various APIs and/or services of electronic devices, and thus, the present disclosure enables interoperability between electronic devices (e.g., smartphones) of people, the cloud-based computing system 116, and various other electronic devices (e.g., alarm systems, appliances, lights, locks, speakers, computing devices, irrigation systems, etc.). As a result of performing the preventative action, the subsequent incident may be thwarted and/or mitigated (e.g., the amount of personal and property damage is lessened relative to what it may have been if the preventative action had not been performed).
[0112] As depicted, at time T1, an incident occurred at a location at a first time (e.g., 9 PM January 1st). The location where the incident occurred may be in electronic device detection zone 132-2 in which one or more electronic device identification sensors 130 are installed. The one or more electronic device identification sensors 130 may detect the electronic device 140 of the person 142 in the electronic device detection zone 132-2 at the location where, and the first time when, the incident occurred. An identifier may be transmitted to the cloud-based computing system 116 via the network 112. In some embodiments, the identifier may be an electronic device ID 133 (e.g., WiFi MAC address, Bluetooth MAC address, etc.). It should be noted that if numerous people are carrying their electronic devices in the electronic device detection zone 132-2, the one or more electronic device identification sensors 130 may transmit the identifiers (e.g., electronic device IDs 133) associated with each person in the electronic device detection zone 132-2 where the incident occurred.
[0113] In some embodiments, the cloud-based computing system 116 may receive information pertaining to the incident that occurred at the location where the person 142 was present at the first time. The information pertaining to the incident may include a description of the incident that occurred, a type of incident (e.g., criminal, civil, riot, vandalism, looting, drug trafficking, human trafficking, loitering, etc.), a timestamp of the incident, a duration of the incident, a location of the incident, and the like. The identifiers of the people and the information of the incident may be correlated and stored in database 119 and/or 117 for use by the trained machine learning models 123 to continuously update their determinations of probabilities of occurrences of subsequent incidents.
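The disclosure does not fix a schema for correlating identifiers with incident information before storage in database 117 / 119. A minimal sketch using Python's built-in sqlite3 module, with a hypothetical table and column names, could look like this:

```python
# Hypothetical sketch: correlating detected identifiers with incident information.
# The schema and column names are assumptions for illustration only.
import sqlite3

conn = sqlite3.connect("incidents.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS incident_identifier (
        identifier     TEXT,    -- e.g., WiFi/Bluetooth MAC address or plate number
        incident_type  TEXT,    -- e.g., 'riot', 'vandalism'
        incident_time  TEXT,    -- ISO 8601 timestamp of the incident
        latitude       REAL,
        longitude      REAL
    )
""")

def correlate(identifier, incident_type, incident_time, latitude, longitude):
    """Store one identifier/incident correlation for later use by the trained models."""
    conn.execute(
        "INSERT INTO incident_identifier VALUES (?, ?, ?, ?, ?)",
        (identifier, incident_type, incident_time, latitude, longitude),
    )
    conn.commit()

correlate("00:11:22:33:FF:EE", "riot", "2020-01-01T21:00:00", 47.6, -122.33)
```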
[0114] As depicted, a computing device 702 may provide the information pertaining to the incident to the cloud-based computing system 116 and/or additional information pertaining to the people present at the location of the incident at time T1. The computing device 702 may be associated with any suitable source that provides information pertaining to incidents that occur. For example, the computing device 702 may be associated with a law enforcement agency that provides police reports generated as a result of the incident, mugshots of people detected as being present at the location during the incident, criminal records of people detected as being present at the location during the incident, and so forth. The criminal record may indicate that a person is a three-time convicted felon for armed robbery, and a machine learning model 172 may be trained to output a high probability of occurrence of a subsequent incident at a subsequent time when the person is detected if the person was recently detected as being present at a location where an armed robbery occurred. The computing device 702 may be associated with a healthcare facility that uses an electronic medical record (EMR) system. The EMR system may transmit information (e.g., medical records) of people to the cloud-based computing system 116 for use when determining the probability of occurrence of subsequent incidents. For example, the medical record may provide an indication if the person has been diagnosed as having a mental health related condition, which may be a relevant factor when determining whether a subsequent incident may occur. The computing device 702 may be associated with a news broadcasting entity, a social network entity, a social media entity, or the like.
[0115] The machine learning model 172 may receive the one or more identifiers (e.g., electronic device IDs 133) associated with the people present at the incident at the first time, the information pertaining to the incident, and/or the additional information pertaining to the people. In some embodiments, the cloud-based computing system 116 may correlate the identifiers, information pertaining to the incident, and/or the additional information pertaining to the people and store the correlated data in a database. In some embodiments, the machine learning models may be trained to determine probabilities of occurrences of subsequent incidents using the correlated data (e.g., the identifiers associated with the people, information pertaining to the incident, additional information pertaining to the people, etc.).
[0116] At time T2 (e.g., 9 PM January 2nd), subsequent to time T1 but prior to a subsequent incident occurring, the identifier of the electronic device 140 may be detected in the electronic device detection zone 132-2 again. In some embodiments, there may be a threshold number of the same identifiers detected in the electronic device detection zone 132-2 as were detected the night before when the incident occurred at time T1. The identifier may be transmitted to the cloud-based computing system 116, along with information pertaining to the location (e.g., GPS coordinates, etc.). The cloud-based computing system 116 may use the identifier to obtain any additional data about the people (e.g., license plate numbers of vehicles registered to the people, criminal records of the people, mugshots of the people, medical records of the people, etc.). In some embodiments, the AI engine 170 may input the identifiers of the people, the information pertaining to the location where the electronic devices 140 are located at time T2, and/or the additional information pertaining to the people into the machine learning model 172. The trained machine learning model may receive the input and output a probability of occurrence of a subsequent incident 703. The probability of occurrence of a subsequent incident 703 may be a value, a percentage, a number, or the like.
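The “threshold number of the same identifiers” check described above can be expressed, for illustration, as a simple set intersection; the threshold value here is an arbitrary placeholder:

```python
# Sketch of the overlap check described above; the threshold value is arbitrary.
def repeat_presence(ids_at_t1, ids_at_t2, threshold=5):
    """Return the identifiers seen at both times and whether the overlap meets the threshold."""
    overlap = set(ids_at_t1) & set(ids_at_t2)
    return overlap, len(overlap) >= threshold

ids_t1 = {"00:11:22:33:FF:EE", "AA:BB:CC:DD:EE:01", "AA:BB:CC:DD:EE:02"}
ids_t2 = {"00:11:22:33:FF:EE", "AA:BB:CC:DD:EE:02", "AA:BB:CC:DD:EE:09"}
overlap, meets_threshold = repeat_presence(ids_t1, ids_t2, threshold=2)
```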
[0117] The probability of occurrence may be used by the AI engine 170 to determine whether or not to perform a preventative action (e.g., the probability of occurrence has to satisfy a threshold) and which preventative action to perform. One or more machine learning models 172 may be trained to input the probability of occurrence of the subsequent incident 703, the information pertaining to the location, the identifiers associated with the people, and/or the additional information pertaining to the people, and to output a preventative action that is likely to squash, thwart, mitigate, and/or prevent the subsequent incident from occurring. In some embodiments, the higher the probability of occurrence of the subsequent incident 703, the more severe the preventative action that may be selected and performed. The machine learning model 172 may determine the preventative action to perform, and the preventative action 706 may be performed. In one example, the preventative action 706 may include causing an alarm system 704 to activate in the electronic device detection zone 132-2 to attempt to scare the people 142 out of the zone. As described further below with reference to FIGURE 10, there are a multitude of preventative actions that may be performed based on the probability of occurrence of a subsequent incident.
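A minimal sketch of mapping the probability of occurrence 703 and an estimated severity to a preventative action could be a threshold rule such as the following; the thresholds and action names are invented for illustration and are not the claimed selection logic:

```python
# Sketch of threshold-based action selection; thresholds and action names are invented.
def select_preventative_action(probability, severity):
    """Map the model's probability output and an estimated severity to an action."""
    if probability >= 0.9 and severity >= 2:
        return "notify_law_enforcement"
    if probability >= 0.7:
        return "activate_alarm_system"
    if probability >= 0.5:
        return "turn_on_lights_and_cameras"
    return "log_only"

action = select_preventative_action(probability=0.82, severity=2)  # "activate_alarm_system"
```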
[0118] FIGURE 8 illustrates an example method 800 for performing, based on a probability of occurrence of a subsequent incident, a preventative action, according to certain embodiments of this disclosure. The method 800 may be performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), firmware, software, or a combination thereof. The method 800 and/or each of its individual functions, subroutines, or operations may be performed by one or more processors of one or more of the devices in FIGURE 1 (e.g., computing device 102, cloud-based computing system 116 including servers 118, cameras 120, electronic device identification sensors 130) implementing the method 800. For example, a computing system may refer to the computing device 102 or the cloud-based computing system 116. The method 800 may be implemented as computer instructions that, when executed by a processing device, execute the operations. In certain implementations, the method 800 may be performed by a single processing thread. Alternatively, the method 800 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the method 800.
[0119] At block 802, an identifier associated with a person may be received at a processor. The identifier may be received from a location where the person was present at a first time. In some embodiments, one or more location services applications executing on one or more electronic devices 140 may transmit the location(s) (e.g., using GPS) of the one or more electronic devices 140 of one or more people. In some embodiments, the identifier may be an electronic device ID 133 (e.g., media access control (MAC) address (e.g., WiFi and/or Bluetooth)), an image of the person (e.g., obtained via camera 120), a license plate number of a vehicle registered to the person, or some combination thereof. In some embodiments, electronic device identification sensors 130 may be configured to detect one or more electronic device IDs 133 (e.g., WiFi MAC addresses, Bluetooth MAC addresses, and/or cellular MAC addresses) of electronic device 140 within the electronic device detection zone 132. Further, one or more cameras 120 may be used in one or more license plate detection zones to capture images of license plates of vehicles of people and the images may be processed to determine the license plate numbers of the vehicles registered to the people. One or more images of people may be obtained in one or more facial detection zones 150 and the one or more images may be processed (e.g., facial recognition techniques) to determine identities of the people.
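Because the identifiers may arrive from heterogeneous sensors (WiFi/Bluetooth sniffers, LPR cameras, facial recognition), a small normalization step before correlation is one plausible implementation detail; the formats assumed below are illustrative only:

```python
# Sketch of identifier normalization before correlation; formats are assumptions.
import re

def normalize_mac(raw):
    """Uppercase, colon-separated MAC address, e.g. '00:11:22:33:FF:EE'."""
    digits = re.sub(r"[^0-9A-Fa-f]", "", raw).upper()
    if len(digits) != 12:
        raise ValueError(f"not a MAC address: {raw!r}")
    return ":".join(digits[i:i + 2] for i in range(0, 12, 2))

def normalize_plate(raw):
    """Uppercase plate with whitespace removed, e.g. 'ABC123'."""
    return re.sub(r"\s+", "", raw).upper()

normalize_mac("00-11-22-33-ff-ee")  # '00:11:22:33:FF:EE'
normalize_plate("abc 123")          # 'ABC123'
```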
[0120] In some embodiments, the identifier may be associated with the person prior to or concurrently with block 802 occurring. For example, the identifier of the person may be a MAC address of the electronic device 140 of the person 142 and the MAC address may have been associated with the person when the person purchased the electronic device 140. In some embodiments, as described herein, the identifier associated with the person, the license plate number of the vehicle registered to the person, and/or an image of the person may be correlated together when such information is received at the cloud-based computing system 116.
[0121] At block 804, information pertaining to an incident that occurred at the location where the person was present at the first time may be received. For example, the information may be received via any suitable source, such as an API associated with a law enforcement agency, fire department, healthcare facility, etc. The source may be a news broadcasting channel, website, and/or application; a media channel, website, and/or application; a social media website and/or application; or the like. The incident may include rioting, arson, looting, robbery, violence, assault, battery, drug trafficking, human trafficking, kidnapping, any type of criminal activity, or the like. In some embodiments, the incident may include any suitable incident that may be defined and programmed to be tracked by the cloud-based computing system 116. For example, the incident may be non-criminal activity, such as peaceful protests, concert events, sporting events, movie theater events, gatherings, a presence of the person in a particular zone, etc. For example, the disclosed techniques may be used to track the location of the person using which zone the person has entered (e.g., license plate detection zone 122, electronic device detection zone 132, facial detection zone 150, manual input zone 160, etc.).
[0122] In some embodiments, the incident may include the person being present at the location, which may trigger a preventative action to occur. For example, if a person is on a ban list and not allowed on a certain property, their detected presence at the location of the certain property may cause the person to be kicked off the property by security. In other examples, if a first person has a restraining order against a second person and the locations of both the first and second person are detected within a certain distance prohibited by the restraining order, a preventative action may be performed (e.g., notify the electronic device of the second person to honor the restraining order and move farther away from the first person, notify the electronic device of the first person that the second person is within the prohibited distance defined by the restraining order, notify a law enforcement agency, etc.). In another example, if a person is on a wanted list, and their identifier is detected at a certain location, emergency services (e.g., a law enforcement agency) may be notified and dispatched to the location to arrest the wanted person. In such examples, the identifier of the person may be received once from a particular location, and the detection of the person at the particular location may cause a preventative action to be performed. In such embodiments, it should be noted that any of the preventative actions described herein may be performed when the identifier of the person is detected at a certain location.
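The restraining-order example above reduces to a distance check between two reported locations; a haversine-based sketch, with the prohibited radius as a placeholder value, might look like:

```python
# Sketch of the restraining-order proximity check; the prohibited radius is a placeholder.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def violates_restraining_order(first_person, second_person, prohibited_radius_m=300):
    """True if the two detected locations are within the prohibited distance."""
    return haversine_m(*first_person, *second_person) <= prohibited_radius_m

violates_restraining_order((47.6000, -122.3300), (47.6010, -122.3310))  # True (~130 m apart)
```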
[0123] At block 806, the identifier associated with the person may be received at a second time subsequent to the first time. In some embodiments, the identifier associated with the person may be received at the second time from the location (e.g., same or similar location the identifier was received at the first time) or may be received at the second time from another location different than the location the identifier was received at the first time. For example, the identifier may be received at the first time (e.g., 9:00 PM on January 1st) at a first location (e.g., movie theater), where a riot erupted (incident occurred) and the identifier may be received at a second time (e.g., 9:00 PM on January 2nd) at a second location (e.g., grocery store). As described further herein, some embodiments may predict a probability of occurrence of a subsequent riot erupting (subsequent event occurring) at the second location based on the identifier of the person and information pertaining to the incident that occurred at the first location at the first time.
[0124] At block 808, the probability of occurrence of a subsequent incident may be determined by the processor via a trained machine learning model using the identifier associated with the person and the information pertaining to the incident that occurred at the location where the person was present at the first time. The subsequent incident may include a criminal offense, a civil offense, a triggered event (e.g., the presence of a person at a particular location where the person is banned from being, is restrained from being, or both; the presence of a wanted person at any detected location; etc.). In some embodiments, the triggered event may be a predefined triggered event. The cloud-based computing system 116 may provide a user interface that enables defining, programming, scripting, specifying, modifying, deleting, uploading, etc. any suitable triggered event for a subsequent incident that may cause the preventative action to be performed. In some embodiments, the machine learning model may be trained to determine the probability of occurrence of the subsequent incident based on a set of training data including at least one of (i) other identifiers of other people that were present at the location the incident occurred at the first time, (ii) one or more criminal records of the person, the other people, or some combination thereof, and/or (iii) a pattern recognized using the identifier, the information, the other identifiers of other people that were present at the location the incident occurred at the first time, the one or more criminal records, or some combination thereof.
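The predefined triggered events described above could, for illustration, be represented as simple declarative rules evaluated against each detection; the rule fields and list names below are hypothetical:

```python
# Hypothetical declarative rules for predefined triggered events.
triggered_event_rules = [
    {"name": "banned_person_on_property",
     "match": {"identifier_on_list": "ban_list", "zone": "property_A"},
     "action": "notify_security"},
    {"name": "wanted_person_detected",
     "match": {"identifier_on_list": "wanted_list", "zone": "any"},
     "action": "notify_law_enforcement"},
]

def evaluate_rules(detection, rules):
    """Return the actions of every rule whose conditions the detection satisfies."""
    actions = []
    for rule in rules:
        wanted_zone = rule["match"]["zone"]
        if rule["match"]["identifier_on_list"] in detection["lists"] and \
                wanted_zone in ("any", detection["zone"]):
            actions.append(rule["action"])
    return actions

detection = {"identifier": "00:11:22:33:FF:EE", "zone": "property_A", "lists": {"ban_list"}}
evaluate_rules(detection, triggered_event_rules)  # ['notify_security']
```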
[0125] At block 810, a preventative action may be performed based on the probability of occurrence of the subsequent incident. Example preventative actions are described in more detail below with reference to FIGURE 10. The preventative action that is performed may be selected based on a severity of the subsequent incident, the probability of occurrence of the subsequent incident, or both. For example, if the subsequent incident, such as a riot, is determined to have a severity above a certain threshold, the preventative action may be more drastic than if the severity is less than the certain threshold. For example, the preventative action for the riot may include notifying emergency services, such as a law enforcement agency, of the probability of occurrence of the subsequent incident at a particular location (e.g., geographical coordinates), which may result in the emergency services dispatching personnel to the particular location at the second time. Further, the preventative action may include causing an electronic device to activate, such as by turning on a live feed of a camera at a location where the subsequent incident is likely to occur. If the subsequent incident is determined to have a severity below a certain threshold, such as a peaceful protest, a less drastic preventative action may occur, such as causing an alarm system of a department store to arm just in case the peaceful protest turns into looting.
[0126] FIGURE 9 illustrates another example method 900 for performing, based on a probability of occurrence of a subsequent incident, a preventative action, according to certain embodiments of this disclosure. The method 900 may be performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), firmware, software, or a combination thereof. The method 900 and/or each of its individual functions, subroutines, or operations may be performed by one or more processors of one or more of the devices in FIGURE 1 (e.g., computing device 102, cloud-based computing system 116 including servers 118, cameras 120, electronic device identification sensors 130) implementing the method 900. For example, a computing system may refer to the computing device 102 or the cloud-based computing system 116. The method 900 may be implemented as computer instructions that, when executed by a processing device, execute the operations. In certain implementations, the method 900 may be performed by a single processing thread. Alternatively, the method 900 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the method 900. The method 900 may include blocks of operations that are performed prior to, concurrently with, or subsequently to the blocks of operations of the method 800 of FIGURE 8.
[0127] At block 902, additional information pertaining to the person may be received from a third-party source at a third time subsequent to the first and second times. The third-party source may include an API of a law enforcement agency, an electronic medical record system, a public data source, a website, a distribution system, an email system, or any suitable source. The additional information may include a criminal record of the person, a mugshot of the person, a fingerprint of the person, an image of the person, an electronic medical record of the person, an address of the person, an age of the person, a name of the person, an email address of the person, a phone number of the person, an indication of the person being on a watch list, or some combination thereof.
[0128] At block 904, the additional information may be correlated with at least the identifier of the person. For example, the criminal record of the person may be correlated with the electronic device ID associated with the person, the license plate number of the vehicle associated with the person, an image of the person, or the like. Accordingly, when the identifier is received at subsequent times for the person, the additional information may be retrieved and used to make predictions (probabilities of occurrences of subsequent incidents) and/or decisions that may cause performance of preventative actions.
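One illustrative way to make the correlated additional information retrievable by identifier at later times is a simple keyed merge; the field names below are invented for illustration:

```python
# Sketch of keying additional information by identifier for later retrieval;
# the field names are invented for illustration.
profiles = {}  # identifier -> accumulated additional information

def correlate_additional_info(identifier, **additional_info):
    """Merge newly received third-party information into the record for this identifier."""
    profiles.setdefault(identifier, {}).update(additional_info)

correlate_additional_info("00:11:22:33:FF:EE", criminal_record="armed robbery (3 counts)")
correlate_additional_info("00:11:22:33:FF:EE", on_watch_list=True)

# When the identifier is received again at a later time, the accumulated
# information can be retrieved and supplied to the trained model as input.
profiles.get("00:11:22:33:FF:EE")
# {'criminal_record': 'armed robbery (3 counts)', 'on_watch_list': True}
```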
[0129] At block 906, the identifier associated with the person may be received at a fourth time. The fourth time may be subsequent to the first, second, and/or third time. The identifier may be received from the electronic device 140 of the person 142 via the network 112.

[0130] At block 908, the probability of occurrence of a subsequent incident may be determined via the trained machine learning model using the identifier, the information pertaining to the incident that occurred at the location where the person was present at the first time, and/or the additional information.
[0131] At block 910, a preventative action may be performed based on the probability of occurrence of the subsequent incident. Various examples of preventative actions are discussed with reference to FIGURE 10.
[0132] FIGURE 10 illustrates an example method 1000 of performing one or more of various preventative actions, according to certain embodiments of this disclosure. The method 1000 may be performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), firmware, software, or a combination thereof. The method 1000 and/or each of its individual functions, subroutines, or operations may be performed by one or more processors of one or more of the devices in FIGURE 1 (e.g., computing device 102, cloud-based computing system 116 including servers 118, cameras 120, electronic device identification sensors 130) implementing the method 1000. For example, a computing system may refer to the computing device 102 or the cloud-based computing system 116. The method 1000 may be implemented as computer instructions that, when executed by a processing device, execute the operations. In certain implementations, the method 1000 may be performed by a single processing thread. Alternatively, the method 1000 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the method 1000. It should be noted that one or more of the preventative actions 1002, 1004, 1006, 1008, 1010, 1012, and/or 1014 may be performed in any combination simultaneously or in a sequence in a time series. Further, the preventative actions are for explanatory purposes and it is noted that additional preventative actions may be performed.

[0133] At block 1002, a first preventative action may include transmitting a notification to a computing device of an emergency responder. The notification may be presented on a user interface of the computing device of the emergency responder and may include graphical elements to take certain actions, such as initiate a phone call with another person’s computing device via a cellular carrier’s radio tower, send a text message to a computing device of another person, or the like. The computing device of the emergency responder may include a network interface device configured to connect to the network 112. The network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi. The cloud-based computing system 116 may communicate with the computing device of the emergency responder via the network 112.
[0134] At block 1004, a second preventative action may include causing an alarm system to activate. Activating the alarm system may include transmitting a control signal from the cloud-based computing system 116 to the alarm system to cause the alarm to arm itself, to disarm itself, to activate by emitting audio, light, causing the alarm system to transmit signals to other systems (e.g., emergency responder systems of law enforcement, fire departments, healthcare facilities, etc.), or some combination thereof. Accordingly, some embodiments of the present disclosure enable electronically controlling other devices from distal locations based on the determined probability of occurrence of subsequent incidents. The alarm system may include a network interface device configured to connect to the network 112. The network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi. The cloud-based computing system 116 may communicate with the alarm system via the network 112.
[0135] At block 1006, a third preventative action may include causing an electronic device to activate. The electronic device may be any suitable electronic device, such as an appliance, an electronic lock (e.g., lock or unlock a door and/or window and/or hatch and/or latch), a smart thermostat, a garage door, an actuating arm, a door, a window, a shutter, a gate, and the like. Activating may also refer to actuating, such as extending and/or retracting, opening and/or closing, locking and/or unlocking, turning on and/or turning off, etc. The electronic device may include a network interface device configured to connect to the network 112. The network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi. The cloud-based computing system 116 may communicate with the electronic device via the network 112.
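The disclosure does not specify the control protocol for activating such devices. Purely as an illustration of sending a control signal over the network 112, an HTTP-based sketch with an invented endpoint, payload, and token is shown below:

```python
# Purely illustrative: the URL, payload fields, and token below are invented and do
# not correspond to any particular device API.
import json
import urllib.request

def send_control_signal(device_url, command, token):
    """POST a JSON control command (e.g., 'lock', 'unlock', 'activate') to a device."""
    payload = json.dumps({"command": command}).encode("utf-8")
    request = urllib.request.Request(
        device_url,
        data=payload,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example (hypothetical endpoint): lock a smart lock as a preventative action.
# send_control_signal("https://device.example/api/v1/lock-1/commands", "lock", "TOKEN")
```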
[0136] At block 1008, a fourth preventative action may include causing a light to activate. The light may turn on or off, may change colors that are emitted, may change a brightness, may emit a strobe representing a pattern that encodes various messages, or the like. The light may be a smart light that includes a network interface device configured to connect to the network 112. The network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi. The cloud-based computing system 116 may communicate with the light via the network 112.
[0137] At block 1010, a fifth preventative action may include causing an event to be triggered. The event may include responding to a situation where a person is detected (e.g., via their identifier being received) in a zone where they are banned or restrained from being. A further event may include responding to a situation where a person wanted by a law enforcement agency is detected in a zone. The event may include transmitting notifications to certain computing devices according to a certain response protocol based on the type of event, or performing any combination of the disclosed preventative actions herein.
[0138] At block 1012, a sixth preventative action may include causing a speaker to emit a recorded message or audio. Instructions may be transmitted to the speaker and the instructions may include the message and/or audio to be emitted. For example, the message may state “Please step away from the area.” The speaker may include a network interface device configured to connect to the network 112. The network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi. The cloud-based computing system 116 may communicate with the speaker via the network 112.
[0139] At block 1014, a seventh preventative action may include causing an irrigation system to activate. Activating the irrigation system may include transmitting one or more control signals to the irrigation system to turn on and/or turn off one or more zones of the irrigation system. For example, if an incident is a fire or arson at a location, the irrigation system of that location may be activated to attempt to put out the fire with the water from the irrigation system. The irrigation system may include a network interface device configured to connect to the network 112. The network interface device may communicate with the network via a short range wireless protocol (e.g., Bluetooth, ZigBee, etc.) or a wireless local area network (WLAN), such as WiFi. The cloud-based computing system 116 may communicate with the irrigation system via the network 112.
[0140] FIGURE 11 illustrates an example computer system 1100 which can perform any one or more of the methods described herein, in accordance with one or more aspects of the present disclosure. In one example, computer system 1100 may correspond to the computing device 102, server 118 of the cloud-based computing system 116, artificial intelligence engine 170 of the cloud-based computing system 116, training engine 174 of the cloud-based computing system 116, the cameras 120, and/or the electronic device identification sensors 130 of FIGURE 1. The computer system 1100 may be capable of executing client application 104 of FIGURE 1. The computer system may be connected (e.g., networked) to other computer systems in a LAN, an intranet, an extranet, or the Internet. The computer system may operate in the capacity of a server in a client-server network environment. The computer system may be a personal computer (PC), a tablet computer, a wearable (e.g., wristband), a set-top box (STB), a personal digital assistant (PDA), a mobile phone, a camera, a video camera, an electronic device identification sensor, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device. Further, while only a single computer system is illustrated, the term “computer” shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
[0141] The computer system 1100 includes a processing device 1102, a main memory 1104 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 1106 (e.g., solid state drive (SSD), flash memory, static random access memory (SRAM)), and a data storage device 1108, which communicate with each other via a bus 1110.
[0142] Processing device 1102 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 1102 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 1102 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 1102 is configured to execute instructions for performing any of the operations and steps discussed herein.

[0143] The computer system 1100 may further include a network interface device 1112. The computer system 1100 also may include a video display 1114 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), one or more input devices 1116 (e.g., a keyboard and/or a mouse), and one or more speakers 1118 (e.g., a speaker). In one illustrative example, the video display 1114 and the input device(s) 1116 may be combined into a single component or device (e.g., an LCD touch screen).
[0144] The data storage device 1108 may include a computer-readable medium 1120 on which the instructions 1122 (e.g., implementing control system, user portal, clinical portal, and/or any functions performed by any device and/or component depicted in the FIGURES and described herein) embodying any one or more of the methodologies or functions described herein is stored. The instructions 1122 may also reside, completely or at least partially, within the main memory 1104 and/or within the processing device 1102 during execution thereof by the computer system 1100. As such, the main memory 1104 and the processing device 1102 also constitute computer-readable media. The instructions 1122 may further be transmitted or received over a network via the network interface device 1112.
[0145] While the computer-readable storage medium 1120 is shown in the illustrative examples to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.

[0146] Clauses:
[0147] 1. A method for using artificial intelligence to determine a probability of occurrence of a subsequent incident, the method comprising:
[0148] receiving, at a processor, an identifier associated with a person, wherein the identifier is received from a location where the person was present at a first time;
[0149] receiving information pertaining to an incident that occurred at the location where the person was present at the first time;
[0150] receiving, at a second time subsequent to the first time, the identifier associated with the person;
[0151] determining, by the processor via a trained machine learning model using the identifier and the information, the probability of occurrence of the subsequent incident; and
[0152] performing, based on the probability of occurrence of the subsequent incident, a preventative action.
[0153] 2. The method of clause 1, wherein the identifier comprises:
[0154] a media access control (MAC) address of a computing device of the person,
[0155] an image of the person,
[0156] a license plate number of a vehicle registered to the person, or
[0157] some combination thereof.
[0158] 3. The method of clause 1, further comprising:
[0159] receiving, from a third-party source at a third time subsequent to the first and second times, additional information pertaining to the person, wherein the additional information comprises a criminal record of the person, a mugshot of the person, a fingerprint of the person, an image of the person, an electronic medical record of the person, an address of the person, an age of the person, a name of the person, an email address, a phone number, an indication of the person being on a watch list, or some combination thereof; and
[0160] correlating the additional information with at least the identifier of the person.
[0161] 4. The method of clause 3, further comprising:
[0162] receiving, at a fourth time, the identifier associated with the person; and
[0163] determining, via the trained machine learning model using the identifier and the additional information, the probability of occurrence of the subsequent incident.
[0164] 5. The method of clause 1, wherein the identifier associated with the person is received at the second time from the location or at the second time at another location different than the location.
[0165] 6. The method of clause 1, further comprising training, based on the identifier and the information, the trained machine learning model to determine the probability of occurrence of the subsequent incident based on a plurality of training data comprising at least one of:
[0166] other identifiers of other people that were present at the location the incident occurred at the first time,
[0167] one or more criminal records of the person, the other people, or some combination thereof, or
[0168] a pattern recognized using the identifier, the information, the other identifiers of other people that were present at the location the incident occurred at the first time, the one or more criminal records, or some combination thereof.
[0169] 7. The method of clause 1, wherein the preventative action that is performed is selected based on a severity of the subsequent incident, the probability of occurrence of the subsequent incident, or both.

[0170] 8. The method of clause 1, performing the preventative action further comprises:
[0171] transmitting a notification to a computing device of an emergency responder;
[0172] transmitting a notification to a computing device having the identifier associated with the person;
[0173] transmitting a notification to a computing device of a broadcasting entity;
[0174] causing an alarm system to activate;
[0175] causing an electronic device to activate;
[0176] causing a light to activate;
[0177] causing an event to be triggered;
[0178] causing a speaker to emit a recorded message or audio;
[0179] causing an irrigation system to activate; or
[0180] some combination thereof.
[0181] 9. The method of clause 1, wherein the subsequent incident comprises a criminal offense, a civil offense, a triggered event, or some combination thereof.
[0182] 10. A system comprising:
[0183] a memory device storing instructions; and
[0184] a processing device communicatively coupled to the memory, the processing device executes the instructions to:
[0185] receive an identifier associated with a person, wherein the identifier is received from a location where the person was present at a first time;
[0186] receive information pertaining to an incident that occurred at the location where the person was present at the first time;

[0187] receive, at a second time subsequent to the first time, the identifier associated with the person;
[0188] determine, via a trained machine learning model using the identifier and the information, a probability of occurrence of a subsequent incident; and
[0189] perform, based on the probability of occurrence of the subsequent incident, a preventative action.
[0190] 11. The system of clause 10, wherein the identifier comprises:
[0191] a media access control (MAC) address of a computing device of the person,
[0192] an image of the person,
[0193] a license plate number of a vehicle registered to the person, or
[0194] some combination thereof.
[0195] 12. The system of clause 10, wherein the processing device is further to:
[0196] receive, from a third-party source at a third time subsequent to the first and second times, additional information pertaining to the person, wherein the additional information comprises a criminal record of the person, a mugshot of the person, a fingerprint of the person, an electronic medical record of the person, an address of the person, an age of the person, a name of the person, or some combination thereof; and
[0197] correlate the additional information with at least the identifier of the person.
[0198] 13. The system of clause 12, wherein the processing device is further to:
[0199] receive, at a fourth time, the identifier associated with the person; and
[0200] determine, via the trained machine learning model using the identifier and the additional information, the probability of occurrence of the subsequent incident.

[0201] 14. The system of clause 10, wherein the identifier associated with the person is received at the second time from the location or at the second time at another location different than the location.
[0202] 15. The system of clause 10, wherein the processing device is further to train, based on the identifier and the information, the trained machine learning model to determine the probability of occurrence of the subsequent incident based on a plurality of training data comprising at least one of:
[0203] other identifiers of other people that were present at the location the incident occurred at the first time,
[0204] one or more criminal records of the person, the other people, or some combination thereof, or
[0205] a pattern recognized using the identifier, the information, the other identifiers of other people that were present at the location the incident occurred at the first time, the one or more criminal records, or some combination thereof.
[0206] 16. The system of clause 10, wherein the preventative action that is performed is selected based on a severity of the subsequent incident, the probability of occurrence of the subsequent incident, or both.
[0207] 17. A tangible, non-transitory machine-readable medium storing instructions that, when executed, cause a processing device to:
[0208] receive an identifier associated with a person, wherein the identifier is received from a location where the person was present at a first time;
[0209] receive information pertaining to an incident that occurred at the location where the person was present at the first time;

[0210] receive, at a second time subsequent to the first time, the identifier associated with the person;
[0211] determine, via a trained machine learning model using the identifier and the information, a probability of occurrence of a subsequent incident; and
[0212] perform, based on the probability of occurrence of the subsequent incident, a preventative action.
[0213] 18. The computer-readable medium of clause 17, wherein the identifier comprises:
[0214] a media access control (MAC) address of a computing device of the person,
[0215] an image of the person,
[0216] a license plate number of a vehicle registered to the person, or

[0217] some combination thereof.
[0218] 19. The computer-readable medium of clause 17, wherein the processing device is further to:
[0219] receive, from a third-party source at a third time subsequent to the first and second times, additional information pertaining to the person, wherein the additional information comprises a criminal record of the person, a mugshot of the person, a fingerprint of the person, an electronic medical record of the person, an address of the person, an age of the person, a name of the person, or some combination thereof; and
[0220] correlate the additional information with at least the identifier of the person.
[0221] 20. The computer-readable medium of clause 19, wherein the processing device is further to:
[0222] receive, at a fourth time, the identifier associated with the person; and

[0223] determine, via the trained machine learning model using the identifier and the additional information, the probability of occurrence of the subsequent incident.
[0224] None of the description in this application should be read as implying that any particular element, step, or function is an essential element that must be included in the claim scope. The scope of patented subject matter is defined only by the claims. Moreover, none of the claims is intended to invoke 35 U.S.C. § 112(f) unless the exact words “means for” are followed by a participle.

Claims

WHAT IS CLAIMED IS:
1. A method for using artificial intelligence to determine a probability of occurrence of a subsequent incident, the method comprising: receiving, at a processor, an identifier associated with a person, wherein the identifier is received from a location where the person was present at a first time; receiving information pertaining to an incident that occurred at the location where the person was present at the first time; receiving, at a second time subsequent to the first time, the identifier associated with the person; determining, by the processor via a trained machine learning model using the identifier and the information, the probability of occurrence of the subsequent incident; and performing, based on the probability of occurrence of the subsequent incident, a preventative action.
2. The method of claim 1, wherein the identifier comprises: a media access control (MAC) address of a computing device of the person, an image of the person, a license plate number of a vehicle registered to the person, or some combination thereof.
3. The method of claim 1, further comprising: receiving, from a third-party source at a third time subsequent to the first and second times, additional information pertaining to the person, wherein the additional information comprises a criminal record of the person, a mugshot of the person, a fingerprint of the person, an image of the person, an electronic medical record of the person, an address of the person, an age of the person, a name of the person, an email address, a phone number, an indication of the person being on a watch list, or some combination thereof; and correlating the additional information with at least the identifier of the person.
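For illustration of the correlating step of claim 3 only: a minimal sketch, assuming profiles are keyed by an identifier component captured at the first time. The names (profiles, correlate) and the sample record are hypothetical.

```python
from collections import defaultdict

# profiles maps an identifier component (e.g., a MAC address captured at the
# first time) to the attributes accumulated for that person.
profiles: dict[str, dict] = defaultdict(dict)

def correlate(identifier_key: str, third_party_record: dict) -> dict:
    """Merge attributes received later (criminal record, address, age, name, ...)
    into the profile keyed by the previously captured identifier."""
    profiles[identifier_key].update(third_party_record)
    return profiles[identifier_key]

# A record arriving at a third time is attached to the identifier from the first time.
correlate("aa:bb:cc:dd:ee:ff", {"name": "J. Doe", "criminal_record": ["trespass"]})
```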
4. The method of claim 3, further comprising: receiving, at a fourth time, the identifier associated with the person; and determining, via the trained machine learning model using the identifier and the additional information, the probability of occurrence of the subsequent incident.
5. The method of claim 1, wherein the identifier associated with the person is received at the second time from the location or at the second time at another location different than the location.
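A small illustrative sketch of the matching implied by claim 5, under the assumption that observations are stored as (identifier, location, time) records; the names Observation and earlier_sighting are hypothetical.

```python
from typing import NamedTuple, Optional

class Observation(NamedTuple):
    identifier_key: str  # e.g., a MAC address or license plate number
    location_id: str
    timestamp: float

def earlier_sighting(identifier_key: str,
                     history: list[Observation]) -> Optional[Observation]:
    """Return the earliest prior observation of this identifier, if any."""
    matches = [o for o in history if o.identifier_key == identifier_key]
    return min(matches, key=lambda o: o.timestamp) if matches else None

# The second capture matches regardless of whether it occurs at the original
# location or a different one; comparing location_id values tells which case applies.
```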
6. The method of claim 1, further comprising training, based on the identifier and the information, the trained machine learning model to determine the probability of occurrence of the subsequent incident based on a plurality of training data comprising at least one of: other identifiers of other people that were present at the location where the incident occurred at the first time, one or more criminal records of the person, the other people, or some combination thereof, or a pattern recognized using the identifier, the information, the other identifiers of other people that were present at the location where the incident occurred at the first time, the one or more criminal records, or some combination thereof.
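Purely as an illustration of the training step of claim 6, and not the disclosed model: a sketch assuming scikit-learn is available and that features have already been extracted from the identifiers, incident information, and criminal records. The feature layout and the numeric values are synthetic placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical feature vector per person: [number of incident-location sightings,
# number of criminal-record entries, days since last sighting]. Values are
# synthetic placeholders, not data from the disclosure.
X_train = np.array([
    [3, 2, 1.0],
    [0, 0, 90.0],
    [5, 4, 0.5],
    [1, 0, 30.0],
])
y_train = np.array([1, 0, 1, 0])  # 1 = a subsequent incident followed

model = LogisticRegression().fit(X_train, y_train)

# Probability of a subsequent incident for a newly observed identifier.
probability = model.predict_proba(np.array([[2, 1, 2.0]]))[0, 1]
print(f"probability of subsequent incident: {probability:.2f}")
```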
7. The method of claim 1, wherein the preventative action that is performed is selected based on a severity of the subsequent incident, the probability of occurrence of the subsequent incident, or both.
8. The method of claim 1, wherein performing the preventative action further comprises: transmitting a notification to a computing device of an emergency responder; transmitting a notification to a computing device having the identifier associated with the person; transmitting a notification to a computing device of a broadcasting entity; causing an alarm system to activate; causing an electronic device to activate; causing a light to activate; causing an event to be triggered; causing a speaker to emit a recorded message or audio; causing an irrigation system to activate; or some combination thereof.
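The following sketch illustrates, for claims 7 and 8 together, one hypothetical way an action could be selected from the probability and an expected severity and then dispatched; the thresholds, action names, and handlers are illustrative assumptions and not part of the claims.

```python
def select_preventative_action(probability: float, severity: str) -> str:
    """Hypothetical policy: pick an action from the probability of the subsequent
    incident and its expected severity (claim 7), drawn from the kinds of
    actions enumerated in claim 8."""
    if severity == "high" and probability >= 0.9:
        return "notify_emergency_responder"
    if probability >= 0.7:
        return "activate_alarm_system"
    if probability >= 0.5:
        return "emit_recorded_message"
    return "no_action"

def dispatch(action: str) -> None:
    handlers = {
        "notify_emergency_responder": lambda: print("Notification sent to responder device"),
        "activate_alarm_system": lambda: print("Alarm system activated"),
        "emit_recorded_message": lambda: print("Speaker playing recorded message"),
        "no_action": lambda: None,
    }
    handlers[action]()

dispatch(select_preventative_action(0.85, "high"))  # prints "Alarm system activated"
```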
9. The method of claim 1, wherein the subsequent incident comprises a criminal offense, a civil offense, a triggered event, or some combination thereof.
10. A system comprising: a memory device storing instructions; and a processing device communicatively coupled to the memory device, wherein the processing device executes the instructions to: receive an identifier associated with a person, wherein the identifier is received from a location where the person was present at a first time; receive information pertaining to an incident that occurred at the location where the person was present at the first time; receive, at a second time subsequent to the first time, the identifier associated with the person; determine, via a trained machine learning model using the identifier and the information, a probability of occurrence of a subsequent incident; and perform, based on the probability of occurrence of the subsequent incident, a preventative action.
11. The system of claim 10, wherein the identifier comprises: a media access control (MAC) address of a computing device of the person, an image of the person, a license plate number of a vehicle registered to the person, or some combination thereof.
12. The system of claim 10, wherein the processing device is further to: receive, from a third-party source at a third time subsequent to the first and second times, additional information pertaining to the person, wherein the additional information comprises a criminal record of the person, a mugshot of the person, a fingerprint of the person, an electronic medical record of the person, an address of the person, an age of the person, a name of the person, or some combination thereof; and correlate the additional information with at least the identifier of the person.
13. The system of claim 12, wherein the processing device is further to: receive, at a fourth time, the identifier associated with the person; and determine, via the trained machine learning model using the identifier and the additional information, the probability of occurrence of the subsequent incident.
14. The system of claim 10, wherein the identifier associated with the person is received at the second time from the location or at the second time at another location different than the location.
15. The system of claim 10, wherein the processing device is further to train, based on the identifier and the information, the trained machine learning model to determine the probability of occurrence of the subsequent incident based on a plurality of training data comprising at least one of: other identifiers of other people that were present at the location where the incident occurred at the first time, one or more criminal records of the person, the other people, or some combination thereof, or a pattern recognized using the identifier, the information, the other identifiers of other people that were present at the location where the incident occurred at the first time, the one or more criminal records, or some combination thereof.
16. The system of claim 10, wherein the preventative action that is performed is selected based on a severity of the subsequent incident, the probability of occurrence of the subsequent incident, or both.
17. A tangible, non-transitory machine-readable medium storing instructions that, when executed, cause a processing device to: receive an identifier associated with a person, wherein the identifier is received from a location where the person was present at a first time; receive information pertaining to an incident that occurred at the location where the person was present at the first time; receive, at a second time subsequent to the first time, the identifier associated with the person; determine, via a trained machine learning model using the identifier and the information, a probability of occurrence of a subsequent incident; and perform, based on the probability of occurrence of the subsequent incident, a preventative action.
18. The machine-readable medium of claim 17, wherein the identifier comprises: a media access control (MAC) address of a computing device of the person, an image of the person, a license plate number of a vehicle registered to the person, or some combination thereof.
19. The machine-readable medium of claim 17, wherein the processing device is further to: receive, from a third-party source at a third time subsequent to the first and second times, additional information pertaining to the person, wherein the additional information comprises a criminal record of the person, a mugshot of the person, a fingerprint of the person, an electronic medical record of the person, an address of the person, an age of the person, a name of the person, or some combination thereof; and correlate the additional information with at least the identifier of the person.
20. The machine-readable medium of claim 19, wherein the processing device is further to: receive, at a fourth time, the identifier associated with the person; and determine, via the trained machine learning model using the identifier and the additional information, the probability of occurrence of the subsequent incident.
PCT/US2020/053816 2020-06-24 2020-10-01 System and method for using artificial intelligence to determine a probability of occurrence of a subsequent incident WO2021262213A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20941898.7A EP4172886A1 (en) 2020-06-24 2020-10-01 System and method for using artificial intelligence to determine a probability of occurrence of a subsequent incident
IL299461A IL299461A (en) 2020-06-24 2020-10-01 System and method for using artificial intelligence to determine a probability of occurrence of a subsequent incident

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US16/910,949 2020-06-24
US16/910,949 US11270129B2 (en) 2019-06-25 2020-06-24 System and method for correlating electronic device identifiers and vehicle information
US17/039,505 US20210019645A1 (en) 2019-06-25 2020-09-30 System and method for using artificial intelligence to determine a probability of occurrence of a subsequent incident and performing a preventative action
US17/039,505 2020-09-30

Publications (1)

Publication Number Publication Date
WO2021262213A1 true WO2021262213A1 (en) 2021-12-30

Family

ID=79281673

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/053816 WO2021262213A1 (en) 2020-06-24 2020-10-01 System and method for using artificial intelligence to determine a probability of occurrence of a subsequent incident

Country Status (3)

Country Link
EP (1) EP4172886A1 (en)
IL (1) IL299461A (en)
WO (1) WO2021262213A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230206373A1 (en) * 2021-12-29 2023-06-29 Motorola Solutions, Inc. System, device and method for electronic identity verification in law enforcement

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170293847A1 (en) * 2014-06-30 2017-10-12 Palantir Technologies, Inc. Crime risk forecasting
US20180069937A1 (en) * 2016-09-02 2018-03-08 VeriHelp, Inc. Event correlation and association using a graph database
US20190088096A1 (en) * 2014-09-18 2019-03-21 Indyme Solutions, Llc Merchandise Activity Sensor System and Methods of Using Same
US20200162701A1 (en) * 2018-05-30 2020-05-21 Amazon Technologies, Inc. Identifying and locating objects by associating video data of the objects with signals identifying wireless devices belonging to the objects

Also Published As

Publication number Publication date
EP4172886A1 (en) 2023-05-03
IL299461A (en) 2023-02-01

Similar Documents

Publication Publication Date Title
US20210019645A1 (en) System and method for using artificial intelligence to determine a probability of occurrence of a subsequent incident and performing a preventative action
US10176706B2 (en) Using degree of confidence to prevent false security system alarms
US10257469B2 (en) Neighborhood camera linking system
US10726709B2 (en) System and method for reporting the existence of sensors belonging to multiple organizations
US11610405B2 (en) System and method for correlating electronic device identifiers and vehicle information
US10986717B1 (en) Reduced false alarms for outdoor IP security cameras
US10614689B2 (en) Methods and systems for using pattern recognition to identify potential security threats
US9706379B2 (en) Method and system for generation and transmission of alert notifications relating to a crowd gathering
US10304303B2 (en) System and method for a security checkpoint using radio signals
US20200342727A1 (en) Doorbell communities
CN104205127A (en) Recognition-based security
US10212778B1 (en) Face recognition systems with external stimulus
CN109872482A (en) Wisdom security protection monitoring and managing method, system and storage medium
US11243965B2 (en) Method and apparatus to correlate mobile device wireless activity and security data
US11735017B2 (en) Artificial intelligence (AI)-based security systems for monitoring and securing physical locations
US20230386305A1 (en) Artificial Intelligence (AI)-Based Security Systems for Monitoring and Securing Physical Locations
Anderez et al. The rise of technology in crime prevention: Opportunities, challenges and practitioners perspectives
EP4172886A1 (en) System and method for using artificial intelligence to determine a probability of occurrence of a subsequent incident
US20150287306A1 (en) Proactive Loss Prevention System
Gayathri et al. Intelligent smart home security system: A deep learning approach
US11523485B1 (en) Reduced false alarms for outdoor IP security cameras
US20240104411A1 (en) System and method for predicting the presence of an entity at certain locations
US20230386212A1 (en) Method and system for monitoring activities and events in real-time through self-adaptive ai
Gómez et al. Review of the use of IoT technologies and devices in physical security systems
Castaño-Gómez et al. Review of the use of IoT technologies and devices in physical security systems

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20941898

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020941898

Country of ref document: EP

Effective date: 20230124